What is logits, softmax and softmax_cross_entropy_with_logits ?

Asked on November 15, 2018 in Tensorflow.

  • 1 Answer(s)

        Assume you have two tensors, where y_hat contains the computed scores for each class (for example, from y = W*x + b) and y_true contains the one-hot encoded true labels.

    y_hat = ... # Predicted class scores, e.g. y_hat = tf.matmul(X, W) + b
    y_true = ... # True labels, one-hot encoded
    

        If you interpret the scores in y_hat as unnormalized log probabilities, then they are logits.

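        As a minimal sketch of what softmax does with those logits (the logit values below are arbitrary, chosen purely for illustration), applying it turns them into a valid probability distribution:

    import numpy as np

    logits = np.array([2.0, 1.0, 0.1])               # unnormalized log probabilities
    probs = np.exp(logits) / np.sum(np.exp(logits))  # softmax computed by hand
    print(probs)  # -> approx. [0.659 0.242 0.099], non-negative and summing to 1
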
        Additionally, the total cross-entropy loss computed in this manner:

    y_hat_softmax = tf.nn.softmax(y_hat)  # convert logits to probabilities
    total_loss = tf.reduce_mean(-tf.reduce_sum(y_true * tf.log(y_hat_softmax), [1]))  # mean cross-entropy over the batch
    

        is essentially equivalent to the total cross-entropy loss computed with the function softmax_cross_entropy_with_logits():

    total_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y_true, logits=y_hat))
    
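        Note that softmax_cross_entropy_with_logits() applies the softmax internally, so it must be given the raw logits, not the already-softmaxed probabilities. A quick way to check the equivalence of the two computations is the sketch below (the tensor values are made up for illustration; this assumes TensorFlow 1.x, where tf.Session and tf.log are available):

    import tensorflow as tf

    # Arbitrary example logits and one-hot labels, just to illustrate the equivalence
    y_hat = tf.convert_to_tensor([[0.5, 1.5, 0.1],
                                  [2.2, 1.3, 1.7]])
    y_true = tf.convert_to_tensor([[0.0, 1.0, 0.0],
                                   [1.0, 0.0, 0.0]])

    # Manual computation: softmax first, then cross-entropy
    y_hat_softmax = tf.nn.softmax(y_hat)
    manual_loss = tf.reduce_mean(
        -tf.reduce_sum(y_true * tf.log(y_hat_softmax), [1]))

    # Built-in fused op: takes raw logits, applies softmax internally
    builtin_loss = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=y_true, logits=y_hat))

    with tf.Session() as sess:
        print(sess.run([manual_loss, builtin_loss]))  # both values should match

        The fused op is also preferable in practice because it is more numerically stable than computing the softmax and the log separately.
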
    Answered on November 15, 2018.

