What are logits, softmax and softmax_cross_entropy_with_logits?

Asked by maria_2935 in Data Science on Jun 14, 2021

In the TensorFlow API documentation, one keyword kept coming up: logits. What are logits? In most of the documents I went through, it is written as:

tf.nn.softmax(logits, name=None)

If logits are just tensors, why do they have a different name?

Lastly, I would like to know the difference between these two methods:

tf.nn.softmax(logits, name=None)
tf.nn.softmax_cross_entropy_with_logits(logits, labels, name=None)

I hope this answer will help you.

Here is how logits, softmax, and softmax_cross_entropy_with_logits relate to each other:

Logits are the unscaled outputs of the earlier layers; they live on a linear scale rather than a probability scale. In mathematics, the logit is a function that maps probabilities ( [0, 1] ) to the real line ( (-inf, inf) ).
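For intuition, here is a small sketch (not part of the original answer) of the mathematical logit, the inverse of the sigmoid, mapping a probability back onto the real line:

import numpy as np

def logit(p):
    # Map a probability in (0, 1) to a real number in (-inf, inf).
    return np.log(p / (1 - p))

def sigmoid(x):
    # Inverse of logit: map a real number back to (0, 1).
    return 1 / (1 + np.exp(-x))

print(logit(0.5))            # 0.0
print(logit(0.9))            # ~2.197
print(sigmoid(logit(0.9)))   # 0.9, round trip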

tf.nn.softmax simply applies the softmax function to an input tensor. Softmax "squishes" the inputs so that the outputs sum to 1; it is just a way of normalizing. The output has the same shape as the input, only the values are normalized. It is typically used during evaluation, when you want the probabilities the model assigns to each class.

import numpy as np
import tensorflow as tf

a = tf.constant(np.array([[.1, .3, .5, .9]]))
with tf.Session() as s:
    print(s.run(tf.nn.softmax(a)))  # [[0.16838508 0.205666 0.25120102 0.37474789]]

tf.nn.softmax_cross_entropy_with_logits computes the cross entropy of the result after the softmax function has been applied. It is typically used only during training. Conceptually, its result is similar to:

sf = tf.nn.softmax(x)
c = cross_entropy(sf, labels)  # pseudocode: cross_entropy is not an actual TF function

If you want to minimize the cross-entropy and your last layer is followed by a softmax, you should use tf.nn.softmax_cross_entropy_with_logits instead of computing it yourself, because it covers numerically unstable corner cases.
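As a rough sketch (assuming the same TF1 session-style API as the example above, with made-up logits and labels), the fused op gives the same value as applying softmax and then cross-entropy by hand, while being numerically safer:

import numpy as np
import tensorflow as tf

logits = tf.constant(np.array([[2.0, 1.0, 0.1]]))
labels = tf.constant(np.array([[1.0, 0.0, 0.0]]))  # one-hot true class

# Manual version: softmax first, then cross-entropy by hand
# (can overflow/underflow for large-magnitude logits).
manual = -tf.reduce_sum(labels * tf.log(tf.nn.softmax(logits)), axis=1)

# Fused, numerically stable version.
fused = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)

with tf.Session() as s:
    print(s.run(manual))  # ~[0.417]
    print(s.run(fused))   # same value, computed in a stable way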


