tf logits
sngjuk
2017. 12. 6. 11:11
Logits simply means that the function operates on the unscaled output of earlier layers, and that the relative scale used to understand the units is linear. In particular, it means that the sum of the inputs may not equal 1 and that the values are not probabilities (you might have an input of 5).
logits -[softmax]-> probabilities (sum(softmax(logits)) == 1)
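A quick sketch of this (using the TF 1.x session API that was current when this was written; the logit values are just made-up numbers):

```python
import tensorflow as tf

# Unscaled logits: they do not sum to 1 and are not probabilities.
logits = tf.constant([2.0, 1.0, 0.1])

# softmax rescales them into a probability distribution.
probs = tf.nn.softmax(logits)

with tf.Session() as sess:
    print(sess.run(probs))                 # ~[0.659, 0.242, 0.099]
    print(sess.run(tf.reduce_sum(probs)))  # 1.0
```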
The cross entropy is a summary metric: it sums across the elements. The output of tf.nn.softmax_cross_entropy_with_logits on a shape [2,5] tensor is of shape [2] (the first dimension is treated as the batch).
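To make the shapes concrete, here is a minimal sketch, again against the TF 1.x API and with made-up logits and one-hot labels:

```python
import tensorflow as tf

# A batch of 2 examples, 5 classes each: shape [2, 5].
logits = tf.constant([[2.0, 1.0, 0.1, 0.0, -1.0],
                      [0.5, 2.5, 0.0, 1.0,  0.3]])
labels = tf.constant([[1.0, 0.0, 0.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0, 0.0, 0.0]])

# One cross-entropy value per batch row: shape [2].
loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

with tf.Session() as sess:
    print(loss.get_shape())  # (2,)
    print(sess.run(loss))

# For a scalar training loss, average over the batch:
# total_loss = tf.reduce_mean(loss)
```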