Keras softmax output and accuracy
Rise to the top 3% as a developer or hire one of them at Toptal: https://topt.al/25cXVn
--------------------------------------------------
Music by Eric Matyas
https://www.soundimage.org
Track title: Horror Game Menu Looping
--
Chapters
00:00 Keras Softmax Output And Accuracy
01:00 Accepted Answer Score 6
02:21 Thank you
--
Full question
https://stackoverflow.com/questions/6309...
--
Content licensed under CC BY-SA
https://meta.stackexchange.com/help/lice...
--
Tags
#python #tensorflow #keras #neuralnetwork
#avk47
ACCEPTED ANSWER
Score 6
e.g., when the true class is [0, 0, 1] and the predicted probabilities are [0.1, 0.4, 0.5], even though 0.5 is the largest probability, the accuracy of this prediction should be 0, because 0.5 != 1. Is that correct?
No. You treat the index with the maximum value as the model's prediction. So in your example, this sample would count towards increasing the accuracy. This is normally called Top-1 accuracy. In image classification, Top-5 accuracy is also often used: the indices of the 5 largest values in the softmax output are treated as the network's guesses, and the prediction counts as correct if the true class is among them.
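A minimal sketch of that idea, assuming TensorFlow 2.x / tf.keras and reusing the example values from the question (here k=2 stands in for Top-5, since there are only 3 classes):

```python
import numpy as np
import tensorflow as tf

# One-hot true label and softmax output from the question's example
y_true = np.array([[0., 0., 1.]])
y_pred = np.array([[0.1, 0.4, 0.5]])

# Top-1 accuracy: the predicted class is the index of the largest probability
top1 = np.argmax(y_pred, axis=-1) == np.argmax(y_true, axis=-1)
print(top1.mean())  # 1.0 -- this sample counts as correct

# Top-k accuracy via Keras's built-in metric (k=2 here, analogous to Top-5)
metric = tf.keras.metrics.TopKCategoricalAccuracy(k=2)
metric.update_state(y_true, y_pred)
print(metric.result().numpy())  # 1.0 -- the true class is among the top 2
```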
More generally, when the output layer activation is softmax, we will normally get floating-point probability predictions, and there is only a vanishingly small chance of getting integer-valued predictions like [0, 0, 1]. So we can't use accuracy as a metric when softmax is the activation. Is that correct?
Technically speaking, you will never get integer values from the softmax layer since its output type is float, though there is a teeny tiny chance of getting something like [0.0, 0.0, 1.0]. Your conclusion is incorrect because its premise does not hold: accuracy does not require the predictions to be exact one-hot vectors, only that the argmax matches the true class. So accuracy remains a perfectly valid metric when softmax is the classification layer of a neural network.
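As a hedged illustration of that last point (the layer sizes, input shape, and optimizer here are made up for the sketch), this is roughly how a softmax classifier is compiled in Keras with accuracy as a metric; with one-hot labels and categorical cross-entropy, Keras resolves the string "accuracy" to categorical accuracy, i.e. the argmax comparison described above:

```python
import tensorflow as tf

# Hypothetical 3-class classifier with a softmax output layer
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# "accuracy" works fine with softmax outputs: Keras compares
# argmax(prediction) against the true class, not the raw floats.
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```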