How to interpret these results?

User 2487 | 11/2/2015, 2:05:39 PM

Data:

+--------------+-----------------+-------+
| target_label | predicted_label | count |
+--------------+-----------------+-------+
|      1       |        1        |  150  |
|      0       |        0        |  151  |
+--------------+-----------------+-------+
[2 rows x 3 columns]

'accuracy': 1.0


Comments

User 1592 | 11/2/2015, 4:33:15 PM

Hi. The result means that in 150 cases the classifier predicted the label "1" when the true label was "1", and in the remaining 151 cases it predicted the label "0" when the true label was "0". In total you got an accuracy of 1.0, meaning 100% of the predictions were correct.

If there were classification mistakes, you would also have seen rows for target label 0 with predicted label 1, and vice versa.
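To make the interpretation concrete, here is a minimal sketch (plain Python, not the library's own API) that rebuilds the same confusion-matrix counts and accuracy from a list of true and predicted labels; the toy data mirrors the numbers in the question.

```python
from collections import Counter

def confusion_counts(targets, predictions):
    """Count each (target_label, predicted_label) pair, as in the table above."""
    return Counter(zip(targets, predictions))

def accuracy(targets, predictions):
    """Fraction of predictions that exactly match the target label."""
    correct = sum(t == p for t, p in zip(targets, predictions))
    return correct / len(targets)

# Toy data matching the question: 150 true 1s predicted as 1,
# and 151 true 0s predicted as 0 -- no mistakes at all.
y_true = [1] * 150 + [0] * 151
y_pred = [1] * 150 + [0] * 151

print(dict(confusion_counts(y_true, y_pred)))  # {(1, 1): 150, (0, 0): 151}
print(accuracy(y_true, y_pred))                # 1.0
```

If the classifier made errors, extra keys such as `(0, 1)` (false positive) or `(1, 0)` (false negative) would appear in the counts, and the accuracy would drop below 1.0.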