Top1 accuracy

In this article, we have explained the idea behind Top1 accuracy and how to calculate it.

Table of contents:

  1. Introduction to Top1 accuracy
  2. How is Top1 accuracy calculated?
  3. Code to calculate Top1 accuracy
  4. Top5 accuracy vs Top1 accuracy
  5. Concluding Note

Pre-requisites:

  • Basics of classification models in Machine Learning
  • Softmax activation function

Introduction to Top1 accuracy

Top1 accuracy is a measure of the correctness of a Machine Learning model's output: it is the fraction of inputs for which the model's single most confident prediction is the correct category. Top1 accuracy is frequently used in Image Recognition, Object Detection and other classification tasks.

Top5 accuracy is often reported along with Top1 accuracy as a pair:

(Top1 accuracy, Top5 accuracy)

Example values:

(0.723, 0.902)

How is Top1 accuracy calculated?

A Machine Learning model like ResNet50 has a Softmax operation as its last operation. Such models are trained on a dataset with 1000 categories, like ImageNet.

So, the output of the Softmax operation, and hence of the ML model, is an array of size 1000. Each element is the probability that the input data belongs to the corresponding category.

Additionally, the sum of all elements in the Softmax output is equal to 1.

So, if the output is an array out of shape [1][1000], then:

  • out[0][i] = probability that the input belongs to i-th category
  • Sum of out[0][i] for all i = 1

A sample output of Softmax is as follows:

[ 0.0003, 1.20007e-10, 4.0025e-9, ..., 0.0001, 7.011e-4]
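Such an output can be reproduced with a rough NumPy sketch (the logits below are random placeholders, not the output of a real model):

import numpy as np

# Hypothetical logits from the last layer of a 1000-category model
logits = np.random.randn(1, 1000)

# Softmax turns the logits into probabilities
out = np.exp(logits) / np.sum(np.exp(logits), axis=1, keepdims=True)

print(out.shape)    # (1, 1000): one probability per category
print(out.sum())    # 1.0 (up to floating point error)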

To calculate the accuracy, we assume that the input is known to belong to category j. The steps to calculate Top1 accuracy are as follows (a short sketch follows the list):

  • Find the largest element in the output of Softmax. Let it be out[0][k]
  • If j matches k (k is the category number of the largest element), then it is a match and 1 is added to the total count of matches.
  • If j does not match k, then the prediction is wrong and this image contributes 0 to the count.
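For a single image, this check can be written as a small sketch (out and j below are hypothetical placeholders for the Softmax output and the known true category):

import numpy as np

# Hypothetical Softmax output for one image and its assumed true category j
out = np.random.rand(1, 1000)
out = out / out.sum()        # normalized so the elements sum to 1, like a Softmax output
j = 42                       # assumed true category index

k = np.argmax(out[0])        # index of the largest probability (the Top1 prediction)
match = 1 if k == j else 0   # 1 if the Top1 prediction is correct, 0 otherwise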

As the test is done for multiple images, the count of matches is maintained and divided by the total number of images to get the average Top1 accuracy.

So, if there were 500 images and 339 of them were assigned to the correct category based on the Top1 prediction, then the average Top1 accuracy will be:

= 339/500
= 0.678

This is the approach to calculate Top1 accuracy.
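The whole procedure can be summarised in a minimal NumPy sketch (the function name and array shapes are illustrative, not taken from any specific library):

import numpy as np

def top1_accuracy(softmax_outputs, true_labels):
    # softmax_outputs: array of shape (N, 1000); true_labels: array of N true category indices
    predicted = np.argmax(softmax_outputs, axis=1)   # Top1 prediction for each image
    matches = np.sum(predicted == true_labels)       # count of correct Top1 predictions
    return matches / len(true_labels)                # average Top1 accuracy

# Hypothetical usage: for 500 images with 339 correct Top1 predictions,
# this function returns 339/500 = 0.678.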

Code to calculate Top1 accuracy

Top1 accuracy is usually calculated in TensorFlow (Python) using functions like:

  • tf.nn.in_top_k: To check whether the target is among the top k elements of the predictions tensor (k=1 for Top1 accuracy)
  • tf.reduce_sum: To get the sum of all elements in a tensor (as we are adding up the count of correct Top1 predictions across images and averaging later)

# Number of correct Top1 predictions in the current batch
accuracy1 = tf.reduce_sum(
    input_tensor=tf.cast(
        tf.nn.in_top_k(predictions=tf.constant(predictions),
                       targets=tf.constant(np_labels), k=1),
        tf.float32))

# Run the graph, accumulate the count of correct predictions
# and report the running average Top1 accuracy
np_accuracy1 = sess.run(accuracy1)
total_accuracy1 += np_accuracy1
print("Processed %d images. (Top1 accuracy) = (%0.4f)"
      % (num_processed_images,
         total_accuracy1 / num_processed_images))
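
In the snippet above, predictions is assumed to be an array of model outputs (Softmax probabilities) for the current batch, np_labels the corresponding ground-truth category indices, sess a TensorFlow 1.x-style session, and total_accuracy1 / num_processed_images running totals maintained across batches. With k=5 instead of k=1, the same code computes Top5 accuracy.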

Top5 accuracy vs Top1 accuracy

Top5 vs Top1 accuracy
Point                         Top5           Top1
Higher value                  High           Low
# of predictions considered   5              1
Strict check                  Less strict    More strict

Concluding Note

With this article at OpenGenus, you must have the complete idea of Top1 accuracy and how to calculate it.