# Top5 accuracy

In this article, we have explained the idea behind Top5 accuracy and how to calculate it.

**Table of contents**:

- Introduction to Top5 accuracy
- How is Top5 accuracy calculated?
- Code to calculate Top5 accuracy
- Top5 accuracy vs Top1 accuracy
- Concluding Note

## Introduction to Top5 accuracy

Top5 accuracy is a metric that measures the correctness of the output of a Machine Learning model. It is frequently used in Image Recognition, Object Detection and related tasks.

Top5 accuracy is often used along with Top1 accuracy, reported as a pair:

```
(Top1 accuracy, Top5 accuracy)
```

Example values:

```
(0.723, 0.902)
```

## How is Top5 accuracy calculated?

A Machine Learning model like ResNet50 has a Softmax operation as its last operation. Such models are trained on a dataset with 1000 categories, such as ImageNet.

So, the output of the Softmax operation, and hence of the model, is an array of size 1000. Each element is the probability that the input belongs to the corresponding category.

Additionally, the sum of all elements in Softmax output is equal to 1.

So, if the output is an array out of shape [1][1000], then:

- out[0][i] = probability that the input belongs to i-th category
- Sum of out[0][i] for all i = 1

A sample output of Softmax is as follows:

```
[ 0.0003, 1.20007e-10, 4.0025e-9, ..., 0.0001, 7.011e-4]
```
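As a minimal sketch (not from the article), the Softmax operation that produces such an array can be written in plain Python; the tiny 3-element input is just for illustration:

```python
import math

def softmax(logits):
    """Map raw model scores (logits) to probabilities that sum to 1."""
    m = max(logits)  # subtract the max logit for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(sum(probs))  # sums to 1, up to floating-point error
```

Larger logits map to larger probabilities, so sorting the Softmax output is equivalent to sorting the model's raw scores.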

To calculate accuracy, we need the ground truth: suppose the input actually belongs to category j. The steps to compute Top5 accuracy for one input are as follows:

- Sort the elements of the Softmax output in decreasing order.
- Take the 5 largest elements, that is, the first 5 elements of the sorted output.
- If j is the index of one of these 5 elements, it is a match and 1 is added to the total count of matches.
- If j does not match any of them, the prediction is wrong and counts as 0.
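The steps above can be sketched in plain Python (a minimal illustration with a hypothetical 6-category output; ties between probabilities are broken arbitrarily):

```python
def top5_match(probs, true_label):
    # Sort category indices by probability, decreasing, and keep the first 5.
    top5 = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:5]
    # 1 if the true category is among the 5 most probable, else 0.
    return 1 if true_label in top5 else 0

probs = [0.05, 0.1, 0.4, 0.2, 0.15, 0.1]
print(top5_match(probs, 2))  # 1: category 2 has the highest probability
print(top5_match(probs, 0))  # 0: category 0 has the lowest probability
```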

As the test is run over multiple images, the count of matches is maintained and divided by the total number of images to get the average Top5 accuracy.

So, if there were 500 images and 370 of them had the correct category among their Top5 predictions, the average Top5 accuracy would be:

= 370/500

= 0.74

This is the approach to calculate Top5 accuracy.
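The whole calculation can be sketched as a self-contained Python function (a minimal illustration with hypothetical 6-category outputs, not the article's code):

```python
def top5_accuracy(batch_probs, labels):
    """Fraction of images whose true label is among their 5 most probable categories."""
    matches = 0
    for probs, label in zip(batch_probs, labels):
        # Indices of the 5 largest probabilities, in decreasing order.
        top5 = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:5]
        matches += 1 if label in top5 else 0
    return matches / len(labels)

batch = [[0.05, 0.1, 0.4, 0.2, 0.15, 0.1],   # true label 2: in the top 5
         [0.5, 0.2, 0.1, 0.08, 0.07, 0.05]]  # true label 5: NOT in the top 5
print(top5_accuracy(batch, [2, 5]))  # 0.5: one match out of two images
```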

## Code to calculate Top5 accuracy

Top5 accuracy is usually calculated in TensorFlow (Python) using functions like:

- tf.nn.in_top_k: checks whether a target is among the top-k elements of a predictions tensor
- tf.reduce_sum: sums all elements of a tensor (here, to add up matches across images before averaging)

```python
# TensorFlow 1.x style (uses a Session); predictions, np_labels, sess,
# total_accuracy5 and num_processed_images are defined elsewhere.

# 1.0 for every image whose true label is among its top-5 predictions.
accuracy5 = tf.reduce_sum(
    input_tensor=tf.cast(
        tf.nn.in_top_k(predictions=tf.constant(predictions),
                       targets=tf.constant(np_labels), k=5),
        tf.float32))

# Count of matches in this batch, added to the running total.
np_accuracy5 = sess.run(accuracy5)
total_accuracy5 += np_accuracy5
print("Processed %d images. (Top5 accuracy) = (%0.4f)"
      % (num_processed_images, total_accuracy5 / num_processed_images))
```

## Top5 accuracy vs Top1 accuracy

Top5 vs Top1 accuracy:

| Point | Top5 | Top1 |
|---|---|---|
| Value for the same model | Higher | Lower |
| # of predictions considered | 5 | 1 |
| Strictness of check | Less strict | More strict |
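A Top1 match is also a Top5 match, which is why the Top5 value is always at least as high as the Top1 value for the same model. A minimal pure-Python sketch (hypothetical data, not the article's code):

```python
def topk_match(probs, true_label, k):
    # True if the true category is among the k most probable categories.
    topk = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    return true_label in topk

probs = [0.05, 0.1, 0.4, 0.2, 0.15, 0.1]
print(topk_match(probs, 3, k=1))  # False: category 3 is not the single best guess
print(topk_match(probs, 3, k=5))  # True: but it is among the 5 best guesses
```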

## Concluding Note

With this article at OpenGenus, you must have the complete idea of Top5 accuracy and how to calculate it.