Live streaming to 25.3M concurrent viewers: Deal with traffic spike

In 2019, Hotstar set a global record by live streaming to 25.3 million concurrent viewers. What exactly happened?

During the ICC 2019 World Cup Semi-Final between India and New Zealand, Hotstar set a new world record by live streaming to 25.3 million concurrent viewers. That is more concurrent viewers than any other platform had ever handled before.

How did Hotstar achieve this, and what happened along the way? Let's take a look.

What happened?

On day 2 of the match, the first spike took concurrency from 1.5M to 15M as India started batting. Then Dhoni came in to bat, and another sudden spike pushed traffic to 25.3M concurrent users. But when Dhoni got out, viewership dropped drastically from 25.3M to under 1M users.

The challenges

The first challenge was handling 25.3M concurrent users. The second came when viewers dropped off: they either exited the app entirely or returned to the homepage to explore other content, causing a sudden surge in load on the homepage services.

How does Hotstar tackle this?

Hotstar maintains an in-house project called "Project Hulk" for load testing and traffic generation. It tests the system's resilience and helps them find its breaking points. It can mimic the entire user journey with different inputs, and can simulate full traffic patterns like the one seen during this match.
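The core idea of such a traffic generator can be sketched in a few lines. This is not Project Hulk's actual code; the function names and the journey steps below are illustrative assumptions. A real tool would fire HTTP requests at a staging environment, while this sketch only records each synthetic user's steps:

```python
import concurrent.futures
import time

def simulate_user_journey(user_id):
    """One synthetic viewer: open the homepage, start playback, send a few
    heartbeats. (Hypothetical journey; real tooling would issue HTTP requests.)"""
    steps = ["homepage", "play_start"] + ["heartbeat"] * 3
    return (user_id, steps)

def replay_pattern(pattern):
    """Replay a traffic pattern: each entry is (hold_seconds, concurrent_users),
    so a list of entries can mimic a ramp, a spike, and a crash in sequence."""
    sessions = []
    for hold_seconds, users in pattern:
        with concurrent.futures.ThreadPoolExecutor(max_workers=64) as pool:
            sessions.extend(pool.map(simulate_user_journey, range(users)))
        time.sleep(hold_seconds)  # hold this concurrency level before the next step
    return sessions
```

Scaled down by several orders of magnitude, a pattern such as [(60, 1500), (60, 15000), (60, 25300)] would mirror the shape of the match-day curve described above.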

Hotstar does not use traditional autoscaling from AWS because it comes with challenges: insufficient-capacity errors, which cannot be handled during live events, and a limited step size that makes scaling too slow for live traffic.

So they built their own scaling strategy. It lets them pre-warm their infrastructure before high-concurrency events like live matches. It also automates proactive scale-up with a buffer to absorb sudden spikes in traffic, and it keeps secondary autoscaling groups as backups that get utilized in case the primary fails to scale.
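The two ideas in that strategy, provisioning for projected load plus a buffer, and falling back to a secondary group when the primary cannot acquire capacity, can be sketched as follows. The per-instance capacity, the 25% buffer, and the ScalingGroup stand-in are all illustrative assumptions, not Hotstar's real numbers or AWS API calls:

```python
import math

class ScalingGroup:
    """Minimal stand-in for a cloud autoscaling group with a capacity cap."""
    def __init__(self, name, max_instances):
        self.name, self.max_instances = name, max_instances
    def request(self, count):
        # Grant whatever the group's capacity allows, like a cloud provider
        # that may return fewer instances than requested.
        return min(count, self.max_instances)

def desired_capacity(projected_concurrency, users_per_instance=2000, buffer=0.25):
    """Proactive scale-up: provision for the projected peak plus a safety
    buffer, ahead of time, instead of reacting to current load."""
    return math.ceil(projected_concurrency * (1 + buffer) / users_per_instance)

def scale(primary, secondary, target):
    """Ask the primary group first; cover any shortfall (for example after an
    insufficient-capacity error) from the secondary backup group."""
    granted = primary.request(target)
    if granted < target:
        granted += secondary.request(target - granted)
    return granted
```

With these assumed figures, a projected 25.3M peak maps to ceil(25.3M x 1.25 / 2000) = 15,813 instances, and the scale() call would draw 10,000 from a capped primary group and the remaining 5,813 from the backup.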

Also, to manage high-concurrency events like this one, they turn off non-critical services such as recommendations, personalization, chat, and emojis to reduce the load on backend servers. They also practice graceful degradation, resolving errors without impacting actual customers. This helped them stay afloat during the traffic peaks.
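The combination of switching off non-critical services and degrading gracefully can be sketched with a simple feature-flag pattern. The flag mechanism, tray names, and homepage function here are hypothetical; only the service names come from the article:

```python
# Flags for non-critical services; during a high-concurrency event these
# are flipped off to shed load on backend servers. (Illustrative sketch,
# not Hotstar's actual flag system.)
FLAGS = {"recommendations": True, "personalization": True, "emojis": True}

def homepage(user_id, fetch_recommendations):
    """Build a homepage response with graceful degradation: if recommendations
    are disabled, or their backend fails, fall back to a static editorial tray
    instead of surfacing an error to the viewer."""
    trays = ["live_match", "trending"]  # critical content, always served
    if FLAGS["recommendations"]:
        try:
            trays.extend(fetch_recommendations(user_id))
        except Exception:
            trays.append("editorial_picks")  # degraded, but still functional
    else:
        trays.append("editorial_picks")  # feature switched off under load
    return trays
```

The key design choice is that every failure path still returns a usable page: the viewer sees slightly less personalized content rather than an error, which is exactly what "resolving errors without impacting actual customers" implies.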

Hotstar is a great example of how to deal with sudden traffic spikes. It shows the importance of preparing in advance and building resilient infrastructure so your service can handle any traffic load.

Benjamin QoChuk, PhD

Benjamin QoChuk is a Computer Science researcher, inventor, and author. His educational background includes a Bachelor's degree from Vanderbilt University and a PhD in Computer Science from Peking University.

Improved & Reviewed by:


OpenGenus Foundation