
Operators need low latency, but ultra-low, meh!

Except maybe during the World Cup's four-week duration, once every four years.

This opinion piece was first published here on TVB Europe.

Image credit: www.thestreamingcompany.com

Zoom forward to the beginning of 2023, and Ultra Low Latency (ULL) is considered a must-have by all and sundry. What has changed?

Well, one thing is five solid years of aggressive vendor marketing of ULL solutions.

This ongoing onslaught reminded me of a time over 15 years ago, when Microsoft was still in the game. Vast bouquets of linear TV channels were still critical to the TV business, and getting to the desired channel could be a challenge for viewers. The firm had invented a fast channel-change solution that enabled zapping from channel to channel in well under a second. Microsoft's marketing machine convinced the industry that it was a vital need.

Rewind to when remote controls first appeared. It was an analogue world: as long as the radio reception was stable, channel changes were instantaneous. When digital TV appeared at the end of the last century, latency crept in between pushing the P+ button and the channel changing on the set, because the receiver had to decode the new channel's compressed digital stream. After some early hits and misses – I remember one of the world's first IPTV deployments in the early noughties with an 8-second zapping time – MPEG-2's average two-second GOP set the standard: since a decoder can only start from a keyframe, a viewer tuning in at a random moment waits about half a GOP before anything can be shown. For over a decade, when you pressed P+ or P-, something would happen within about two seconds, and that was fine. Sure, it's always nice when things happen even faster, and users might well have churned away from a service with an 8-second zapping time, but nobody cared about the difference between one second and two.
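
As a back-of-the-envelope illustration of why GOP length dominates zapping time, here is a minimal Python sketch. It assumes the viewer tunes in at a random point within a GOP and can only start decoding at the next keyframe; the half-second decode_overhead default is an illustrative assumption standing in for demuxing, decoding and display setup, not a measured figure.

```python
def average_zap_time(gop_seconds: float, decode_overhead: float = 0.5) -> float:
    """Rough average channel-change time for a compressed digital stream.

    A decoder can only start from a keyframe (I-frame), so a viewer who
    tunes in at a random moment waits, on average, half a GOP for the
    next keyframe. decode_overhead stands in for demuxing, decoding and
    display setup; the 0.5 s default is an illustrative assumption.
    """
    return gop_seconds / 2 + decode_overhead

# MPEG-2 broadcast with ~2 s GOPs lands in the "about two seconds" range:
print(f"2 s GOP -> ~{average_zap_time(2.0):.1f} s average zap")
# Stretching the GOP, as some early systems did, quickly degrades it:
print(f"8 s GOP -> ~{average_zap_time(8.0):.1f} s average zap")
```

The exact overhead figure matters less than the shape: halve the GOP and you roughly halve the zap time, which is essentially what fast channel-change schemes exploited by bursting the stream from a keyframe.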

What we wrongly call latency today – in reality, it is delay – also has a gold standard, set by decades of broadcast practice at around five seconds. So, for the bulk of use cases, i.e. watching live TV, when you are close to those five seconds, reducing the delay further is nice, but even ten seconds will not be a deal-breaker for most viewers. Sports fans have long known they can get a score a few seconds earlier on the radio, and we've always lived with that. When someone at the ground tweets a live score, that too arrives a few seconds late, but it feels real-time enough to a community bounded by the five-second delay of the main live video feed.

There's also an elephant in the low-latency room … have you guessed? … sustainability. Lower latency means more and faster caches. Client-side video buffering may be undesirable from a user perspective, but it has proven to be the best way to ensure robust delivery without deploying significant resources in the network. That buffering was the secret sauce of adaptive bitrate (ABR) streaming, and it enabled the internet's video revolution in the first place. As with every aspect of our lives that generates carbon emissions, we must ask ourselves what is good enough.
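
To make the robustness point concrete, here is a toy Monte Carlo sketch of the buffer-versus-latency trade-off. The throughput model and every number in it are illustrative assumptions, not measurements: each second the network delivers between 0.4x and 1.6x real time, and a session stalls when the client buffer (which caps the achievable latency) runs dry.

```python
import random

def stall_probability(buffer_target_s: float, session_s: int = 600,
                      trials: int = 2_000, seed: int = 42) -> float:
    """Fraction of toy sessions that stall at least once.

    Illustrative model: each second the network delivers between 0.4 s
    and 1.6 s of media (uniform, averaging real time), the player drains
    1 s of media per second, the buffer is capped at buffer_target_s,
    and a stall occurs when it runs dry.
    """
    rng = random.Random(seed)
    stalled = 0
    for _ in range(trials):
        buffer_s = buffer_target_s  # start with a full buffer
        for _ in range(session_s):
            fetched = rng.uniform(0.4, 1.6)  # seconds of media downloaded
            buffer_s = min(buffer_target_s, buffer_s + fetched) - 1.0
            if buffer_s < 0:  # nothing left to play: rebuffering
                stalled += 1
                break
    return stalled / trials

# A deeper buffer (i.e. higher latency) absorbs throughput swings:
for target in (2, 5, 10, 30):
    print(f"{target:>2} s buffer -> {stall_probability(target):5.1%} of sessions stall")
```

The numbers themselves are meaningless, but the shape is the point: shaving the client buffer from tens of seconds down to a couple of seconds trades robustness for latency, and the network must make up the difference with more and faster caches – which is exactly where the sustainability bill arrives.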

Of course, some exceptional video use cases will require ULL, such as betting, or professional uses like … yes, you've heard it before … telemedicine. Still, we're talking about live TV for the most part, and here it's only once every four years, when the World Cup is broadcast simultaneously over different platforms, that people care about those few seconds. The rights to most other major sports events that attract substantial live viewership are owned by a single broadcaster. The use case where part of the family watches a DTT broadcast while others stream the same content in the same house is too marginal to matter. Thirty-plus seconds of delay is becoming untenable for live sports, but mainstream video consumption doesn't have to go lower once we reach single-digit delays; I doubt users would flock from a 7-second-delay service to a 5-second one. Note that this opinion piece is just about the video use case; in other situations, ULL will enable new services that would otherwise have been impossible – just ask any 5G equipment vendor ;o)