Every year, a very technical term crops up in the coverage of major sporting events, one that has few friends and is considered the number one mood killer for any live audience: latency.
This refers to the delays affecting the various TV reception channels, which do not all deliver their signal at the same moment. Depending on the transmission channel – such as antenna (DVB-T2), digital cable or streaming services on the internet – different technical factors come into play, so goal celebrations erupt at different times. The term “live transmission” is, therefore, open to interpretation.
Until recently, the internet – with its streaming services, apps and pay-per-view offers – was considered the slowest channel; compared to the satellite signal, there was talk of delays of up to several minutes. At the time, Heise reported on the 2014 World Cup: “Those who use internet services to stay on the ball usually still have plenty of time to pull out their smartphone, start the app and then watch the goal ‘live’ that the others have already cheered for.” (German-language article)
But online is growing, and television is increasingly shifting to the internet. The technology is constantly being developed, and minimising latency is not just our day-to-day business as a streaming service provider, but a genuine passion. We are delighted when our know-how is successfully used in one live offering or another for major sporting events.
We will be happy to provide concrete examples and references on request. Initial measurements during sporting events in the summer of 2018 showed latency improving from what was previously 60 seconds to 5 seconds relative to the satellite signal (DVB-S2).
To optimise latency, we essentially focused on shortening the segments. For HTTP live streaming, the media content is divided into short individual video files of a fixed length (segments). These segments are delivered via HTTP and played back on the user’s end device (player) as an uninterrupted stream.
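To make the effect of segment length tangible, here is a back-of-envelope sketch in TypeScript. The figures are purely illustrative assumptions, not measurements from our platform: players commonly buffer a few segments before starting playback, so the segment length enters the end-to-end latency several times over.

// Rough estimate of HTTP live streaming latency as a function of
// segment length. Assumptions (not measured values): the player
// buffers about three segments before playing, and encoding,
// packaging and CDN delivery add a flat overhead.

interface LatencyModel {
  segmentSeconds: number;   // length of each media segment
  bufferedSegments: number; // segments the player holds before playing
  overheadSeconds: number;  // encoding, packaging and CDN overhead
}

function estimateLatency(m: LatencyModel): number {
  // A segment can only be published once it is fully recorded and
  // encoded, so its length contributes once on the server side,
  // plus once per buffered segment on the player side.
  return m.segmentSeconds * (1 + m.bufferedSegments) + m.overheadSeconds;
}

// Classic configuration: 10-second segments, 3 buffered -> ~42 s
console.log(estimateLatency({ segmentSeconds: 10, bufferedSegments: 3, overheadSeconds: 2 }));

// Shortened segments: 1-second segments, 3 buffered -> ~6 s
console.log(estimateLatency({ segmentSeconds: 1, bufferedSegments: 3, overheadSeconds: 2 }));

The sketch shows why shortening the segments was the main lever: every second cut from the segment length is saved several times over in the total delay.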
The delivery of a live stream also depends on the end user’s current connection conditions: the aim is to provide the optimal quality (highest possible bitrate) that the connection can sustain. The image quality of the stream therefore adapts to the user’s reception conditions. If the end device is connected via WiFi, the sharpest images with the highest quality are delivered. If the user is sitting on a train with poor reception, for example, the data rate is throttled and the quality is reduced.
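The following sketch illustrates the core idea behind this adaptive bitrate selection. The rendition bitrates and the 80% safety margin are assumptions for illustration only; real players use more elaborate throughput estimators, but the principle is the same.

// Minimal adaptive bitrate selection: pick the highest rendition
// that fits within the currently measured throughput, with some
// headroom so playback does not stall. All figures are hypothetical.

const renditionsKbps = [400, 1200, 2500, 5000]; // available quality levels

function pickRendition(measuredThroughputKbps: number): number {
  const safety = 0.8; // only spend 80% of measured bandwidth as headroom
  const budget = measuredThroughputKbps * safety;
  // Choose the highest bitrate within budget, else fall back to the lowest.
  const affordable = renditionsKbps.filter((b) => b <= budget);
  return affordable.length > 0 ? Math.max(...affordable) : renditionsKbps[0];
}

console.log(pickRendition(8000)); // fast WiFi -> 5000 kbps
console.log(pickRendition(900));  // train with poor reception -> 400 kbps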
The spread of HTTP streams with short segment lengths, however, brings with it a massive increase in requests to the delivering servers – and requires a high-performance platform to ensure smooth operation, which our colleagues at Akamai provide us with thanks to their “Media Services Live 4”.
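A quick calculation shows why the request volume grows so sharply; the viewer count and the assumption of one playlist refresh per segment are illustrative, not figures from our operations.

// Each player fetches roughly one media segment per segment length,
// plus a playlist refresh at about the same rate. Halving the segment
// length therefore doubles the request rate on the delivering servers.

function requestsPerSecond(viewers: number, segmentSeconds: number): number {
  const requestsPerViewer = 2 / segmentSeconds; // one segment + one playlist refresh
  return viewers * requestsPerViewer;
}

// 1 million viewers: 10 s segments -> 200,000 req/s; 1 s segments -> 2,000,000 req/s
console.log(requestsPerSecond(1_000_000, 10));
console.log(requestsPerSecond(1_000_000, 1));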
On the subject of latency, here are some exciting resources from our technology partners: