For more than a decade, Bambuser’s mobile video streaming service has been capable of streaming ultra-low latency video from mobile devices to massive numbers of viewers. We have always embraced low latency, which enables meaningful interaction between broadcasters and viewers. If viewers fall far behind the broadcaster, any interaction becomes difficult and unnatural.
As we now launch our implementation of Low-latency HLS (LHLS), we go into the future with an even more scalable, robust and cost-efficient streaming solution.
The challenges of live streaming
Live streaming has historically been a market with many competing formats. Some streaming formats have been proprietary, some have been platform specific or have required browser plugins (which are on their way out), and some have required server software from a specific vendor.
The first challenge in live stream distribution is that different devices have required different streaming formats. In order to reach viewers on many mobile platforms and web browsers, live streams have to be transcoded, in real-time, to different streaming formats.
Secondly, many video streaming standards have not included support for offering multiple video qualities to viewers. A video has a certain quality level, which is basically a result of the chosen video resolution and bitrate. If every viewer is served the same quality, viewers on slow connections may not be able to keep up, constantly seeing interruptions, buffering and possibly increasing latency, while viewers on very fast connections may see underwhelming video quality. To make all viewers happy, a live video needs to be made available in multiple qualities, and viewers need to be served the quality most appropriate for their connection. HLS is one of the few successful formats which offers this.
When video has to be output in multiple formats, each with multiple quality levels, a lot of server processing time is required. Distribution of these formats is further complicated by the fact that many of them don’t work over readily available content delivery networks (CDNs), which are built with the enormous scale of the Internet in mind. Instead, custom video servers have to be launched around the world to meet demand when a live event goes viral. It’s of course possible to launch any number of servers, but orchestrating an armada of servers comes with practical complexity, and consequently a cost.
Many streaming formats also require a persistent connection from viewers to a server for the whole lifetime of the view, which means that a server can’t be updated or terminated at any time without interrupting viewers, but has to be kept alive until all viewers have left voluntarily.
HLS solves some problems
HLS, short for HTTP Live Streaming, relies on the simplicity and strengths of the infrastructure in place to serve web pages.
Web pages are served by web servers which use HTTP. A client asks for a page, gets a limited amount of data from a server in a short amount of time, and the server is then free to move on. Often, the data is identical for all clients. This makes it trivial to scale up. As servers are added, they can be placed around the world, close to where clients are, and the data can be cached locally, so whenever new clients arrive, the data will be readily available on a server nearby.
In HLS, a live or on-demand video is split up into segments with a well-defined duration, resulting in short video files which are a perfect match for existing HTTP CDNs. Scaling is suddenly a solved problem, and thanks to the ubiquity of CDNs, this is the cheapest video delivery approach.
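For illustration, a minimal HLS media playlist for a live stream could look like this (the segment filenames and durations here are made-up example values, not from an actual Bambuser stream):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:100

#EXTINF:6.000,
segment100.ts
#EXTINF:6.000,
segment101.ts
#EXTINF:6.000,
segment102.ts
```

Each `#EXTINF` line declares a segment’s duration, and the file that follows is an ordinary HTTP resource, which is exactly what makes commodity CDN caching work.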
Viewers basically fetch a playlist where all segments are listed, then fetch segments as needed, and are not dependent on a long-lived connection to a specific instance of a custom streaming server. The HLS playlists can contain multiple qualities of each segment, so a player supporting HLS can choose the quality for each segment depending on current network conditions.
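The multiple qualities are advertised through a master playlist that points to one media playlist per quality level. A sketch, with assumed bitrates and resolutions chosen purely as examples:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8
```

A player measures how fast segments download and switches between these variant playlists at segment boundaries.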
Thanks to these properties, HLS has largely established itself as the most well-supported format for streaming video over the Internet, both live and on demand. Today, HLS can be played both on mobile platforms and in all modern web-browsers using a few clever conversion tricks.
The reason Bambuser avoids plain HLS for live video today is latency. High latency is an unfortunate consequence of how the HLS standard works. Standard HLS players assume that a segment listed in the playlist is complete and can be downloaded faster than real-time, as the player then uses the download time to choose an appropriate quality level. Due to this, plain HLS doesn’t support exposing a segment which is being produced in real-time. When a 6 second segment has to be produced before being published in the playlist, the beginning of the segment will be at least 6 seconds old when a player discovers it. To avoid interruptions and buffering, standard HLS players commonly buffer a few segments, easily causing a total latency of 20 seconds on a live stream. If viewers fall that far behind, interaction is no longer practical.
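As a back-of-the-envelope illustration of where those 20 seconds come from, here is a small sketch. The segment duration, buffer depth and overhead are assumed typical values, not measurements of any particular player:

```python
def hls_live_latency(segment_duration, buffered_segments, overhead=1.0):
    """Rough lower bound on live latency for plain HLS, in seconds.

    A segment is only listed in the playlist once it is fully produced,
    so the player effectively starts playback `buffered_segments`
    segments behind the live edge, plus some delivery and decoding
    overhead.
    """
    return buffered_segments * segment_duration + overhead

# With common 6-second segments and a player buffering 3 segments:
print(hls_live_latency(6.0, 3))  # → 19.0 (roughly the 20 s cited above)

# Shrinking segments to 2 seconds helps, but only so much:
print(hls_live_latency(2.0, 3))  # → 7.0, still far from sub-second
```

The second call shows why shorter segments alone can’t get latency low enough, which is the point made below about shrinking segment sizes.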
The latency problem in HLS can be alleviated slightly by shrinking the segment size from recommended values, but anything below a few seconds becomes impractical due to decreased video compression efficiency and an increasing number of segments to list and fetch, as each segment involves HTTP overhead. In the end, this still leaves the latency too high for us.
For these reasons, Bambuser has so far used various low-latency streaming formats and a custom CDN for live video, while using HLS as a fallback mainly for on-demand video. This has allowed us to offer both low latency and excellent scaling, with the trade-off being unable to use the cheapest commodity CDNs. This has served us well, but we are now ready to take the next step.
Low-latency HLS to the rescue!
Our solution is to extend HLS with support for playing segments which are being produced in real-time, and to provide video players which have full support for this extension of the standard. We are now offering Low-latency HLS, which maintains all the scalability and cost benefits of HLS while providing the low latency which Bambuser is known for.
Today, our Low-latency HLS playlists contain information for our player SDKs on where to find upcoming HLS segments, so a player can connect to them early and stream the segments as they are being produced, in real-time. The HLS playlists remain fully compatible with standard HLS players, so the extension brings no additional processing cost.
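Bambuser’s exact playlist extension isn’t shown in this post; as a sketch of the general idea, the community LHLS proposal announces in-flight segments with `#EXT-X-PREFETCH` tags, which standard HLS players simply ignore as unknown tags (all names below are illustrative):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:100

#EXTINF:6.000,
segment100.ts
#EXTINF:6.000,
segment101.ts
#EXT-X-PREFETCH:segment102.ts
```

An LHLS-aware player can open a request for `segment102.ts` immediately and receive its bytes over chunked transfer as they are encoded, while a standard player keeps working from the completed segments above, which is what preserves backward compatibility.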
Customers are able to adopt LHLS simply by integrating the updated versions of our player SDKs, with no additional configuration required. This lets us offer very competitive prices for customers that scale to very large audiences.