
Ultimate Guide to Low Latency Streaming

November 30, 2023
11-Minute Read
Senior Manager, Encoding

The video streaming industry is expected to grow substantially, projected to reach $420 billion by 2028. Success for streaming services depends on several factors, and low latency is among the most important: low-latency streams are crucial to a good viewer experience and will remain so as streaming becomes more common on popular platforms.

In this guide to low-latency streaming, we’ll cover the following:

  • The definitions of latency and low latency
  • The business case for low latency streaming
  • QoE optimization in low latency DAI workflows

Latency in video streaming

The term 'low latency' has become standardized within the streaming industry. In this context, latency refers to the interval between a live event's occurrence and its display on the viewer's screen.

What is low latency?

In video streaming, low latency describes a subjectively short glass-to-glass delay: the time it takes for chunks of data to travel from the source to the subscriber’s screen.

Another way of describing low latency streaming is “low delay streaming,” but what time frames are considered low or high?

Live streaming latency is often compared to broadcast. Traditionally, broadcast video has a delay of around five seconds. You barely notice it when watching an event on your TV or listening to the commentary on analog radio. Is it safe to say that, since consumers rarely complain about this 5-second delay, it's a generally acceptable delay? Possibly.

What about a 6, 10 or 15-second delay? To understand this, let’s dive into the protocols. 

Low-latency streaming protocols

The advent of low-latency streaming protocols marks a significant milestone in the evolution of live-streaming technology. These protocols are specifically designed to reduce the time lag inherent in traditional streaming, thereby enabling near-real-time content delivery. We delve into some prominent low-latency protocols reshaping the streaming landscape.

RTMP

Initially developed by Macromedia and later owned by Adobe, RTMP (Real-Time Messaging Protocol) was designed for high-performance audio, video and data transmission between a server and a Flash player. Although Flash Player's sunset has led to a decline in RTMP's use for playback, it remains widely used for stream ingestion due to its low latency capabilities.

SRT

SRT (Secure Reliable Transport) is an open-source transport protocol that delivers high-quality, secure, low-latency video across noisy networks. With its ability to recover from packet loss, secure content with AES encryption and reduce latency, SRT is gaining traction as a reliable choice for broadcasting in unpredictable network conditions. It is progressively replacing RTMP for live stream ingest into cloud delivery platforms.
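
SRT's resilience comes from buffering just enough time at the receiver for lost packets to be retransmitted. The Python sketch below shows how an operator might size that latency budget from measured round-trip time and packet loss. It is a minimal illustration, not tied to any particular SRT library, and the multiplier values are assumptions rather than numbers mandated by the SRT specification (only the 120 ms default latency comes from SRT itself).

```python
# Minimal sketch: sizing an SRT receiver latency budget from network
# conditions. The buffer must be large enough for retransmission round trips,
# so it is commonly expressed as a multiple of the RTT. Multiplier values
# below are illustrative assumptions, not mandated by the SRT specification.

def srt_latency_budget_ms(rtt_ms: float, packet_loss_pct: float) -> float:
    """Return a suggested SRT latency setting in milliseconds."""
    # Higher loss means more retransmission attempts must fit in the buffer,
    # so scale the RTT multiplier with measured loss (illustrative thresholds).
    if packet_loss_pct < 1.0:
        multiplier = 3
    elif packet_loss_pct < 5.0:
        multiplier = 6
    else:
        multiplier = 10
    # Never go below SRT's default latency of 120 ms.
    return max(120.0, multiplier * rtt_ms)


if __name__ == "__main__":
    # Example: a contribution link with 80 ms RTT and roughly 2% packet loss.
    print(srt_latency_budget_ms(rtt_ms=80, packet_loss_pct=2.0))  # 480.0
```

The trade-off is explicit: a larger latency budget absorbs more packet loss, while a smaller one keeps the ingest closer to real time.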

Low-latency HLS

Apple introduced low-latency HLS, an extension of the traditional HLS protocol, to reduce latency. It is based on the CMAF segment format and leverages short partial segments and preload hints to deliver video more quickly, while still taking advantage of HLS's extensive device support and adaptive streaming capabilities.
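
To make the idea of short parts and preload hints concrete, here is an illustrative Python sketch that prints a simplified low-latency HLS media playlist. The segment and part names are hypothetical; in practice, a packager generates these tags (EXT-X-PART, EXT-X-PRELOAD-HINT, EXT-X-SERVER-CONTROL) automatically.

```python
# Illustrative sketch of a low-latency HLS media playlist. Segment and part
# URIs are hypothetical; a real packager produces these tags automatically.

PART_DURATION = 0.333   # seconds per partial segment ("part")
SEGMENT_DURATION = 4.0  # seconds per full segment

playlist = "\n".join([
    "#EXTM3U",
    "#EXT-X-VERSION:9",
    f"#EXT-X-TARGETDURATION:{int(SEGMENT_DURATION)}",
    # Lets players block on playlist reloads and tells them how far behind
    # the newest part they should play (a small multiple of the part size).
    "#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.0",
    f"#EXT-X-PART-INF:PART-TARGET={PART_DURATION:.3f}",
    "#EXT-X-MEDIA-SEQUENCE:100",
    # A completed 4-second segment, as in classic HLS.
    f"#EXTINF:{SEGMENT_DURATION:.1f},",
    "segment100.m4s",
    # The segment currently being produced, exposed as short CMAF parts.
    f'#EXT-X-PART:DURATION={PART_DURATION:.3f},URI="segment101.part0.m4s",INDEPENDENT=YES',
    f'#EXT-X-PART:DURATION={PART_DURATION:.3f},URI="segment101.part1.m4s"',
    # Preload hint: the player may request the next part before it exists.
    '#EXT-X-PRELOAD-HINT:TYPE=PART,URI="segment101.part2.m4s"',
])

print(playlist)
```

Because the player can fetch sub-second parts and pre-request the next one, it can sit much closer to the live edge than it could with full multi-second segments.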

Low-latency DASH

Low-latency DASH (Dynamic Adaptive Streaming over HTTP) represents a collaborative effort to reduce latency in the DASH protocol. It is also based on the CMAF segment format. By using chunked transfer encoding, which allows segments to be delivered to and processed by the player before the entire file is available, low-latency DASH can significantly decrease the delay in content delivery.
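
The key player-side mechanism is HTTP chunked transfer encoding: the segment is consumed while it is still being written. The Python sketch below illustrates that behavior with a streaming HTTP request; the URL and the decoder callback are placeholders, not part of any specific player.

```python
# Rough sketch: consuming a low-latency DASH (CMAF) segment over HTTP chunked
# transfer encoding, processing bytes as they arrive rather than waiting for
# the whole segment. The URL and decoder callback are placeholders.
import requests

SEGMENT_URL = "https://example.com/live/video/segment-00042.m4s"  # hypothetical

def feed_to_decoder(data: bytes) -> None:
    """Placeholder for handing received CMAF chunks to a decoder buffer."""
    print(f"received {len(data)} bytes")

with requests.get(SEGMENT_URL, stream=True, timeout=10) as response:
    response.raise_for_status()
    # iter_content yields data as the origin flushes each CMAF chunk, so
    # playback can start well before the multi-second segment is complete.
    for chunk in response.iter_content(chunk_size=None):
        if chunk:
            feed_to_decoder(chunk)
```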

Latency is dependent on the type of application. Expected latency is not the same for a live talk show channel versus a sports event versus a gambling channel versus auctions. 

The business case for low latency video streaming

There are new ways to approach the stream delay issue, and video technology is evolving, with CMAF offering a toolbox used in both low-latency HLS and low-latency DASH. Today, video streaming has become reliable enough to carry commercial services such as Netflix, but latency is still a concern, especially for live events.

Let’s look at how and where latency plays a role in the quality of experience (QoE).

How latency affects the video experience

Let’s first look at how some different types of content may be affected by latency.

  • Non-linear content is only impacted by latency in that it can, in some cases, cause initial buffering before a stream starts. But even though startup time is an important ingredient of overall QoE, it should not be confused with end-to-end stream delivery latency.

  • Re-linearized content is similarly impacted by latency, which in this case is akin to channel-change time. Here again, what impacts the user experience during zapping is start-up time more than latency.

  • Interactive shows have traditionally not required low latency; the Eurovision Song Contest, for example, relies on telephone and SMS-based voting. However, a new genre of truly interactive shows, where participants interact in real time based on the stream, will require latency significantly below the roughly five seconds experienced in traditional broadcast.

  • Niche live sports are typically streamed on a single platform rather than broadcast simultaneously on several. One could argue that they are not affected by latency, but the ubiquitous use of social media changes the game: no one wants a key moment spoiled by a friend’s message about an important event in the game.

  • The Olympic Games are a unique case where latency does become an issue. In most cases, the national team’s matches are carried by a public broadcaster that’s freely available on all platforms, so the same event is watched over broadcast and streaming at the same time, and any extra streaming delay is immediately noticeable.

  • Betting is a unique use case where low latency is preferable, but the requirement is for a fully controlled and consistent latency.

Inherently real-time video applications, such as telemedicine, flying a drone or driving a massive screen at a live event, are niche applications that require latency in milliseconds, not seconds, and are beyond the scope of this post.

Do you need low latency? Latency requirements depend on the use case

Any operator offering the above use cases will eagerly embrace lower delay times for video streaming. However, the business case for spending significant money to reduce delay depends on the use case.

The one factor that applies to all use cases is social media. A post from a live event reaches followers with roughly the same five-second delay as a linear broadcast, so matching that delay becomes a key target for live streaming.

Although there are different technical approaches to reducing stream delay, most newer solutions use CMAF. Explaining these techniques requires a deep dive into packaging, chunks, origin servers, video players and CDNs.

Commercial products have been available for years and media processing and delivery providers have made their case through growing marketing efforts. Some operators, like the BBC, have been vocal about the issue. Still, operators are learning more about the business growth potential gained from lowering delay.

Beyond solving delay issues, expectations on video quality add a layer to broadcasters’ business challenges.

Second-screen experiences

Second-screen usage has become integral to the contemporary viewing experience. With an increasing number of viewers simultaneously engaging with additional digital devices while watching television, synchronization between the primary content and the supplementary device is paramount. This synchronization is particularly critical during live events like sports, where play-by-play action and social media commentary should occur without perceptible delay. Low latency is essential in these scenarios to ensure a cohesive and engaging second-screen user experience.

Betting and bidding

Integrating low-latency streaming into betting and bidding platforms is transforming these industries. In sports betting and online auctions, every second counts. Timely bets and bids can be the difference between winning and losing. As sports betting and media continue to converge, creating immersive experiences that reflect the live-action without delay is vital for viewer engagement and platform credibility. The deepening ties between these industries underscore the importance of real-time data transmission, making ultra-low latency streaming not just a convenience but a necessity.

Video game streaming and esports

The video game and esports sectors are predicated on the immediacy of player actions and reactions. Delays in streaming can disrupt gameplay, diminish the competitive edge and alienate viewers who expect real-time engagement. Low latency streaming is, therefore, beneficial and essential for maintaining the integrity of the gaming experience. It ensures that the audience and players experience the game as it unfolds without the frustrations of lag or buffering.

Video chat

Video chat has evolved from a personal communication tool into an integral feature of customer service and business operations. As companies increasingly rely on video conferencing to connect remote teams and conduct virtual meetings, the demand for low-latency video chat solutions has skyrocketed. In these interactions, any noticeable lag can disrupt the flow of conversation, hinder effective collaboration and reduce the perceived professionalism of the interaction. Low-latency streaming services address these concerns by facilitating smoother, more natural conversations akin to in-person discussions.

Remote operations

From telemedicine to industrial automation, remote operations rely on the instantaneous transmission of data and video feeds to operate machinery, provide medical consultations or control unmanned systems from afar. In these high-stakes environments, even the slightest delay can lead to miscommunication, errors or in the worst cases, accidents. Low latency streaming is thus a critical component that enables operators to make informed, real-time decisions based on live data.

Real-time monitoring

Across various industries, real-time monitoring systems are used for surveillance, process control, and safety. Whether monitoring traffic flow, overseeing production lines or ensuring public safety, the effectiveness of these systems hinges on their ability to deliver live feeds with minimal delay. Low latency streaming enables immediate response and intervention, which is crucial in preventing incidents or optimizing processes.

Interactive streaming and user-generated content

The rise of interactive streaming platforms and user-generated content has created new paradigms for audience engagement. Viewers are no longer passive consumers but active participants in content creation and interaction. Low latency streaming is fundamental in this interactive ecosystem, where audience feedback, live polling and user contributions are integral to the content. By minimizing delays, content creators can foster a more dynamic and inclusive environment that encourages viewer participation and builds community.

Now let’s look at an approach that is gaining momentum in the push for faster streaming.

Ultra-low latency video streaming

In the fast-paced and connected world of digital media, ultra-low latency video streaming is becoming increasingly crucial. This technology is essential in delivering live content with minimal delay, making it ideal for various applications where real-time interaction is key.

Definition and importance: Ultra-low latency streaming refers to delivering video content with the least possible delay, typically less than one second from camera to viewer. This is crucial in applications where even a slight delay can significantly impact the viewer experience, such as gaming, auctions and interactive broadcasts.

Technological advances: Advances in streaming protocols and network infrastructure have made ultra-low latency streaming more achievable. Technologies like Web Real-Time Communication (WebRTC) and Secure Reliable Transport (SRT) are at the forefront of this revolution.

Applications in various industries: Ultra-low latency streaming is transforming various sectors. In sports broadcasting, it lets fans see live action almost instantaneously, enhancing their engagement. In online gaming and esports, it ensures players and viewers experience the game without lag. In the financial sector, it enables real-time trading and bidding processes.

Challenges and solutions: Achieving ultra-low latency is technically challenging, involving optimizing the entire streaming pipeline from capture to playback. Solutions include using efficient encoding techniques, optimized network protocols and leveraging cloud-based technologies for scalability and flexibility.

The future of streaming: As consumer expectations for instantaneity grow, ultra-low latency streaming is becoming the norm rather than the exception. The ongoing rollout of 5G technology is expected to reduce latency further, making ultra-low latency streaming more accessible and reliable.

Ultra-low latency video streaming represents a significant leap in live streaming technology, opening new possibilities for real-time online interaction and enhancing the overall quality of viewer experience in various industries.

Bringing DAI and latency into focus

In the dynamic world of streaming video, the role of targeted advertising as a revenue booster is becoming more pronounced. According to recent insights from Digital TV Research, the global revenue from online TV episodes and movie streaming is expected to rise significantly. By 2029, the market is projected to reach $215 billion, a substantial increase from $162 billion in 2023. Within this growth trajectory, advertising-supported VOD (AVOD) is set to outpace SVOD in revenue growth.

This surge in AVOD, forecasted to reach $69 billion by 2029, highlights the increasing importance of ad-supported models in the streaming ecosystem. However, it's not just about the advertisements. The quality of streaming and VOD services, including aspects like low latency that align with live broadcast expectations, plays a critical role in retaining viewer satisfaction and engagement. As the industry evolves, striking the right balance between monetization through advertising and maintaining an exceptional quality of experience will be key to capturing and sustaining viewer interest.

Expert tips to optimize QoE in low latency DAI workflows

Dynamic ad insertion (DAI) allows swapping ads in linear, live and VOD content. DAI workflows enable you to personalize your video streaming services and offer a path to monetizing your content with targeted ads and advanced ad insertion capabilities.

DAI workflows stitch advertising content into your video streams and add more steps to the media processing chain. Whenever the workflow is impacted by additional video processing, it becomes important to ensure that the viewer experience remains, at the least, unchanged or, even better, improved.

The perceived quality of experience can both positively and negatively impact viewer behavior, either keeping subscribers watching your video services for longer periods or creating waves of canceled subscriptions. DAI workflows help create unique experiences tailored to subscribers’ needs and expectations.

However, when you leverage a DAI workflow, latency becomes a risk factor for the perceived quality of experience. A noticeable delay between a live video stream and the broadcast is the kind of streaming experience you want to avoid. 

Here are four tips to leverage DAI while delivering low-latency video. 

1. Leverage CMAF for low latency

Using the Common Media Application Format (CMAF) is one solution if you want to deliver low-latency video. CMAF is a media file format that provides a common workflow for delivering live content in MPEG DASH and Apple HLS. As a standardized format, CMAF is backed by an entire ecosystem of more than 60 companies.

CMAF offers the option to create sub-segment entities called CMAF chunks, often called low latency chunks (LLC). These chunks can typically be in the range of 100 ms, allowing a more progressive delivery of the segments. Delivering the video using smaller entities makes it possible to achieve an end-to-end delivery latency of 5 seconds (typical broadcast latency) without compromising the video quality. 
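
As a back-of-the-envelope illustration of why smaller delivery units shrink glass-to-glass delay, the sketch below adds up a simplified latency budget for whole-segment delivery versus CMAF chunked delivery. Every number is an assumption chosen for illustration, not a measurement.

```python
# Simplified glass-to-glass latency budget (all values are illustrative
# assumptions). With whole segments, each stage waits on full segments;
# with CMAF chunks, it waits on ~100 ms chunks instead.

SEGMENT = 4.0   # segment duration in seconds
CHUNK = 0.1     # CMAF chunk duration in seconds
ENCODE = 1.0    # encoder and packager processing
NETWORK = 0.3   # CDN and network transfer overhead

def glass_to_glass(delivery_unit: float, buffered_units: int) -> float:
    # One unit must be completed before it can leave the origin, plus the
    # units the player buffers before starting playback.
    return ENCODE + NETWORK + delivery_unit + buffered_units * delivery_unit

# Classic HLS/DASH: players often buffer about three full segments.
print(f"segment-based delivery: ~{glass_to_glass(SEGMENT, 3):.1f} s")  # ~17.3 s
# CMAF low latency: the player holds only a few seconds' worth of chunks.
print(f"chunked CMAF delivery:  ~{glass_to_glass(CHUNK, 30):.1f} s")   # ~4.4 s
```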

Another important value of using CMAF is its promise of having a common delivery workflow. You can avoid the cost of duplicate caching for HLS and DASH segments, which is common today. With CMAF, because you only have one set of media files that can be used for HLS and DASH, you don’t need to duplicate caching in the CDNs.

This benefit of CMAF is still a work in progress in terms of being translated into actual production workflows. The industry is currently working on the last details toward using a single ecosystem to deliver live content in DASH and HLS instead of needing two separate ecosystems. 

2. Deploy simple blackout management tools

Streaming services serve diverse audiences across multiple devices, making it essential to manage regional viewing rights effectively. Simple yet effective tools for managing geolocation-based restrictions, commonly called 'blackouts,' are indispensable.

Primarily utilized for live content such as sports broadcasts, these tools enable service providers to restrict access to programs based on the viewer's location. This ensures compliance with regional broadcasting rights and prevents viewers from accessing content in areas where the service provider does not hold distribution privileges. These geo-restriction measures are key to maintaining content exclusivity and honoring licensing agreements across different territories.

Blackout management solutions can simplify this process, handling event scheduling and ensuring you accurately ingest substitute content and slates. They’ll also provide you with an updated manifest to ensure legal commitments are fulfilled.

To efficiently manage and configure a blackout, it’s important to be able to substitute segments dynamically. The dynamic aspect is essential because while ads are planned for a specific time slot, they can be switched instantly based on predefined data to best target the customer. 

A DAI solution that supports blackout management through manifest manipulation allows you to dynamically change or customize the manifest. With advanced manifest manipulation, you can, for instance, create custom manifests per user in real time.
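
As a simplified sketch of what that manipulation can look like for a blackout, the function below rewrites an HLS media playlist, replacing segment URIs that fall inside a blackout window with slate segments of the same duration. Real DAI systems do this per session at the manifest-manipulation layer; the playlist content, slate naming and window logic here are illustrative assumptions.

```python
# Simplified sketch of blackout handling via manifest manipulation: segments
# whose playback time falls inside the blackout window are replaced with a
# slate segment of equal duration. Playlist and naming are illustrative.

def apply_blackout(playlist: str, blackout_start_s: float, blackout_end_s: float,
                   slate_uri: str = "slate_4s.ts") -> str:
    out, elapsed, pending_duration = [], 0.0, None
    for line in playlist.splitlines():
        if line.startswith("#EXTINF:"):
            # Remember the duration announced for the next segment URI.
            pending_duration = float(line.split(":")[1].split(",")[0])
            out.append(line)
        elif line and not line.startswith("#") and pending_duration is not None:
            # Segment URI line: swap it if it overlaps the blackout window.
            in_blackout = (elapsed < blackout_end_s
                           and elapsed + pending_duration > blackout_start_s)
            out.append(slate_uri if in_blackout else line)
            elapsed += pending_duration
            pending_duration = None
        else:
            out.append(line)
    return "\n".join(out)


if __name__ == "__main__":
    playlist = "\n".join([
        "#EXTM3U", "#EXT-X-TARGETDURATION:4", "#EXT-X-MEDIA-SEQUENCE:0",
        "#EXTINF:4.0,", "seg0.ts",
        "#EXTINF:4.0,", "seg1.ts",
        "#EXTINF:4.0,", "seg2.ts",
    ])
    # Black out the 4-12 second window: seg1 and seg2 are replaced by slate.
    print(apply_blackout(playlist, blackout_start_s=4.0, blackout_end_s=12.0))
```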

A solution that provides advanced capabilities for blackout management via manifest manipulation will give you the most flexibility to personalize the user experience and deliver the relevant content viewers crave.

3. Deploy a cloud solution for additional flexibility

If you want to achieve the highest possible level of flexibility for the entire DAI workflow, run it in the cloud. Having the entire end-to-end workflow on the cloud eliminates unnecessary deployment complexity and speeds up the launch of video streaming services. A cloud-based DAI approach will give you the flexibility to dynamically substitute ads within live, linear, catch-up TV and recorded content.

You can run an entire DAI workflow on the cloud, including content-adaptive bitrate (ABR) transcoding, ad transcoding, packaging, and manifest generation and manipulation.

The cloud's inherent adaptability significantly enhances the scalability of your services. This scalability is vital, as it directly influences the consistency and quality of the user experience. A cloud-based DAI framework provides the elasticity required to accommodate an expanding audience without degradation in service. It generates and manages individualized manifests for each viewer, no matter the size of the viewership.

This level of personalization ensures a tailored viewing experience and expedites the monetization process. By leveraging cloud capabilities, you can swiftly implement targeted advertising strategies, rapidly capitalizing on your content and streamlining revenue generation.

4. Reduce workflow and deployment complexity with a pre-integrated solution

Even after you deploy a DAI solution that supports CMAF, with simple blackout management tools running on the cloud, many different elements are still involved with delivering targeted advertising to end-users, including integration with ad servers, clients and players.

Leveraging a DAI solution that offers an open architecture and an extensive range of pre-integrated ecosystem partners will help to speed up and uncomplicate that process. 

Get a DAI solution with low latency

Harmonic’s DAI solutions benefit from our close working relationship with the essential vendors in the vast DAI ecosystem. As an active member of multiple standards organizations, including DASH-IF, Streaming Video Alliance and CTA Wave, we are ready to integrate advanced capabilities into our VOS360 cloud video streaming and delivery platforms when the ecosystem becomes mature. 

Contact us to discuss any questions you may have or to learn more about our industry-leading VOS360 Media SaaS solution and how to optimize your DAI workflows for low latency. 
