
How to Improve QoE for Broadcast Quality Live OTT Streaming – Part 1

May 14, 2019
3-Minute Read
Vice President, Video Strategy

OTT has become so mainstream that even live content is now available from video streaming providers. In the U.S., among Sling, DirecTV Now, Hulu, YouTube and Sony Vue, there were more than 9 million OTT subscribers at the end of 2018, according to a Fierce Video report. Yet quality is sometimes an issue, and that's a problem because consumers expect the same video QoE from OTT as from broadcast TV.

Last year's World Cup was a great example of the limitations of live OTT experiences. Viewers observed low picture quality on TV sets and a delay of more than 30 seconds compared with broadcast services. If those problems aren't solved, they could stall the growth of live OTT offerings. This blog analyzes some of the solutions available today to enhance QoE for live OTT streaming.

 

Why CAE is an Intelligent Compression Approach

One of the ways service providers are battling QoE issues for OTT is through advanced compression methods. Content-aware encoding (CAE), a per-title encoding technique currently used by Netflix, is one such method that supports both VOD and live applications.

What makes CAE unique?

CAE assesses the video complexity in real time and adjusts the encoding parameters to provide the best picture quality. It works similarly to VBR for statmux, except that only one program is encoded and the video quality measurement is more refined, since it is based on a Human Visual System model. To make that quality measurement more accurate, the live CAE system is trained offline using artificial intelligence techniques.
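
The principle can be sketched in a few lines of Python. This is an illustrative toy, not Harmonic's or Netflix's actual algorithm; the complexity score, its weights and the bitrate bounds are invented for the example.

```python
# Toy sketch of content-aware encoding: a per-segment complexity score
# drives the bits actually spent, instead of a fixed CBR target.

def estimate_complexity(segment_stats):
    """Hypothetical complexity score in [0, 1] built from encoder
    statistics such as motion and spatial detail."""
    return min(1.0, 0.6 * segment_stats["motion"] + 0.4 * segment_stats["detail"])

def target_bitrate_kbps(complexity, floor_kbps=1500, ceiling_kbps=6000):
    # Simple content stays near the floor; complex sports content climbs
    # toward the ceiling instead of wasting a fixed CBR budget.
    return int(floor_kbps + complexity * (ceiling_kbps - floor_kbps))

# A quiet studio shot vs. a fast camera pan
print(target_bitrate_kbps(estimate_complexity({"motion": 0.1, "detail": 0.2})))  # 2130 kbps
print(target_bitrate_kbps(estimate_complexity({"motion": 0.9, "detail": 0.8})))  # 5370 kbps
```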

We at Harmonic think CAE is the real deal. It's backed by Apple, Netflix and the Ultra HD Forum, which in 2018 demonstrated consistent savings of 40 percent versus CBR for UHD ABR using CAE.

 

How CMAF Addresses Latency Issues

One of the biggest problems facing OTT QoE is end-to-end latency. With current HLS or DASH implementations, the end-to-end delay ranges from 30 to 90 seconds. That's far too long, especially for live sports streaming!

What’s the solution?

There are several ways to attack the latency problem: reduce the segment size, move to a different protocol like WebRTC, or use MPEG CMAF. However, reducing the segment size (down to one second) would multiply the number of requests and the network traffic, may hurt QoE because TCP has less time to react within each shorter segment, and has so far not been deployed in any commercial system. WebRTC isn't cacheable with off-the-shelf CDN servers, requires dedicated infrastructure and has not been proven to scale to millions of concurrent sessions.
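
To see why segment duration dominates latency, a back-of-the-envelope model helps. The component values below (encode time, CDN propagation, number of buffered segments) are assumed for illustration, not measurements.

```python
# Rough latency model for classic segment-based HLS/DASH delivery.

def end_to_end_latency_s(segment_s, buffered_segments=3, encode_s=2.0, cdn_s=1.0):
    # Players typically buffer a few full segments before playback starts,
    # so latency grows roughly linearly with segment duration.
    return encode_s + cdn_s + segment_s * buffered_segments

print(end_to_end_latency_s(6.0))  # 21.0 s with 6-second segments
print(end_to_end_latency_s(1.0))  # 6.0 s, but 6x more requests per player
```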

MPEG CMAF has emerged as a solid approach that not only reduces latency for OTT but also solves scalability issues. MPEG CMAF is based on fMP4 (ISOBMFF) and can be used by both MPEG-DASH and HLS delivery formats. Even better, MPEG CMAF includes a Low Latency Chunk (LLC) option that enables operators to deliver a segment in small chunks (e.g., 200 ms) before the full segment has been encoded, bringing the delay of OTT closer to broadcast.
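
On the wire, low-latency CMAF is still plain HTTP: the player starts reading a segment while the packager is still producing it. The sketch below shows the client-side idea in Python; the URL is a placeholder, and the network read sizes are not aligned to actual CMAF chunk boundaries in this simplified example.

```python
# Sketch of the CMAF low-latency idea from the player's point of view:
# read and decode pieces of a segment as they arrive over an ordinary
# HTTP chunked-transfer response, instead of waiting for the whole file.

import requests

SEGMENT_URL = "https://cdn.example.com/live/video/seg_001.cmfv"  # placeholder

def fetch_segment_in_chunks(url, read_bytes=64 * 1024):
    with requests.get(url, stream=True) as resp:
        resp.raise_for_status()
        for chunk in resp.iter_content(chunk_size=read_bytes):
            # Each CMAF chunk (moof + mdat) can be handed to the decoder
            # long before the full segment is available.
            yield chunk

for fragment in fetch_segment_in_chunks(SEGMENT_URL):
    pass  # feed fragment to the demuxer/decoder here
```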

In the trials Harmonic ran in 2018, measured latency was between 6.5 and 9 seconds using MPEG-DASH LLC across a variety of deployment environments (on-premises, private cloud, public cloud), networks (managed, unmanaged) and access types (fixed, wireless).

 

A New Twist on CDN

CDN is proven to work at scale for the most demanding events (e.g., the Super Bowl, Olympics, World Cup). To be efficient, a CDN needs enough streaming capacity as well as servers located as close as possible to the clients. It's no secret that some CDNs perform better than others at certain times of the day and in certain locations. That's why there's a burgeoning market for CDN selector technology.
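
A CDN selector can be as simple as scoring each candidate CDN from recent player measurements and routing new sessions to the best performer. The sketch below is a toy illustration; the measurement structure, CDN names and thresholds are all invented for the example.

```python
# Toy CDN selector: choose the CDN with the best recent throughput for
# the viewer's region, falling back to a default when no data exists.

measurements = {
    ("us-west", "cdn_a"): {"throughput_mbps": 42.0, "error_rate": 0.002},
    ("us-west", "cdn_b"): {"throughput_mbps": 35.5, "error_rate": 0.010},
}

def pick_cdn(region, candidates=("cdn_a", "cdn_b"), default="cdn_a"):
    scored = []
    for cdn in candidates:
        stats = measurements.get((region, cdn))
        if stats and stats["error_rate"] < 0.05:
            scored.append((stats["throughput_mbps"], cdn))
    return max(scored)[1] if scored else default

print(pick_cdn("us-west"))   # cdn_a in this example
print(pick_cdn("eu-north"))  # default when no measurements yet
```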

Apart from CDN selector solutions, we're also seeing a fresh take on CDNs: CDNs built inside the operator's own network, using either off-the-shelf technology from CDN vendors or technology the operator develops itself, as Comcast has done. This approach lets the operator dedicate the CDN to specific types of traffic, offload overflow to commercial CDNs and place caching servers as close as possible to subscribers to ensure the best QoE. The drawback? It's capex-intensive because you have to deploy the network and opex-intensive because you have to operate the CDN 24/7.

 

Taking Video Streaming Quality to the Next Level

Combined, CAE, CMAF and CDN caching are set to have a significant impact on the future of ABR delivery, enabling QoE and end-to-end latency to match those of broadcast delivery on a managed network. On an unmanaged network, OTT delivery is still a challenge, as many parameters are out of the service provider's control. This will be something to keep an eye on as the OTT delivery environment continues to develop.
