Blog

3 Challenges Broadcasters Can Solve Using Big Data

Written by Chris Pattinson | April 1, 2019

We’ve all heard the term “big data.” But what is it, and how does it apply to the broadcast industry? Big data refers to the massive volume of data that businesses, including broadcasters, receive every day.

Big data alone isn’t necessarily advantageous to broadcasters. But when analytics and machine learning are applied to it, big data becomes a practical tool for cutting costs and improving service reliability.

This post examines three key challenges broadcasters face and shows how big data can help address them.


Challenge 1: Controlling Manual Labor Costs

One area where big data can benefit broadcasters is in controlling manual labor costs. As systems scale, with more channels and workflows to monitor, machine learning can take over some of the monitoring responsibilities previously handled by humans, freeing operators to perform other tasks.

Before machine learning techniques, broadcasters relied on the “eyes on glass” method of monitoring. Its main drawback is cost: it requires several people to monitor the video output continuously, 24 hours a day, 365 days a year.

There’s a trade-off.

Cut back on monitoring staff, and you risk issues going undetected; keep a full monitoring staff, and costs climb.

Furthermore, there are additional costs involved in pulling and analyzing logs to pinpoint exactly where a problem is occurring and then taking corrective action.
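
To make that concrete, here is a minimal sketch of the kind of automated monitoring that analytics can layer on top of big data. It is an illustration only: the channel names, the metric (dropped frames per minute), and the threshold are all hypothetical, not taken from any particular broadcast system.

```python
# Hypothetical sketch: flag channels whose telemetry drifts from a
# historical baseline, so operators only review flagged channels
# instead of watching every output around the clock.
from statistics import mean, stdev

def find_anomalies(history, current, threshold=3.0):
    """Return channels whose current metric is more than `threshold`
    standard deviations away from its historical mean.

    history: dict mapping channel id -> list of past metric samples
             (e.g., dropped frames per minute)
    current: dict mapping channel id -> latest metric sample
    """
    anomalies = []
    for channel, samples in history.items():
        if len(samples) < 2:
            continue  # not enough data to establish a baseline
        mu, sigma = mean(samples), stdev(samples)
        if sigma == 0:
            continue  # perfectly flat baseline; skip the z-score test
        z = abs(current[channel] - mu) / sigma
        if z > threshold:
            anomalies.append((channel, z))
    return anomalies

# Example: "news-hd" suddenly drops far more frames than usual.
history = {"news-hd": [2, 3, 2, 4, 3], "sports-hd": [1, 1, 2, 1, 1]}
current = {"news-hd": 40, "sports-hd": 1}
print(find_anomalies(history, current))  # [('news-hd', ...)]
```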


Challenge 2: Mitigating Service Outages While Scaling Up

Identifying the source of a service outage is expensive in both money and time. Using big data, broadcasters can achieve and maintain very high levels of service reliability while scaling up to deliver more channels.

Without help from big data analytics, identifying a problem can take much longer, especially when a fixed number of operators must monitor a system that is continually expanding to accommodate more channels or services. That means outages can last longer for end users, creating viewer dissatisfaction.

A larger system with hundreds of channels is typically more complex, since it requires additional layers of networking infrastructure. In a system that big, imagine how long it would take to pull the logs and work out where the problem is occurring.
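
As a rough illustration of automated log triage, a first pass might simply aggregate errors by component across every channel’s logs, so the likely fault location surfaces in seconds. The log format and component names below are assumptions made for the sketch, not a real system’s schema.

```python
# Hypothetical sketch: aggregate errors by component across all channel
# logs so the likely fault location surfaces immediately, rather than
# after manually pulling and reading logs channel by channel.
from collections import Counter

def rank_suspect_components(log_lines):
    """Count ERROR lines per component and rank the noisiest first.

    Assumes a simple 'LEVEL component: message' log format, which is
    an assumption for illustration only.
    """
    errors = Counter()
    for line in log_lines:
        parts = line.split(maxsplit=2)
        if len(parts) >= 2 and parts[0] == "ERROR":
            errors[parts[1].rstrip(":")] += 1
    return errors.most_common()

logs = [
    "INFO encoder-03: segment written",
    "ERROR packager-01: manifest update timed out",
    "ERROR packager-01: manifest update timed out",
    "ERROR cdn-edge-07: origin unreachable",
]
print(rank_suspect_components(logs))
# [('packager-01', 2), ('cdn-edge-07', 1)]
```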


Challenge 3: Maintenance Efficiency

Another benefit of analyzing big data in the broadcast environment is the ability to perform maintenance quickly and efficiently. This includes system software upgrades and the introduction of new features.

Without big data, manual testing and validation using the traditional “eyes on glass” methodology requires multiple rounds of video/audio review for any configuration change or optimization.

This significantly increases the cost of testing and deploying upgrades or configuration changes for existing and new features. Big data analytics lets you quickly and efficiently compare the “heartbeat” pattern of system behavior before and after a change, across the entire system, to identify any shift in performance with minimal manual effort.

You can rapidly identify the point of failure or area of concern, and once remediation is applied, confirm that the overall system has returned to its expected operational behavior.
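
As a sketch of that before-and-after comparison, one simple approach is to summarize a key metric per subsystem over equal windows before and after a change, then flag any subsystem whose behavior shifts beyond a tolerance. The subsystem names, metric, and tolerance here are illustrative assumptions, not a prescribed method.

```python
# Hypothetical sketch: compare each subsystem's "heartbeat" (here, the
# mean of a key metric over a window) before and after an upgrade, and
# flag any subsystem whose behavior shifted beyond a tolerance.
from statistics import mean

def compare_heartbeats(before, after, tolerance=0.10):
    """Flag subsystems whose post-change mean deviates from the
    pre-change mean by more than `tolerance` (fractional change).

    before/after: dict mapping subsystem -> list of metric samples
                  (e.g., segment latency in ms) from equal windows.
    """
    regressions = []
    for subsystem, baseline in before.items():
        b, a = mean(baseline), mean(after[subsystem])
        change = (a - b) / b
        if abs(change) > tolerance:
            regressions.append((subsystem, round(change, 3)))
    return regressions

before = {"encoder": [42, 40, 41], "packager": [15, 16, 15]}
after = {"encoder": [43, 41, 42], "packager": [24, 25, 26]}
print(compare_heartbeats(before, after))
# [('packager', 0.63)] -> the packager slowed down after the upgrade
```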


In the end…

Big data is a bit like the “cloud”: on its own, it doesn’t solve business problems for broadcasters.

It’s the application that matters.

When analytics and machine learning are applied to big data, broadcasters can reduce manual labor costs related to monitoring, scale up their services while maintaining high reliability, and improve maintenance efficiency.


--

If you are interested in learning more about how big data can be applied to increase service reliability, Chris Pattinson, VP of Global Software Quality, will be presenting his paper, “Using Big Data and Workflow Redesign to Enhance Broadcast Service Availability,” on Tuesday, April 9, at NAB 2019.