SCMS 2018: How Machine Learning Can Make Sports Workflows More Efficient

The key is applying the vast amounts of data in a meaningful way

Artificial intelligence may be the hottest topic in any industry right now, but how are sports-content producers using machine learning in real-world scenarios? That was the focus of a panel discussion at SVG’s Sports Content Management & Storage Forum last week.

Nick Gold, lead technologist, Chesapeake Systems, opened the conversation by introducing machine learning and its role in applying data to workflows and tasks in a meaningful way. One of the issues raised concerned the vast amount of data pumping through asset-management and other systems.

The AI/Machine Learning panel at SCMS 2018: (from left) Veritone Media’s Drew Hilles, Google Cloud’s Todd Reedy, University of Notre Dame’s Scott Rinehart, CatDV’s Jeremy Strootman

“More data is not good data,” said Scott Rinehart, director, broadcast technology, University of Notre Dame. “At some point, you need to stop and sift through the data.”

Can machine learning solve that challenge of separating good data from bad? Ironically, people will still need to be involved at some point, teaching the machine to better understand what matters and what doesn't.

“A lot of data is useless, and you have to educate people how to parse the data and get rid of the noise,” said Jeremy Strootman, VP, business development, North America, CatDV. With the noise gone, the team can turn to the important task of figuring out whether the data can be used to save money by streamlining processes or make money by creating new revenue streams.

“We are in a big education phase,” said Drew Hilles, SVP/GM, Veritone Media. One part of that phase is making sure clients are realistic about the capabilities of AI. Automated captioning, for example, may be only 90% accurate, but, in some cases, 90% accuracy may be good enough.

Todd Reedy, customer engineer, Google Cloud, noted a common misunderstanding among those who turn to machine learning: they may not understand the technology as well as they understand upgrading a familiar piece of equipment, such as a switch.

“The industry is at the learning phase of how we wrangle extra data into something more meaningful,” he added. For example, by using AI and machine learning to make some of the more generic editing or clipping decisions (such as identifying a scoring play or a penalty), the creative team can focus on finding additional material that adds emotion to the piece.

Rinehart said the Notre Dame Studios team is trying to figure out how to take a very large dataset and apply it to the rest of the university. In the athletic department, it helps creative teams find coach reactions, player reactions, or crowd shots more easily. And, with automated transcription, an interview can become searchable within minutes of its conclusion.
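A transcript becomes searchable only once it is indexed. The panel did not describe any specific implementation, but as a minimal sketch, an inverted index over timestamped transcript segments (a hypothetical input format; real transcription services vary) could look like this:

```python
from collections import defaultdict

def build_index(segments):
    """Map each lowercase word to the (start_seconds, text) segments
    containing it. `segments` is a list of (start_seconds, text) pairs,
    e.g. output from an automated-transcription service (hypothetical
    format for illustration)."""
    index = defaultdict(list)
    for start, text in segments:
        # De-duplicate words within a segment, strip trailing punctuation
        for word in set(text.lower().split()):
            index[word.strip(".,!?")].append((start, text))
    return index

# Hypothetical transcript of a post-game interview
segments = [
    (0, "The defense really stepped up tonight"),
    (12, "Our quarterback read the coverage perfectly"),
    (25, "That penalty in the third quarter changed the momentum"),
]

index = build_index(segments)
print(index["penalty"])
# -> [(25, 'That penalty in the third quarter changed the momentum')]
```

A production system would layer stemming, phrase search, and speaker labels on top of this, but the core idea — word, timestamp, jump to the moment — is the same.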

“If we can solve that problem in one part of the university,” he added, “how can it apply to other areas, like for lectures so that students can search against it?”

There is also the possibility of applying multiple machine-learned databases to content for an even greater level of specificity. For example, a system that identifies an injury could hand that information to another system that provides more detail about treating it.
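Chaining models in this way is essentially a pipeline: the output tag of one stage keys a lookup or a second model in the next. A toy sketch, with placeholder logic standing in for the real models (the names, fields, and lookup table here are all hypothetical, and the notes are illustrative, not medical guidance):

```python
def detect_injury(clip_metadata):
    """Stand-in for an ML model that classifies an injury from a clip.
    Placeholder logic: reads a pre-labeled field instead of running
    inference."""
    return clip_metadata.get("injury_label")

# Illustrative second-stage lookup keyed by the first stage's output
TREATMENT_NOTES = {
    "hamstring strain": "rest, ice, compression, gradual return to play",
    "concussion": "remove from play, follow return-to-play protocol",
}

def enrich(clip_metadata):
    """Chain the stages: detect an injury, then attach treatment info."""
    injury = detect_injury(clip_metadata)
    if injury is None:
        return clip_metadata
    return {**clip_metadata, "treatment": TREATMENT_NOTES.get(injury, "unknown")}

clip = {"id": "play-0412", "injury_label": "hamstring strain"}
print(enrich(clip)["treatment"])
# -> rest, ice, compression, gradual return to play
```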

And the machines are increasingly a quick study. Hilles noted that Veritone Media recently took 1,400 facial images for CNBC and used machine learning to train a system to recognize all of them in 60 seconds.