AI and Broadcasting: The Best Use Case

If you’ve read my writing before, you know I love to pick a topic and rant. Been doing it since my Television Broadcast magazine editing days. So here we go. But first, a little introduction…

If you attended IBC 2019, or read about the latest developments at the show, you probably ran across a few things about artificial intelligence (AI) and production, probably from the BBC.

In fact, the IBC Best Conference Paper Award went to a team from BBC R&D for its paper AI in production: video analysis and machine learning for expanded live events coverage. The paper details a project known as “Ed.” Ed is a prototype system that uses AI to create “near-live” content with a minimal crew (read as: you still need people…for now). One example given: a live production shoot with three unmanned 4K cameras that Ed would control and switch to produce a suitable HD program.

The BBC has been showing off AI-related technologies at IBC since 2017. And the industry talk about AI and machine learning (ML) has only increased during the years since. AI and ML can be used for production, post, news, and more.

In fact, here’s the latest from BBC R&D on AI, first published on the BBC’s blog on September 24, 2019 and recently updated on November 22, 2019, entitled AI & Auto Colourisation – Black & White to Colour with Machine Learning:

Recent advances in deep learning have enabled the automatization of many traditional production tasks that have the potential to transform the way BBC makes its programmes. Here at BBC Research & Development, we are researching how the quality of video could be enhanced by artificial intelligence and in particular how video can be automatically colourised using some of the most recent breakthroughs in machine learning. As a result of our research, we are proposing a new and original algorithm that is capable of performing this task even more efficiently, making images and videos look more colourful and realistic.

As anyone who has watched colorized shows from the 1970s can tell you, colorization was initially, to put it mildly, awful. But the technology has improved. We’ve even seen colorized versions of I Love Lucy and The Dick Van Dyke Show air during prime time on CBS in the US. And they look…great. They even got Lucy’s hair exactly the right color. (You can read about how I Love Lucy was colorized here.) The BBC proposes to automate the colorization process with AI, but keep in mind that there was a lot of back-and-forth between human beings at CBS and West Wing Studios about the color palette before technicians even began the 45- to 60-day colorizing process.

While that makes for great entertainment, there might be a way to make your viewers happier (or at least less angry with you). What I’m going to propose is a solution to a problem (especially with national cable networks) that has most likely never pushed a viewer to change the channel. Viewers still watch; they’re just not happy about what you’re doing.

Here’s the rant part.

We’ve all seen it. We’ve all hated it. The lower-third or subtitle being “stepped on” by a promotional graphic. Not the “hidden in the corner” semi-transparent network bug, but a full-blown opaque (and possibly moving) promo graphic that leaves viewers wondering who’s talking, where they’re from, and why you would make it impossible to read the subtitles for the guy whose English is hard to follow (which is why he’s subtitled in the first place). OMG, in the age of IP and 4K, we still can’t manage to not cover content? Really?

My proposal, free for any company to use (mainly because it’s so obvious that it’s probably not patentable, but should have been put in place by now): Use AI to stop this stupidity.

It’s simple. An algorithm is used to determine a few simple things before a promo is put on-screen during a program:

  • Is there a lower-third in the frame?
  • Is there a subtitle in the frame?
  • Has the person now on frame been seen earlier in the program (using facial recognition to determine whether a lower-third or caption has already been shown for this person)?
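The three checks above boil down to a simple gate that runs before any promo is keyed. Here is a minimal sketch in Python; the detector functions are stand-ins (assumptions on my part) for whatever computer-vision, OCR, and facial-recognition models a real system would use:

```python
# Illustrative promo gate. The per-frame detectors below are stubs
# standing in for real CV/OCR/face-recognition models; a "frame" here
# is just a dict of precomputed detection results.

def has_lower_third(frame):
    # Stand-in for text/graphic detection in the lower third of the frame
    return frame.get("lower_third", False)

def has_subtitle(frame):
    # Stand-in for OCR over the caption region
    return frame.get("subtitle", False)

def promo_allowed(frame, seen_faces):
    """A promo may run only if no lower-third or subtitle is on screen,
    and any face in frame has already been identified for viewers."""
    if has_lower_third(frame) or has_subtitle(frame):
        return False
    face = frame.get("face_id")  # stand-in for facial recognition output
    if face is not None and face not in seen_faces:
        return False  # viewer hasn't seen this person's lower-third yet
    return True

# Example: block the promo while a subtitle is up or a new guest appears
print(promo_allowed({"subtitle": True}, set()))        # → False
print(promo_allowed({"face_id": "guest1"}, {"guest1"}))  # → True
```

The dict-based “frame” keeps the sketch self-contained; in practice the detectors would consume decoded video frames.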

And, actually, you don’t even need the last check if the AI reviews the program (at much faster than real time) and marks the timecodes of lower-third and subtitle events.

Then let AI handle the timing of the promos. Let it know what promos you want during the show and approximately when you want them to occur. That’s it. No more problems.
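Once those lower-third and subtitle events are marked as timecode intervals, the promo-timing step is just interval scheduling: take the producer’s requested time and nudge each promo to the nearest clear window. A hedged sketch (all names and parameters are my own, not a real broadcast API):

```python
# Hypothetical promo scheduler: given timecode intervals (in seconds)
# where lower-thirds or subtitles are on screen, shift a promo's start
# time forward until it fits in a clear window.

def find_clear_slot(blocked, desired_start, duration, max_shift):
    """Return the earliest start >= desired_start (within max_shift
    seconds) where a promo of `duration` overlaps no blocked interval,
    or None if no such slot exists."""
    def overlaps(start):
        end = start + duration
        # Standard interval-overlap test against each blocked event
        return any(s < end and start < e for s, e in blocked)

    t = desired_start
    while t <= desired_start + max_shift:
        if not overlaps(t):
            return t
        t += 1  # push the promo one second later and retry
    return None

# Events found by the faster-than-real-time AI review pass
blocked = [(10, 18), (25, 32)]
# Producer wants a 5-second promo at 0:12; it slides past the 10-18 event
print(find_clear_slot(blocked, 12, 5, 60))  # → 18
```

Returning None lets downstream logic drop or reschedule a promo rather than step on content, which is the whole point.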

This can’t be so complicated that it would be hard, or even costly, to do.

So, for heaven’s sake, use AI and ML for something that makes your viewers happy (or, at least, less angry).

There…I’m done.

Michael Silbergleid is president of Silverknight Consulting in Fort Myers, FL.