Paris 2024: OBS CEO Yiannis Exarchos on AI-Generated Highlights, 360 Video, 8K Processors From Intel

Intel’s ‘Software Defined Broadcasting’ will help OBS serve growing digital demands

The 2024 Paris Olympics are shaping up to be the latest global sports event to prove that the future of live event production lies not only in traditional linear broadcasting but also in growing digital demand. Speaking at NAB 2024 last month, Yiannis Exarchos, CEO, Olympic Broadcasting Services, noted that OBS plans to broadcast about 3,400 hours of competition but that close to three times that amount of content will be produced: “We should see about 11,000 hours of content, 450 days’ worth of content produced in a period of just 17 days. Why? To feed the insatiable beast of digital.”

OBS hopes that broadcasters will publish more than a half million hours of content, with about half the earth’s population experiencing the Olympic Games. Content of every type at a very high level of quality will be produced for traditional television, streaming, digital, social, and other outlets. All of this is to serve younger generations as well as niches of underserved fans in far-flung geographies.

“If we talk about younger generations, demographics, and the proliferation of digital media,” said Exarchos, “of course we have to look at it in a different way. The Olympics is a big problem of scale.”

He described the engineering feats that will be accomplished at the Paris Games this summer. “We will have to install 450 racks of equipment around 45 venues, 50 OB vans and set up 70 galleries with a very demanding configuration. In addition, thousands of highlights need to be cut within seconds after the event — or even during the event.

“In collaboration with Intel,” he continued, “we have been using their Geti AI platform. It’s not that it saves us: we were not able to do it before, as nobody has the resources to be cutting so many highlights for so many events.”

Olympic Broadcasting Services’ Yiannis Exarchos: “It’s not just about efficiency. It’s about providing new opportunities to the creative part of our industry.”

In addition, Exarchos touched on what has previously been billed as volumetric replay: the ability to deliver 360-degree, “Matrix-like” replays. Turning this kind of highlight around quickly enough for it to be used in the moment is possible only with efficient, low-latency IP transmission.

For four events across three venues, all the live production — and the postproduction in connected facilities — including 8K workflows, will run on Intel Xeon Scalable processors, commodity off-the-shelf hardware, and what Intel calls Software Defined Broadcasting.

“We are using Intel’s processors to transform broadcast,” Exarchos said. “We have lightning-fast, ultra-low-latency content transmission from the venue to the outside-broadcast vans, facilitating real-time content production and highlights creation, elevating the fan experience.

“And I would say something more,” he continued. “We are allowing the storytelling to become more aggressive, more interesting, more creative, going several directions. It’s not just about efficiency. It’s about providing new opportunities to the creative part of our industry, doing things that were not really possible. Actually, we multiply the creative possibilities.”

Software-Defined Broadcast Is Intel’s ‘Polaris Point’

Much of what Exarchos detailed on the content side was outlined from the solution perspective by Nagesh Puppala, GM, edge/cloud video, Intel. He laid out the framework for an open-source, virtual outside-broadcast (OB) van infrastructure using commodity off-the-shelf hardware — a solution framework code-named Polaris Point.

Intel’s Nagesh Puppala described the Polaris Point solution framework.

Polaris Point has three core parts, Puppala said. “It’s an open-source implementation of ST 2110, and it couples that with Kubernetes and Helm [software], running on standard IT servers. It includes Intel Media Transport Library, an open-sourced implementation of ST 2110; Media Communications Mesh, an open-sourced, optimized-for-media microservice for low-latency communication; and a JPEG XS codec, which was recently open-sourced.”
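
To make the Kubernetes/Helm layer of that stack concrete, here is a minimal, purely illustrative sketch of how one virtualized OB-van media service might be declared as a Kubernetes Deployment. The image name, port, and resource figures are hypothetical assumptions, not Intel’s or OBS’s actual configuration:

```yaml
# Hypothetical sketch: a single ST 2110 receiver microservice declared as a
# Kubernetes Deployment, the kind of unit a Helm chart in a framework like
# Polaris Point could template and roll out per venue.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: st2110-receiver            # illustrative virtual OB-van component
spec:
  replicas: 1
  selector:
    matchLabels:
      app: st2110-receiver
  template:
    metadata:
      labels:
        app: st2110-receiver
    spec:
      containers:
        - name: media-transport
          image: example.org/media-transport:latest   # placeholder image
          ports:
            - containerPort: 20000  # RTP media port (illustrative value)
          resources:
            limits:
              cpu: "8"              # uncompressed/JPEG XS video is CPU-heavy
              memory: 16Gi
```

Declaring each media function this way is what lets the same commodity servers be re-provisioned between live production and postproduction roles, as described below.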

In Paris, the solution will address provisioning, orchestration, and lifecycle management for virtual OB-van architecture. Notably, the solution uses the same media-capable infrastructure across both live-production and postproduction workflows regardless of location. “The same architecture being used for the production side is also being used at the International Broadcast Center for the postproduction activities,” Puppala said. “That brings dramatic efficiencies in physical engineering as well as flexibility in human-resource deployment.”

Exarchos framed much of the discussion with some historical context. “When we spoke about cloud in 2018, many of our colleagues were raising eyebrows,” he pointed out. “You know, they would ask, Over the cloud? High-definition, 4K video, live, over the cloud? Well, I have an update: 40% of the international transmission in Paris will be done over the cloud.”
