SVG Sit-Down: NVIDIA’s Jamie Allan on the Transition to ST 2110, What’s Next for AI, AR, and the Cloud

The focus is migration from client-based systems to software-defined, virtualized infrastructure

At the IBC Show in Amsterdam this month, NVIDIA’s tech infrastructure once again powered hundreds of booths. In addition, the company demonstrated SMPTE ST 2110 workflows at the Dell Technologies and RED Digital Cinema booths.

The next-generation IP broadcast workflow at the Dell booth focused on how to simplify the adoption of SMPTE ST 2110 standards for the broadcast industry. NVIDIA and Dell teamed up to showcase IP-based content-creation capabilities and deployment of AI in the broadcast pipeline from workstation to the edge.

At the RED booth, NVIDIA networking technologies (Rivermax, ConnectX, NVIDIA BlueField DPU, NVIDIA RTX GPU) enabled real-time 8K raw video over ST 2110. In this demo, NVIDIA and RED showcased a direct connection that allows cinema-quality RED V-RAPTOR 8K content to feed into an IP broadcast-production workflow.

During the show, SVG sat down with Jamie Allan, lead, media, entertainment & broadcast industry, EMEA, NVIDIA, to discuss the ST 2110 demos and how NVIDIA is helping power major next-gen technologies, such as artificial intelligence (AI) and machine learning (ML), augmented reality and immersive experiences, and cloud- and edge-based workflows.

NVIDIA’s Jamie Allan: “The broadcast industry should build in a way that allows [deployment] on any platform in the future.”

What are the big themes NVIDIA is highlighting this year at IBC?
Firstly, we’re excited to be back at IBC after being away the past few years. Getting together with our amazing ecosystem of partners in Amsterdam is always great. Having NVIDIA in hundreds of the booths, powering the solutions and technologies that make up the media and entertainment world, is a great honor for us.

This year at IBC, we are focused on talking about and demonstrating some of our groundbreaking solutions that simplify the adoption of ST 2110 workflows for broadcasters, postproduction companies, and large media organizations. These [solutions] enable these organizations to easily bring SMPTE ST 2110-compliant uncompressed streams into their infrastructure without a huge engineering uplift.

Can you provide some detail on the key demonstrations you’re participating in here at the show?
At the Dell booth, we are showing how our existing Rivermax SDK — which is already used by many leading broadcast organizations, such as Grass Valley and Disguise — can be used to create a new Windows application that provides a virtual display for a 2110 platform. You can take your normal Windows virtual or physical workstation and virtualize an SMPTE ST 2110-compliant desktop as a second display. You can simply drag an application to that second display and send it out to broadcast live on-air as a 2110-compliant stream.
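For readers unfamiliar with what a "2110-compliant stream" actually is on the wire: ST 2110-20 carries uncompressed active video as RTP over UDP, packetized per RFC 4175. The Python sketch below builds a single such packet by hand. It is a minimal illustration of the wire format, not the Rivermax API; the payload type, multicast address, and dummy pixel bytes are placeholder assumptions, and a production sender would rely on Rivermax for kernel-bypass transmission and ST 2110-21 traffic shaping rather than ordinary sockets.

```python
import socket
import struct

# Minimal sketch of one SMPTE ST 2110-20 packet: a 12-byte RTP header,
# the RFC 4175 payload header, then raw pixel-group bytes for part of
# one scan line. This shows the wire format only; it is NOT the Rivermax
# API, which handles packetization, PTP timing, and pacing at line rate.

RTP_VERSION = 2
PAYLOAD_TYPE = 96  # dynamic payload type, signaled out of band via SDP;
                   # ST 2110-20 timestamps use the 90 kHz RTP media clock

def build_2110_20_packet(seq, timestamp, ssrc, row, offset,
                         pixel_data, end_of_frame=False):
    marker = 0x80 if end_of_frame else 0       # M bit: last packet of frame
    rtp_header = struct.pack(
        "!BBHII",
        RTP_VERSION << 6,                      # V=2, no padding/ext/CSRC
        marker | PAYLOAD_TYPE,
        seq & 0xFFFF,                          # low 16 bits of sequence no.
        timestamp & 0xFFFFFFFF,                # 90 kHz media timestamp
        ssrc,
    )
    payload_header = struct.pack(
        "!HHHH",
        (seq >> 16) & 0xFFFF,                  # RFC 4175 extended seq number
        len(pixel_data),                       # SRD length in bytes
        row & 0x7FFF,                          # F bit (0 = progressive) | row
        offset & 0x7FFF,                       # C bit (0 = last SRD) | offset
    )
    return rtp_header + payload_header + pixel_data

# Hypothetical usage: send one dummy line fragment to a multicast group.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = build_2110_20_packet(seq=0, timestamp=0, ssrc=0x2110,
                              row=0, offset=0, pixel_data=bytes(1200))
sock.sendto(packet, ("239.1.1.1", 20000))
```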

On the RED Digital Cinema booth, we are enabling the world’s first real-time camera-to-SRT stream via uncompressed ST 2110. We’ve worked with RED to develop the capability to go straight from one of their new camera models, through an IP module, into a processing unit handling uncompressed 8K 2110, and to take that stream into either an uncompressed pipeline or a compressed SRT webstream at full 8K. We believe this is the first time that has ever been done.
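Some rough arithmetic shows why this pipeline leans on Rivermax, ConnectX, and BlueField rather than a conventional CPU network stack. The exact raster and frame rate of the demo aren't stated in the interview, so the figures below are illustrative assumptions:

```python
# Back-of-envelope bandwidth for the uncompressed leg of this demo.
# Assumptions (mine, not from the interview): a 7680x4320 raster with
# 10-bit 4:2:2 sampling, i.e. 20 bits per pixel in ST 2110-20 terms.

WIDTH, HEIGHT = 7680, 4320
BITS_PER_PIXEL = 20            # 10-bit luma + alternating 10-bit chroma

def payload_gbps(fps):
    """Video payload rate in Gb/s, before RTP/UDP/IP overhead."""
    return WIDTH * HEIGHT * BITS_PER_PIXEL * fps / 1e9

for fps in (24, 50, 60):
    print(f"{fps:>2} fps -> {payload_gbps(fps):5.1f} Gb/s")

# 24 fps -> ~15.9 Gb/s, 50 fps -> ~33.2 Gb/s, 60 fps -> ~39.8 Gb/s:
# far beyond 10/25 GbE, which is why the demo pairs 100 GbE-class
# ConnectX NICs and BlueField DPU offload with the RTX GPU pipeline.
```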

How do you see AI and ML changing the way live sports are produced? And what role is NVIDIA playing in that evolution?
The broadcast industry as a whole has adopted AI on a much larger scale, and we’ve seen many broadcasters using these tools over the past few years.

Organizations like EVS, Sony Hawkeye, and Vizrt are advancing their tools with AI. And many broadcast organizations and media companies are investing internally in AI data-science teams and developer teams to take some off-the-shelf AI tools that you can get from places like NVIDIA’s GPU container cloud and retrain and adapt them to create specific tools for their needs.

That is very important in the sports industry because of the complexity and the unique needs of each individual sport. We work very closely with organizations like Hawkeye to enable their tools to work specifically for certain sports.

I also think AI and machine learning in automated production is very interesting. We are seeing organizations like Pixellot and Mediapro’s AutomaticTV growing at an astonishing rate. We will continue to work with these companies to create smaller and faster components for that part of the ecosystem so that technology can continue to grow.

There are also many startups focusing on creating groundbreaking applications for AI in sports broadcasting. One company in particular doing groundbreaking work in markerless motion tracking is move.AI. They are a UK-based company who have [drawn interest] from many major sporting bodies and broadcasters around the world. Using low-cost cameras, they can create full-body 3D visualizations, which is something every broadcaster wants in order to add value not only to their current 2D broadcast pipeline but also to their future immersive metaverse and Web3 broadcasting capabilities.

Speaking of the metaverse, how have you seen the use of augmented reality grow in recent years? How do you envision these virtual technologies impacting the industry?
We’re incredibly proud that nearly every vendor who creates augmented-reality and virtual-graphics tools leverages NVIDIA technology to build their products. We continue to push our engineers and our internal product teams to give them more and more capability in that space. The next step that we are hoping to see is bridging the gap between AR in the studio and AR in the home. We’ve seen visionary pieces from amazing partners like BT Sport, who have done a huge amount of work with 5G Edge XR, which has been a big hit here at IBC.

In addition, companies like Brainstorm, Disguise, Vizrt, and Zero Density are striving towards this multi-layer rendering capability, where it should be easy to extract an AR element and have it delivered to a device other than your TV. That’s when we can start to talk about this multi-experience way of consuming broadcast content.

The vision of having a football match played out on your table from a top-down view is nearing reality. We want to see these technology companies get there soon because consumers are asking for it. There are certainly barriers to overcome, such as the scale of computing power available today and the limitations of traditional content-delivery networks. The next generation of augmented reality and high-fidelity visualization will need more edge computing to deliver those experiences. We hope to partner with the industry to leverage the infrastructure that NVIDIA and our data-center partners have today to deliver those experiences to the consumer.

How has NVIDIA factored into the cloud-based revolution in the broadcast industry? How will this shift toward the edge and the cloud play out over the coming decade?
The media industry has certainly been a huge adopter of cloud computing over the last decade, but I think many were primarily focused on reducing their capital expenditure. People are finally realizing the advantages of cloud-native computing. The industry is moving towards a software-defined broadcast ecosystem in which any ISV’s tool you want to use can run anywhere you want, on whichever hardware platform you choose — all with an enterprise IT management layer. Being able to build out those infrastructures from edge to private to public cloud is our goal in how we build our technology.

When NVIDIA builds its core capabilities around containerization and virtualization of GPUs and networking, we build them so they can run anywhere. When our ISV or hardware partners want to deploy a tool at the edge, they know they can trust that NVIDIA’s drivers and virtualization technologies will work reliably. When they want to run those things in the public cloud at massive scale, they know that the same core NVIDIA technologies they use at the edge will run at scale in the cloud. We never want to create our core foundations in a way that locks you into deploying in a certain place or in a certain way. And that’s how we believe the broadcast industry should approach it: build in a way that allows you to deploy on any platform in the future. This migration from a client-based system to software-defined virtualized infrastructure allows you to be a much more agile and flexible technology company.

There has been a lot of discourse this weekend about refocusing on technology within the broadcast industry. At the Sky Summit earlier this week, Sky CEO Dana Strong talked about how technology is at the beating heart of Sky’s entire organization. Every broadcast organization needs to adopt that philosophy. If you don’t start building in this decentralized, software-defined manner, you will struggle to grow and expand in the direction that the industry’s going to move in the next few years.

This interview has been edited for length and clarity.
