Inside the ‘MultiVersus NHL Face-Off’ Live Animated Broadcast With the NHL’s David Lehanski and Keith Horstman

The league teams with TNT Sports, Warner Bros. Games for latest twist on alternative broadcasts

Bugs Bunny and the Tasmanian Devil; Batman, Superman, and Wonder Woman; Velma and Shaggy; Steven Universe; and Finn the Human will be among the Warner Bros. Discovery characters taking center ice on Sunday during the latest iteration of real-time animated NHL broadcasts. The league is teaming up with TNT Sports, Warner Bros. Games, and Beyond Sports to produce the first-ever MultiVersus NHL Face-Off, marking the fourth animated NHL broadcast to date.  

Presented exclusively on truTV and Max’s B/R Sports Add-On, the animated presentation of the Colorado Avalanche at the Vegas Golden Knights will use the league’s NHL EDGE positional data (puck- and player-tracking powered by SMT) and Hawk-Eye Innovations optical tracking to generate realistic character and player-avatar movements that closely represent the NHL players’ movements on the ice. Beyond Sports’ AI-based data-visualization platform will leverage that data to re-create the action on the ice as it happens in real time, featuring the animated characters of WB Games’ MultiVersus videogame alongside avatars of NHL players.

In addition, the ice rink for the alternative telecast will be dynamic and incorporate rotating elements based on environments from the MultiVersus game, such as the Space Jam Court and Adventure Time Tree Fort. Announcers Steve Mears and Colby Armstrong will call the action while interacting with WBD characters.   

The MultiVersus NHL Face-Off marks the fourth chapter in the league’s animated broadcast efforts, following the inaugural NHL Big City Greens Classic with ESPN in 2023, the second edition of the NHL BCG Classic last month, and NBC Sports Chicago’s alternative presentation of last week’s Chicago Blackhawks-Dallas Stars game.

SVG sat down with David Lehanski, EVP, business development and innovation, and Keith Horstman, VP, technology, to discuss the production workflows and technology being used for Sunday’s broadcast, how the production model has evolved over the past year, and what to expect in future NHL animated presentations.

How did the concept for these animated broadcasts come about in the first place?
Lehanski:
This all started a couple years ago when we first met Beyond Sports and saw their capabilities firsthand. Originally, we were just looking to leverage the technology to help us validate the positional data of the puck- and player-tracking system that uses sensors on the jerseys and the puck. Beyond Sports took those data points and created a virtual rendering so you could visually see the players and puck on a virtual rink in real time. But, once we saw the rendering of that data, we realized this could be a pretty amazing viewing experience on its own.

That eventually led to our first initiative to do this live on-air, which was with ESPN and Disney and the Big City Greens property. It went well and helped us reach the audience we were going for: younger viewers, who didn’t necessarily fall into the traditional NHL [demographic]. That opened up everyone’s eyes. Although it was a very cool concept before that, [the NHL Big City Greens Classic] is when it became something that could have real value and viability.

And when did TNT Sports and Warner Bros. Discovery come into the equation?
Lehanski: TNT Sports and Warner Bros. Discovery are great partners of ours, so we had been talking to them about doing some kind of animated broadcast early on. They have so much incredible IP that it was hard to figure out what [properties] to leverage. But [the debut of] this MultiVersus game answered that question. It’s a mashup title that brings together a lot of different characters from across all kinds of different IP into one game. That gives us the ability to leverage a multitude of [properties]. We worked with TNT Sports and WBD to see what was possible, and it developed quickly from there since we already had the technology to make this happen.

How are the WBD characters integrated into the Beyond Sports environment to create this animated broadcast?
Horstman: Since it’s a [gaming-engine environment], the [onboarding] of the characters into our platform was streamlined. The rigs for the characters [have been developed], so the handoff to Beyond Sports to build out the skeletons was very smooth. That will be the case going forward when working with any other game platforms as well.

The primary movements of the characters are already in the rigs, so it’s a lot less [work] at the outset. Obviously, for things like getting Batman’s cape to move around while he’s skating, it’s a bit more complicated since that requires developing new movements. But, overall, the process of bringing the characters themselves into the Beyond Sports environment was a lot smoother than it was [for the previous animated broadcasts].

WB gave us the rigs exactly the way they wanted them to look, and Beyond Sports tinkered with them to ensure they will fit accurately into the environment. We stay as true to the characters as possible, but there has to be some calibration for gameplay and movement on the ice.

How will these WB characters be integrated into the game action from a storyline perspective?
Horstman: Just like Big City Greens [Classic], these characters will be playing in the game. Some characters will be representing particular players from both teams on the ice, and other characters will also be integrated into the environment around the game and in the arena.

Another added element for this one will be that the environment is going to change throughout the game. In the MultiVersus game, there are several different maps you can play, each born from one of the characters’ storylines. We’ll integrate those into our presentation. It won’t be a fixed environment; you will see it morph as the game plays on.


Have you made any major changes/improvements to the overall tech stack for this broadcast?
Horstman: We are enhancing a few pieces of the overall solution in terms of the tracking data. We currently have two tracking environments: SMT for the traditional single-point tracking and Hawk-Eye for optical multipoint tracking. We primarily use SMT data for the puck and Hawk-Eye data for the players, but we have the ability to switch back and forth between the two so that, if there is an issue with either platform, the players and puck still appear consistently on the ice.
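A minimal sketch of that fallback idea, assuming a hypothetical per-frame selector: prefer one feed per entity (SMT for the puck, Hawk-Eye for skaters) and fall back to the other when the preferred feed stops delivering fresh samples. The names and freshness threshold below are illustrative, not details of the NHL/Beyond Sports pipeline.

```python
# Hypothetical per-frame source selection between two tracking feeds.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrackingSample:
    source: str        # "smt" or "hawkeye"
    entity_id: str     # "puck" or a player/jersey identifier
    timestamp_ms: int  # capture time in milliseconds
    x: float
    y: float

STALE_AFTER_MS = 100   # assumed freshness threshold; not a figure from the interview

def pick_sample(preferred: Optional[TrackingSample],
                fallback: Optional[TrackingSample],
                now_ms: int) -> Optional[TrackingSample]:
    """Return the preferred feed's latest sample if it is fresh, else try the other feed."""
    if preferred and now_ms - preferred.timestamp_ms <= STALE_AFTER_MS:
        return preferred
    if fallback and now_ms - fallback.timestamp_ms <= STALE_AFTER_MS:
        return fallback
    return None  # neither feed is usable; the renderer can hold the last known position
```

In practice a selector like this would run once per tracked entity per frame, so a dropout on either platform never leaves the puck or a player missing from the virtual ice.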

One issue we found was that the two disparate systems are not necessarily timestamped at the same moment: there is a small double-digit–millisecond difference in time. We worked with Beyond Sports to automatically synchronize the two systems. When a player winds up and executes a slap shot, the puck comes out when it’s supposed to, not 20 ms later. It was an issue that we had previously, but the synchronization is now put in place automatically during the pregame skate. That’s the big technical thing we’ve worked on since last weekend’s [NBC Sports Chicago] production.
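Beyond Sports’ actual alignment method isn’t detailed in the interview; as a rough, hypothetical illustration, a fixed clock offset between two position feeds could be estimated from pregame-skate data by sweeping candidate offsets and keeping the one that minimizes positional disagreement for the same tracked object.

```python
# Hypothetical offset estimation between two tracking feeds (not the production code).
import numpy as np

def estimate_offset_ms(times_a, xy_a, times_b, xy_b, search_ms=50):
    """Return the shift in ms to add to feed B's timestamps so it best aligns with feed A."""
    best_offset, best_err = 0, float("inf")
    for offset in range(-search_ms, search_ms + 1):
        shifted = times_b + offset
        # Resample feed B onto feed A's timeline and measure positional disagreement.
        bx = np.interp(times_a, shifted, xy_b[:, 0])
        by = np.interp(times_a, shifted, xy_b[:, 1])
        err = np.mean((xy_a[:, 0] - bx) ** 2 + (xy_a[:, 1] - by) ** 2)
        if err < best_err:
            best_offset, best_err = offset, err
    return best_offset
```

Run once against warmup samples for a single skater or the puck, the resulting constant could then be applied to one feed’s timestamps for the rest of the game.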

How will the live broadcast itself be produced, and where will the crew be located?
Horstman:
The production will be coming out of Atlanta, and TNT Sports will be producing the whole game themselves. We are sending a couple of our production people who are experienced with this, and the Beyond Sports team will be there as well.

From a production perspective, we’ve architected [a model] so that the data creates a skeleton and then the production side [creates] the overlay for the rigs with whatever IP we’re using for that particular game. The construction of the infrastructure is almost exactly the same as it was last weekend or at Big City Greens [Classic]. That’s the beauty of how Beyond Sports has built this [workflow]: we can have one data capture and one skeleton creation but have it distributed out over multiple platforms with many different character sets.
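As a hedged sketch of that “one data capture, one skeleton, many character sets” model, the snippet below fans a single per-frame skeleton pose out to multiple render targets, each applying its own character mapping. The class names, player IDs, and character pairings are purely illustrative.

```python
# Hypothetical fan-out of one skeleton stream to multiple character-skinned renders.
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class SkeletonPose:
    player_id: str
    joints: Dict[str, Tuple[float, float, float]]  # joint name -> (x, y, z)

class Renderer:
    """Stand-in for a per-broadcast render target (e.g., one animated presentation)."""
    def __init__(self, name: str):
        self.name = name
    def render(self, character: str, joints) -> None:
        print(f"[{self.name}] posing {character} with {len(joints)} joints")

# Each broadcast maps the same tracked players to its own character rigs (illustrative pairings).
CHARACTER_SETS = {
    "multiversus": {"home_01": "Batman", "away_01": "Bugs Bunny"},
    "big_city_greens": {"home_01": "Cricket", "away_01": "Tilly"},
}

def fan_out(pose: SkeletonPose, renderers: Dict[str, Renderer]) -> None:
    """Send one skeleton pose to every renderer, each with its own character skin."""
    for broadcast, mapping in CHARACTER_SETS.items():
        character = mapping.get(pose.player_id, "generic_avatar")
        renderers[broadcast].render(character, pose.joints)
```

The same captured data can thus drive any number of presentations, which is what makes additional character sets comparatively cheap to add.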

In terms of the production itself, we’ve tried to keep it traditional so that the [production team] is as comfortable as possible. We have a director and producer and TD cutting cameras with the Beyond system. There are 50 cameras available out of the box for the Beyond Sports system — you can add more from there if you want — and you can cut cameras just like you would with a traditional production. The only difference is, you’re not limited to the traditional camera angles.

Since the NHL is aggressively exploring live cloud-based production, does the cloud play a role in the current production model?
Horstman: All the data comes out of the arena and is distributed in the cloud, but the graphics-rendering pieces are physical hardware that will be in Atlanta. That hardware will receive the data from the cloud and render out with the MultiVersus characters. But we are working to put that [part of the workflow] in the cloud, hopefully for the coming year. We’re trying to keep this simple from an architectural perspective. Once we port it all to the cloud, that’s when we’ll potentially bring in AWS to provide distribution.

Where will the announcers be located, and how will they be integrated into the broadcast?
Horstman:
For the announcers, it’s going to be a Roger Rabbit-like production, where there will be a human component mixed with an animated component on-screen. The announcers will be in Atlanta. They will appear as themselves, shot with traditional video, but will be able to interact with the animated characters.

In your view, just how far have these animated broadcasts come over the past year since the first NHL Big City Greens Classic?
Lehanski:
It has grown by leaps and bounds in every possible way. It’s amazing how much we learn each time we do it. We’ve done only three of these live, but we’ve done a lot more behind the scenes and continue to learn. We’ve come so far, and the technology is only getting better. The coordination of the graphics and the skeletal and stick-positioning data has improved tremendously from where it was last year. If you put [Sunday’s broadcast] side by side with that first [Big City Greens Classic], you can see a noticeable difference in the way we produce the game and how much we’ve enhanced [the broadcast].

When you focus on the objective, which is reaching a younger and more family-oriented audience, you realize that we’re rethinking the entire presentation — not just the game. It’s in the opening, the pregame, the intermissions, the features, and the way you position everything. There’s still more work to be done there to make it complete, but you’re seeing us getting better with each and every game.

Can we expect more of these animated broadcasts in the near future and, if so, when?
Lehanski:
The good thing is that we’re getting to more of a repeatable solution that has a lot of cost-efficiencies built into it. There’s no doubt on our side that there’s a plan to continue to do this, and we’re working with our media partners and our clubs to figure out the right cadence and scale in terms of how many of these we do, at what level, and when. But you’re going to see more of it for sure in the future.

How do you see the production workflow continuing to evolve and improve for future animated broadcasts?
Lehanski:
Longer term, our goal is to have all data feeds and all the production pieces built in the cloud, so that people can produce fully remotely. There’s no reason there couldn’t be multiple versions of the same game because, at this point, we’re simply wrapping animation and rigs around data. And there’s no limit to how much of that you can do. There’s going to be a lot more of that coming down the road as well.

With three animated broadcasts in the books and another on the docket this weekend, how proud are you of your team for their work thus far?
Lehanski: Just like a lot of folks, I spent a lot of time as a kid sitting on the couch watching cartoons on Saturday morning. I never would’ve believed that there would be a moment in time when there would be a live sporting event where you could watch those same cartoons compete. The concept is still pretty mind-blowing. But it has been a massive effort across many different organizations — the media partners, the animators, the production people, Beyond Sports, and our team internally — coming together to make this happen.

As proud as we are, we’re obsessed with how we can make it better to achieve the greater purpose. That greater purpose is leveraging this technology to grow the game and bring the live game of hockey — not ancillary content but the actual live game — to a new audience in a format that’s more appealing to them. We want to help them understand why this is such a great game, and the results so far are showing us that that’s exactly what’s happening.

This interview has been edited for length and clarity.
