Inside Look at the NHL’s Historic Live Cloud Production of a Game and Alternate Broadcast

Supported by AWS, Evertz, and others, the league launched a new era of production possibilities

When it comes to pivotal moments in the history of the NHL, the evening of March 22 won’t be remembered for a milestone on the ice, but it will be remembered as the start of a new era of broadcasting. On that date, the league, with the help of AWS, made broadcast history with a completely cloud-based live production of the game between the Washington Capitals and Carolina Hurricanes.

For the game played in Capital One Arena in DC, all the camera and audio signals were encoded into the cloud and cut by two NHL crews wielding cloud-based production tools to create two shows: a regular broadcast, produced by an NHL Network crew in a control room at MLB Network in Secaucus, NJ; and an alternative stats-heavy broadcast (NHL EDGE Unlocked), cut by the NHL Studios team in a control room at NHL Headquarters in Manhattan.

The front-bench area of the control room during production of a cloud-based alternative broadcast for the NHL Network

The efforts continue the league’s vision of leveraging new technology to create experiences for new audiences. “We’re on our way to a fully direct-to-consumer distribution model where [fans will create their own] experience,” explains Dave Lehanski, EVP, business development and innovation, NHL. “Do you want stats? No overlays? Want to play a game or bet? [The core production team] does a great job producing [a show] that appeals to our general fanbase, but maybe there are people who want to go deeper or want a kids’ version or even a lifestyle version like the Manningcast. And now the technology is there to efficiently produce all that content.”

The feeds from 10 JPEG XS cameras are sent to AWS Elemental MediaConnect, where they are re-encapsulated as NDI for cloud routing and distribution, according to Grant Nodine, SVP, technology, NHL. The NDI routing fabric makes the signals available to the various broadcast tools in the cloud: the switcher, graphics, replay, and everything else.
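As a rough illustration of that ingest step, here is a minimal boto3 sketch that provisions a MediaConnect flow for one camera feed. The actual production ingests JPEG XS per ST 2110-22, which additionally requires VPC interfaces and media-stream configuration; this simplified sketch uses a plain RTP source, and the flow name, port, and CIDR are hypothetical.

```python
# Minimal sketch: provisioning a MediaConnect flow for one camera feed.
# The real flows carry JPEG XS (ST 2110-22), which also needs VPC
# interfaces and media-stream configs, omitted here for brevity.
import boto3

mediaconnect = boto3.client("mediaconnect", region_name="us-east-1")

flow = mediaconnect.create_flow(
    Name="arena-cam-01",                    # hypothetical flow name
    Source={
        "Name": "venue-encoder-cam-01",
        "Protocol": "rtp",                  # stand-in for the JPEG XS source
        "IngestPort": 5000,
        "WhitelistCidr": "203.0.113.0/24",  # only the venue encoder may push
    },
)

# Start the flow so the venue encoder can begin contributing.
mediaconnect.start_flow(FlowArn=flow["Flow"]["FlowArn"])
```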

The two production teams used cloud-based Viz Vectar Plus production switchers, Viz Trio for graphics, TVG for the scorebug, Evertz DreamCatcher for replays, and Audinate and SSL for audio. Sienna handled routing of all the feeds.

“Theoretically,” Nodine says, “if we wanted to use Grass Valley for switching, we could do that by changing the orchestration code to license that software and install the right version of each component. But, essentially, that’s all you’re doing: rather than having a Grass Valley switcher sitting in a rack here, we can rent one whenever we want without having to pay for its upkeep.”
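The vendor swap Nodine describes can be pictured as a configuration choice in the orchestration layer. The sketch below is not the NHL’s orchestration code; the vendor catalog, AMI IDs, and instance types are invented placeholders, but the pattern, vendor as a config key with switcher software rented per event, is the one he outlines.

```python
# Hypothetical orchestration sketch: the production switcher is chosen
# from a catalog at launch time rather than installed in a rack.
# AMI IDs and instance types below are placeholders, not real images.
import boto3

SWITCHER_CATALOG = {
    "viz-vectar": {"ami": "ami-0123456789abcdef0", "instance_type": "g4dn.4xlarge"},
    "grass-valley": {"ami": "ami-0fedcba9876543210", "instance_type": "g4dn.8xlarge"},
}

def launch_switcher(vendor: str) -> str:
    """Spin up the chosen vendor's switcher for the event; terminate it
    when the broadcast ends, renting the software instead of owning it."""
    spec = SWITCHER_CATALOG[vendor]
    ec2 = boto3.client("ec2", region_name="us-east-1")
    resp = ec2.run_instances(
        ImageId=spec["ami"],
        InstanceType=spec["instance_type"],
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": f"switcher-{vendor}"}],
        }],
    )
    return resp["Instances"][0]["InstanceId"]
```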

Live cloud production is still in its early days, and, for those operating the equipment, particularly the replay operators, there is a bit of a learning curve, according to Dustin Robinson, one of NHL Network’s replay operators.

“There was occasional multiviewer lag, but, once we adapted, our replays became smoother as the night progressed,” he says. “It was a perfect example of preparation meeting opportunity, and everyone involved should be proud of the successful endeavor. This model, though in its infancy, will undoubtedly push the envelope of live sports production.”
Adds Steve Blevins, senior lead replay operator, NHL Network: “There are a few differences between the cloud broadcast and a traditional one. The feel is the same, but, from a tape-room perspective, you receive isolated feeds, and they get repurposed into packages and other in-game rolls. On this broadcast, we used the DreamCatcher by Evertz, which offered many new and enhanced features to strengthen the broadcast.”

The Evertz DreamCatcher replay team cut replays in the cloud rather than on local replay servers.

The flexibility to choose the preferred tool is important to AWS, which considers being software-agnostic to be paramount. “We started by asking the NHL what vendors and software they wanted to use,” notes Andrew Reich, senior consultant, sports, AWS. “We’re providing the infrastructure, and the idea is, we should be able to support and supply any software vendor needed for a production.”

Latency is always a concern, but, according to Nodine, it was only three frames from the venue to the cloud, and, with JPEG XS for the return, overall latency was around seven frames.

“There’s no latency for the operator,” he notes. “It just comes down to the end-user latency of the person receiving the broadcast. We can drive down the actual overall latency of the broadcast to the point where it’s possible to start feeding the game back into the venue for distribution on mobile devices without fans being jarred by the difference in gameplay.”
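To put those frame counts in wall-clock terms, here is the quick conversion, assuming a 59.94 fps broadcast frame rate (an assumption; the frame rate is not stated above).

```python
# Convert the quoted latency figures from frames to milliseconds,
# assuming 59.94 fps; at other frame rates the numbers shift accordingly.
FPS = 59.94

def frames_to_ms(frames: int) -> float:
    return frames * 1000.0 / FPS

print(f"venue to cloud (3 frames): {frames_to_ms(3):.0f} ms")  # ~50 ms
print(f"overall (7 frames):        {frames_to_ms(7):.0f} ms")  # ~117 ms
```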

Three-Year Project

The cloud efforts began in 2021, when AWS and the NHL worked together on building a cloud-based encoding and scheduling pipeline whereby the league would take feeds from every venue, pass them through AWS, and make them available to distribution partners and broadcast personnel. The next step took place last fall at the NHL’s Tech Showcase in Seattle.

“We started asking, ‘How can we get camera and audio feeds into the cloud and then produce a game?’” says Nodine. “We did a shadow production at the showcase, a game between the Ducks and Kraken. We had all the components in the cloud: replay, graphics, switching, and editing. What was really cool was that our technical director was in Wisconsin and the replay operator was in Canada.”

From there, the concept continued to grow. For a cloud-based production in Melbourne, Australia, the world feed was regionalized with different graphics and edited packages for different regions. “The great thing about that [broadcast],” says Nodine, “was that we showed up with one small Pelican case with two Nvidia ENC encoders, plugged it in, grabbed the feeds. [We had] touch panels and a monitor on a table, and, from that one table, we could do all the graphics, editing, and switching.”

AWS believes production people and others need to be educated on how much flexibility a cloud-based environment provides and how it could change not only how events are produced but also how new staff learn new skills. “If somebody who’s just learning the industry wants to learn how to operate graphics,” Reich points out, “they don’t need to go into a production truck to shadow and are not constrained to being trained in that physical space. They can have access to a workstation and learn how to operate graphics from their home or really from anywhere.”

The cloud also gave the replay operators access to more feeds, Nodine says. The replay operator had permission to pull the program feeds from every other broadcast.

“That is something you cannot do in a truck without spending a lot of money on inbound fiber feeds,” he points out. “You could clearly get to the point where you have a replay system, a couple of Ethernet cables, an Evertz controller, and a little NUC to connect you to a cloud instance, and you are off to the races. You can have a truck that can do replays without having a replay system in the truck. All you need is the network connectivity to the cloud.”

Key AWS elements involved in the project include MediaConnect and the EC2 servers hosting many of the workstations and operator stations. “The great thing is,” Reich says, “if someone is editing and realizes their workstation isn’t powerful enough, that is not a problem. We can quickly change that instance size, or server size or cloud-computer size, to give it more RAM and more horsepower to meet their needs. And we have high-priority storage so that the technical director can access files.”
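The resize Reich describes maps onto a standard EC2 operation: stop the instance, change its type, start it again. A hedged sketch, with a placeholder instance ID and an arbitrary target size:

```python
# Give an underpowered editing workstation more RAM and vCPUs by
# resizing its EC2 instance. The instance ID and target type are
# placeholders, not values from the production described above.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
instance_id = "i-0123456789abcdef0"  # hypothetical editor workstation

# An instance must be stopped before its type can be changed.
ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

# Step up to a larger size, then bring the workstation back.
ec2.modify_instance_attribute(
    InstanceId=instance_id,
    InstanceType={"Value": "g5.8xlarge"},
)
ec2.start_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
```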

The Sky’s the Limit

Cloud productions like this one point to a future where one set of live cameras can be made available in a cloud environment to almost limitless production teams. Rightsholders around the globe potentially could cut their own local versions instead of relying on a single world feed. And the ability to spin off alternative broadcasts becomes much simpler: there is no need to roll up additional production trucks and trailers to compounds that are increasingly tight.

“For something like the NHL Stanley Cup Final,” says Nodine, “the U.S. rightsholder is bringing in trucks, and the Canadian broadcaster brings trucks. You’re parking eight trucks in the broadcast compound, and that is no easy task.”

Along with the production benefits comes sustainability. “One of the biggest benefits of live cloud production is sustainability,” says Reich, “in addition to cost savings and ultimately opening up new opportunities for more revenue streams for the NHL.”

Nodine’s goal is to have iso encoding for all the broadcast feeds installed in every venue so that, at the push of a button, a cloud broadcast can be orchestrated from any given venue. “The NHL can leverage all those iso feeds for other key things,” he points out, “like cutting highlights or producing much better melts. The quality of highlight operations could increase greatly just by having all the isos instead of only the ones that occupy the replay channels.”
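At the orchestration layer, that push-button goal could look something like the sketch below: a registry of pre-provisioned MediaConnect flows per venue that a single call lights up. The registry, venue key, and flow ARNs are all hypothetical.

```python
# Hypothetical "push the button" sketch: every venue's iso feeds are
# pre-provisioned as MediaConnect flows, and starting a cloud broadcast
# begins by starting them all. ARNs below are illustrative placeholders.
import boto3

VENUE_ISO_FLOWS = {
    "capital-one-arena": [
        "arn:aws:mediaconnect:us-east-1:111122223333:flow:id-cam-01:cam-01",
        "arn:aws:mediaconnect:us-east-1:111122223333:flow:id-cam-02:cam-02",
    ],
}

def start_venue_broadcast(venue: str) -> None:
    """Light up every iso feed for the venue so downstream tools
    (switcher, replay, graphics) can subscribe in the cloud."""
    mediaconnect = boto3.client("mediaconnect", region_name="us-east-1")
    for arn in VENUE_ISO_FLOWS[venue]:
        mediaconnect.start_flow(FlowArn=arn)
```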

When Cloud Meets AI

The concept of personalized broadcasts has been a holy grail for decades, the idea being to create a new way to keep fans tuning in more often and for longer. Coupling cloud production with AI and automated production tools could finally make that idea a reality, Lehanski opines.

“I don’t think we’re anywhere near putting out an end product without a lot of human involvement,” he says, “but there are some obvious use cases, such as using AI for language translation: taking a feed and, boom, it’s in another five or 10 languages. And we’ll get into more ideas like having the system know what you want and creating a version that is automated for you. But that’s further down the line.”

No cloud productions are currently planned for the rest of the season, but the team will sit down in the offseason and figure out next steps. “There’s obviously a multitude of options,” says Sean Williams, VP, innovation, tech partnerships, NHL. “We’re just trying to understand the systems and make sure things work in a production environment the way we expect them to.”

Lehanski knows what he hopes to accomplish next: “I hope that, at the next event, we’ll do four productions and they’ll each be aimed at a different audience. From a business standpoint, you can aggregate a bigger audience because you won’t be relying on one platform; you’re going to think about all of them and what the experience is for the audience on each of them.”
