SVG Esports Production Summit: A Closer Look at the Technology Behind the Show

Blackmagic Design, Canon, Grass Valley, Ross Video, SwiftStack execs discuss the tools deployed

In recent years, no live-production sector has pushed the boundaries of technology more than esports has. In an effort to create a unique fan experience, live-esports producers have embraced a wealth of next-gen technologies and workflows, including AR and virtual graphics, UHD and HDR cameras/lenses, cloud- and object-based storage, data and stats integrations, and the at-home–production model. At the SVG Esports Production Summit in Los Angeles last month, tech leaders from Blackmagic Design, Canon, Grass Valley, Ross Video, and SwiftStack took the stage to address how these groundbreaking tools are changing esports production.

Jared Timmins, VP, advanced technology, Grass Valley, on the impact of “the esports revolution”:
From Grass Valley’s perspective, I think this esports revolution that we’re seeing is one of the most important revolutions that have happened in sports in my lifetime. We see it setting the stage and taking a different direction for how sports can be produced in the future. If [traditional] sports don’t become a lot more like videogame engines in the next 10 years, then we’ve done something drastically wrong. All of the opportunities opening up in esports — for different metadata environments, different distribution [in terms of] being digital first, having a strong social component — factor into the [traditional] sports world. I see esports blazing a trail that will benefit traditional sports for years and years to come.

Larry Thorpe, senior fellow, Imaging Technologies and Communications Group, Professional Engineering and Solutions Division, Canon U.S.A., on managing the challenge of LED-heavy staging in esports:
The LED [displays on-stage] are a big challenge for esports. We need to know the pixel count because they can vary all over the map and you want to avoid the moiré interference. Today, we have cameras that are 4K, 6K, and going up. Super-sampling and down-sampling can help control that, and you can eliminate side bands more easily. There is also a range of illumination from the dark to the extreme bright. Of course, the cameras are astounding today with their sensitivities, and lenses are much faster. I also see an awful lot of smaller cameras in esports, and I see a lot of tight shots. I think that certainly says there’s a role for prime [lenses].

Cameron Reed, business development manager, esports, Ross Video, on the value of producing esports content in UHD and HDR:
I think it’s important to ask yourself where you’re transmitting your signal. If you’re sending it to Twitch, you can put a lot of work into UHD and HDR, but none of it matters, and you’ve wasted every penny because Twitch is just going to downscale it and convert it into their own [format/resolution]. Even their 1080p isn’t really a true 1080p because 720p and 1080p are using the same bitrate on their platform. This is an important question for production engineers. Until Twitch gets there, esports doesn’t seem to have much of a reason to go [UHD or HDR] for their live stream.
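Reed’s “same bitrate” point is easy to sanity-check with some bits-per-pixel arithmetic. The 6 Mbps ceiling below is an assumed figure for illustration, not Twitch’s published encoder setting:

```python
def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """How much encoded data each pixel gets per frame at a given bitrate cap."""
    return bitrate_bps / (width * height * fps)

CAP = 6_000_000  # assumed encoder ceiling shared by both renditions

print(round(bits_per_pixel(CAP, 1280, 720, 60), 3))   # 720p60  -> 0.109
print(round(bits_per_pixel(CAP, 1920, 1080, 60), 3))  # 1080p60 -> 0.048
```

At a shared cap, the 1080p60 rendition gets roughly 2.25x less data per pixel than 720p60, which is why it may look no sharper in practice.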

David Hoffman, business development manager, Blackmagic Design Americas, on the importance (and challenges) of producing esports in high frame rate:
[In esports], you’re always thinking about the athlete, and the athletes want to have the highest frame rate possible … because they can perceive changes in the game when the frame rates are not as high as possible. But, when we get into our traditional broadcast environment, we have to scale that back to standard 60 fps.

How do we do that? Do we take it prior to the game [feed], or do we take it out after the game [feed]? It’s a tough call because we don’t have to do multiple frame rates in other sports; everything is standardized for the entire broadcast. Here we’re talking about the athletes themselves being part of the presentation; that happens in no other sport. You’re pulling [the video feed] off the engine itself, so we need to pull that from them at the same time we are presenting it to them, and that gets very difficult without impacting performance.
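The crudest way to scale a high-frame-rate game feed back to a 60 fps broadcast is frame decimation, keeping one of every N frames. Real converters interpolate or blend rather than simply drop, so this is only an illustrative sketch:

```python
def decimate(frames: list, src_fps: int, dst_fps: int) -> list:
    """Reduce frame rate by keeping 1 of every (src_fps // dst_fps) frames."""
    if src_fps % dst_fps != 0:
        raise ValueError("sketch only handles integer frame-rate ratios")
    step = src_fps // dst_fps
    return frames[::step]

# One second of a 240 fps feed (frame indices stand in for frames):
feed_240 = list(range(240))
broadcast = decimate(feed_240, 240, 60)
print(len(broadcast))  # -> 60
```

The hard part Hoffman describes is not the math but where the tap happens: pulling the feed from the game engine at the same moment it is presented to the players, without costing them frames.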

Esports Production Unwrapped: The Technology Behind the Show panel: (from left) SVG’s Ken Kerschbaumer, SwiftStack’s Vince Auletta, Blackmagic’s David Hoffman, Ross Video’s Cameron Reed, Canon U.S.A.’s Larry Thorpe, and Grass Valley’s Jared Timmins

Vince Auletta, director, media solutions, SwiftStack, on the benefits of on-premises object storage over public cloud for esports-content creators:
Once the broadcast is complete, you have all of your video assets. SwiftStack is the on-premises or multi-premises repository where all of those assets live for the long term, as opposed to the public cloud where you pay egress fees. One of the big reasons to use on-premises object storage is that you don’t pay for egress. It ends up being a lot more cost-effective and more performant. And it can also span multiple sites natively, so you get integrated DR. If your primary site goes down, your backup site takes over automatically without any kind of intervention.
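The egress-fee argument comes down to simple arithmetic. The per-gigabyte rate below is an illustrative list price, not any provider’s actual (tiered) pricing:

```python
def monthly_egress_cost(tb_retrieved: float, rate_per_gb: float = 0.09) -> float:
    """Estimate the monthly cost (USD) of pulling assets back out of a public cloud.

    rate_per_gb is an illustrative figure; real egress pricing is tiered
    and varies by provider and region.
    """
    return tb_retrieved * 1024 * rate_per_gb

# A team re-pulling 20 TB of archived match footage per month for highlight cuts:
print(f"${monthly_egress_cost(20):,.2f}")  # -> $1,843.20 (vs. $0 egress on-prem)
```

The cost scales with every retrieval, which is exactly the access pattern of an archive that editors keep mining for highlights.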

Auletta on how on-premises object storage opens up new possibilities for the production team:
There are a lot of new possibilities that are unlocked, thanks to having on-premises object storage. You can do things like use AWS or Google for transcribing all of your video assets automatically. And all of those transcriptions will be put back in the media-asset–management system or back in the object-storage system and tied to those assets natively. The power of metadata is a big part of that, as well as the power to use cloud compute with storage on-prem.
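A hypothetical sketch of the tie-back Auletta describes: a cloud service generates the transcript, and it lands back in the asset’s metadata, where it becomes natively searchable. The catalog structure and field names here are invented for illustration, not SwiftStack’s or any MAM’s actual schema:

```python
catalog = {}  # stand-in for the media-asset-management / object-store metadata index

def attach_transcript(asset_id: str, transcript: str, source: str = "cloud-speech") -> dict:
    """Tie a cloud-generated transcript to the asset record as searchable metadata."""
    record = catalog.setdefault(asset_id, {"asset_id": asset_id, "metadata": {}})
    record["metadata"]["transcript"] = transcript
    record["metadata"]["transcript_source"] = source
    return record

def search(term: str) -> list:
    """Find assets whose transcripts mention a term, e.g. a player name or play call."""
    return [aid for aid, rec in catalog.items()
            if term.lower() in rec["metadata"].get("transcript", "").lower()]

attach_transcript("2019-finals-game3", "The casters call the Baron steal at 32:10")
print(search("baron"))  # -> ['2019-finals-game3']
```

The design point is the split Auletta highlights: compute (transcription) rents cloud cycles on demand, while the storage and the enriched metadata stay on-prem.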

Timmins on the importance of building more-flexible production technology for esports:
I think one of the great challenges of esports is that the number of syndication elements — different formats, different codecs, different resolutions, and frame rates — is at a level you don’t see anywhere else in production. [We need to be] able to build next-generation engines that are frame-rate–agnostic and HDR/SDR-agnostic and to build these workflows so that you can condition the signal according to whether it’s going to a large screen for someone at home or to someone watching on their iPhone. Distribution mechanisms and having that agnosticism about formats and capabilities are critical.
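Timmins’s idea of “conditioning the signal” per destination can be sketched as a delivery-profile lookup keyed by syndication target. The targets and settings below are assumptions for illustration, not Grass Valley presets:

```python
# Illustrative per-target delivery profiles: one mezzanine feed in,
# a conditioned signal out. All values are invented examples.
PROFILES = {
    "arena_led": {"resolution": "2160p", "rate": 59.94, "transfer": "HLG"},
    "broadcast": {"resolution": "1080i", "rate": 59.94, "transfer": "SDR"},
    "twitch":    {"resolution": "1080p", "rate": 60.0,  "transfer": "SDR"},
    "mobile":    {"resolution": "720p",  "rate": 30.0,  "transfer": "SDR"},
}

def condition(target: str) -> dict:
    """Return delivery settings for a syndication target, agnostic of the source feed."""
    try:
        return PROFILES[target]
    except KeyError:
        raise ValueError(f"no delivery profile for target {target!r}")

print(condition("mobile")["resolution"])  # -> 720p
```

The agnosticism Timmins asks for lives in the engine: the source is captured once, and only this last conditioning step knows about the destination.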

Reed on how esports is pushing production-tech manufacturers forward:
I think that the two ways that esports has really pushed the technology manufacturers have been in signal conversion and audio mux[ing] and in data integration. Your A1 is going to have a really, really hard day if they are giving audio control to every one of those players on the system itself, which is how [the players] are used to doing it at home. You have to take [the player’s] embedded signal, de-embed it, send it back to them, allow them to fiddle with it on a little mix amp, which then has to come back again to the director.

The other area has been in data integration. Literally everything in a videogame has a data point, and it happens so quickly that a statistician can’t keep up with it. Companies are coming up with ways to automate some of these graphics, presentations, and effects, and they are going to have a lot of success in esports.
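A hedged sketch of the automation Reed describes: in-game events arrive faster than a statistician can log them, so simple rules turn the event feed into graphics cues. The event names and the streak threshold are invented for illustration; a real title’s telemetry API would differ:

```python
def graphics_triggers(events: list) -> list:
    """Turn raw in-game events into graphics cues a director could take live."""
    cues = []
    streaks = {}  # per-player elimination streaks
    for e in events:
        if e["type"] == "elimination":
            player = e["player"]
            streaks[player] = streaks.get(player, 0) + 1
            streaks[e["victim"]] = 0  # dying resets the victim's streak
            if streaks[player] == 3:
                cues.append(f"LOWER-THIRD: {player} is on a 3-elim streak")
        elif e["type"] == "objective":
            cues.append(f"FULLSCREEN: {e['team']} takes {e['name']}")
    return cues

feed = [
    {"type": "elimination", "player": "Ace", "victim": "Bolt"},
    {"type": "elimination", "player": "Ace", "victim": "Crow"},
    {"type": "elimination", "player": "Ace", "victim": "Dusk"},
    {"type": "objective", "team": "Blue", "name": "first tower"},
]
for cue in graphics_triggers(feed):
    print(cue)
```

Because every cue is derived from the game’s own data points, the graphic can fire within a frame or two of the play, which no human statistician could match.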

Hoffman on creating new esports-specific production tools for observers (in-game camera operators):
These [production] environments are completely different than traditional sports. The observer role is a completely new paradigm in the control room. Observers have to be players, and they have to know the game to be able to anticipate. How do we give them traditional broadcast tools that are relevant to them? They want to focus on the game, but they have to have those tools to be able to present content, so we have to come up with new tools for them.

Reed on how the esports-production professional differs from traditional sports:
To design an [esports-production] system, you really need a very experienced person — someone who comes with a deep understanding of the technology, how it works, how to convert those signals, how to get it all into a production environment. A replay operator [from traditional sports] is going to come in and look at Fortnite and [not understand] what the producer’s asking for. So you need to have a replay system that works for the 20-year-old who knows the game and knows exactly what angle the producer’s asking for.