Live From Super Bowl LIV: THUMBWAR’s Michael Drazin on a Historic HDR Workflow

Plenty of planning, regular-season testing lead to 1080p HDR production

The first UHD HDR Super Bowl production is only hours away, and with it comes a new era in Super Bowl broadcasting. Like many firsts, the UHD HDR production of Super Bowl LIV is the result of months of planning and, arguably, years of experimentation not only by Fox Sports but by an entire industry.

Key to the Fox Sports efforts is a plan overseen by Michael Drazin, VP, special projects, THUMBWAR. The production format is 1080p HDR, with the HDR portion relying on the HLG BT.2100 (hybrid log-gamma) format, which was developed by the BBC and NHK.

“The goal of what we are doing,” he says, “is to make it so that production should have no idea they are working in HDR except that it looks better; all of the elements that are available to an SDR show are available to an HDR show.”

THUMBWAR’s Michael Drazin is playing a key role in Fox Sports’ HDR Super Bowl efforts.

That is easier said than done because it requires plenty of planning. For those looking to try an HDR production, Drazin says, that planning is where it all begins.

“The biggest thing is planning and understanding what you want to do, how you are going to do it, and why,” he says. “And you have to make that plan before you get to the parking lot. If you have a plan in place when you get there, you can execute. But you really need to stop and think through the entire production.”

For Fox Sports, that planning began in the preseason and continued through the entire regular-season schedule of Thursday Night Football contests. Those preseason and regular-season games allowed not only Fox Sports but also Game Creek Video and other partners to sort through workflows. The workflow begins by using only the HDR signal coming out of the cameras.

“All the HDR cameras are set in HLG BT.2100, and we are taking only HDR out of the CCUs,” says Drazin. “We are not using any SDR from the CCUs.”

Any source that originates as SDR is run through an AJA FS-HDR real-time HDR/wide-color-gamut converter. The production is using BBC lookup tables (LUTs) to tone-map signals from SDR to HDR. The tone-mapped signals are then available via each truck’s router to all replay devices, monitors, and switchers. That allows the PylonCams and other specialty cameras to be tone-mapped and integrated into the production alongside the native HDR cameras.
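
As a rough illustration of that signal path, the logic amounts to passing native HLG camera feeds through untouched and converting everything else on the way in. The sketch below is conceptual only, with hypothetical function and parameter names; it is not the actual FS-HDR or router configuration.

```python
# Conceptual sketch of the per-source decision described above; names are
# hypothetical, and the actual routing lives in hardware, not Python.
from typing import Callable, Sequence


def prepare_for_router(
    signal: Sequence[float],
    is_native_hlg: bool,
    sdr_to_hlg: Callable[[float], float],
) -> list[float]:
    """Return an HLG BT.2100 signal ready to distribute to replay,
    monitors, and the switcher."""
    if is_native_hlg:
        return list(signal)                 # HDR straight out of the CCU
    return [sdr_to_hlg(v) for v in signal]  # e.g., a PylonCam, via the LUT
```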

“BBC and NHK are the founding fathers of HLG,” says Drazin. “The BBC’s lookup tables follow the science behind HDR and are available under license to the industry. We have seen many manufacturers and broadcasters license their work, and it has become the de facto standard. The team at the BBC has done excellent work and has actively collaborated with us as they have continued to improve the color science.”

There are about 100 AJA FS-HDRs across the Fox compounds between Hard Rock Stadium and South Beach. In 1080p, each converter can handle four signals and provides color correction where needed.

“Having all of the handoffs in HDR between the trucks and sites streamlines the entire operation, as there aren’t different iterations of signals floating around,” says Drazin. “That was a decision we made months ago, and it has made things efficient and is paying dividends.”

Making Things Cohesive
One challenge is bringing all the different elements into a single cohesive production. The industry has learned that two types of conversion are needed to get from SDR to HDR. SDR cameras use a scene-light conversion, in which the science closely resembles how a camera sensor works. Finished SDR products, such as graphics or video elements, are tone-mapped with a display-light conversion. Currently, the LUTs are doing what is known as a direct conversion, in which the SDR is mapped into the HDR space without any expansion. That enables SDR content to make a round trip, going through the HDR production and then returning as SDR.

“Utilizing two different LUTs to get from SDR to HDR, as well as the 75% anchor point, we can build a cohesive production in both HDR and SDR,” says Drazin.
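
As a rough sketch of what a scene-light direct conversion anchored at 75% looks like, the Python below builds one from the published BT.709 and BT.2100 transfer functions. It is an illustration of the idea only, not the licensed BBC LUTs or the FS-HDR firmware; it ignores the accompanying BT.709-to-BT.2020 color-gamut conversion and assumes the common convention that SDR peak white lands at 75% of the HLG signal range.

```python
# Illustrative scene-light direct map from SDR (BT.709) to HLG (BT.2100).
# Constants come from the published transfer functions; everything else is
# a simplification for the sake of the example.
import math

# HLG OETF constants from ITU-R BT.2100
A = 0.17883277
B = 1.0 - 4.0 * A
C = 0.5 - A * math.log(4.0 * A)


def bt709_inverse_oetf(v: float) -> float:
    """SDR code value (0..1) -> normalized scene light (0..1)."""
    if v < 4.5 * 0.018:
        return v / 4.5
    return ((v + 0.099) / 1.099) ** (1.0 / 0.45)


def hlg_oetf(e: float) -> float:
    """Normalized scene light (0..1) -> HLG signal (0..1)."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C


def hlg_inverse_oetf(v: float) -> float:
    """HLG signal (0..1) -> normalized scene light (0..1)."""
    if v <= 0.5:
        return v * v / 3.0
    return (math.exp((v - C) / A) + B) / 12.0


# Scene-light scale chosen so SDR 100% lands exactly on the 75% anchor.
SDR_ANCHOR = hlg_inverse_oetf(0.75)   # roughly 0.265 of HLG nominal peak


def sdr_to_hlg_direct(v_sdr: float) -> float:
    """Direct-map one SDR code value into the HLG signal range."""
    scene_light = bt709_inverse_oetf(v_sdr) * SDR_ANCHOR
    return hlg_oetf(scene_light)


if __name__ == "__main__":
    for v in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"SDR {v:.2f} -> HLG {sdr_to_hlg_direct(v):.3f}")
    # SDR 1.00 maps to HLG 0.750: peak SDR white sits at the anchor,
    # leaving the signal range above 75% free for genuine highlights.
```

Because the direct map does no expansion, tone-mapped SDR elements can round-trip through the HDR production and come back out as the same SDR picture.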

Although the entire production benefited from the regular-season contests, the video operators, who are responsible for shading the cameras, had an opportunity to see both how HDR let them be more creative and how their work looked when returned to SDR via inverse tone mapping.

“Working with the video teams,” says Drazin, “we learned that they preferred to shade in HLG on an HDR monitor with an SDR monitor next to them with a predictive downconvert. This enables them to see how it will look when the signals go to air. We have seen the video teams quickly become accustomed to this workflow.”

The goal with the inverse tone mapping to SDR, he adds, is to maintain the shader’s artistic intent.
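
Conceptually, that predictive downconvert is just the inverse of the direct map sketched earlier. The snippet below reuses the helpers from that sketch and is, again, an illustration rather than the production LUTs, which would typically roll highlights off more gently than the hard clip shown here.

```python
# Illustrative predictive downconvert from HLG back to SDR, reusing
# hlg_inverse_oetf, SDR_ANCHOR, and sdr_to_hlg_direct from the earlier
# sketch. Real converters handle highlights and gamut far more carefully.

def bt709_oetf(l: float) -> float:
    """Normalized scene light (0..1) -> SDR code value (0..1)."""
    if l < 0.018:
        return 4.5 * l
    return 1.099 * l ** 0.45 - 0.099


def hlg_to_sdr_predictive(v_hlg: float) -> float:
    """Map one HLG code value back to SDR, clipping anything brighter
    than the 75% anchor (i.e., the true HDR highlights)."""
    scene_light = hlg_inverse_oetf(v_hlg) / SDR_ANCHOR
    return min(1.0, bt709_oetf(min(scene_light, 1.0)))


# Direct-mapped SDR content comes back unchanged, which is what lets
# graphics and clips survive the HDR production and return to air in SDR.
assert abs(hlg_to_sdr_predictive(sdr_to_hlg_direct(0.5)) - 0.5) < 1e-6
```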

The other big question facing an HDR production is monitoring for the entire team. Do you need to replace all the SDR monitors with HDR-capable versions, or can most of the team get by looking at HDR on an SDR monitor? For Super Bowl LIV, all the multiviewers in the compound are SDR monitors fed HDR signals, while the preview and program monitors in the production trucks show an inverse-tone-mapped feed, so operators see the SDR picture that is going to air.

Improvements Ahead
Drazin says the current state of HDR is similar to the early days of HD, when the 4:3 needs of SD viewers trumped the potential ways the 16:9 screen could change the nature of production.

“Right now,” he notes, “we are protecting for SDR like we used to protect for 4:3. When we reached a tipping point in the market, the graphics slid out to 16:9, and we don’t even think about 4:3 anymore. How long will it be before we get to that point in HDR? Is it six months, a year, or 18 months? I