Live From Tokyo Olympics: OBS CTO Sotiris Salamouris on the Move to UHD, HDR, IP, and Immersive Audio
When the 2018 PyeongChang Winter Games ended, OBS CTO Sotiris Salamouris and his team laid out an ambitious vision for the 2020 Tokyo Games. Not only did they want to transition from SDI to IP, but they also wanted to go all UHD and HDR.
It was an ambitious plan, but the team has pulled it off, despite a pandemic-mandated delay that may have allowed more time for testing and development but also meant less time to get ready for the 2022 Beijing Winter Games. Even with multiple years to plan, there is little in the way of wiggle room because there is always another Olympics to plan for, new technology to embrace, and new projects to launch to better meet the needs of rightsholders.
When the Games were delayed by a year, he notes, it gave the team a bit more time to do testing with broadcasters. But the team tried hard not to change the scope of what it was looking to accomplish as it dealt with other issues, including logistical challenges and, of course, the pandemic.

“When the postponement happened,” says Salamouris, “we already had about 150 people here in Tokyo. It was a project in and of itself just to figure out how to freeze things and then get people back home. We needed to deal with the situation here, coordinate with Beijing, and then also deal with new requirements from broadcasters, who were all concerned with how they would work remotely and their space inside the IBC.”
“It was difficult,” says Salamouris, “because, in order to be ready for the event, we had a very well-coordinated plan. And we had two areas of difficulty. One was international logistics, which were also affected by the pandemic, with frequent flight changes and cancellations. In addition, as though one bad option was not already enough, sea transport was experiencing challenges due to the overall backlog created by the Suez Canal blockage some time ago.”
And then there were pandemic issues, he adds. Because OBS brings thousands of international staff in to work the Games, the Japanese authorities and the Organizing Committee established a thorough regime to safeguard the health and safety of the international personnel and, of course, the local population. This famous “playbook” mandated rigorous testing and other measures to minimize infection in the Games environment.
Salamouris notes that there were some disruptions in OBS operations. Contact tracing on the plane to Tokyo, for example, required some personnel to quarantine, making them unavailable to work. “We had Plan B’s to address such eventualities,” he says. “It is impossible to fully predict the impact you may face if someone from your personnel, who have varied and sometimes unique skillsets, needs to quarantine.
“We have the playbook,” he continues, “and people were tested and tested. The reality was that the percentage of positives found was extremely low. However, there were cases where people may have been in the vicinity of suspicious cases or even positives and then had to quarantine. Suddenly, you have people that are essential to your team disappearing, sometimes for two weeks.”
Despite the travails, dozens of rightsholders, thousands of production professionals, and thousands of athletes and volunteers are onsite for the second week of the Tokyo Games. And, although technical innovation may be taking a backseat to the ongoing concerns around COVID-19, it’s important to look at some of the innovations that make these Games arguably the single most impressive technical achievement in the history of sports production.
Embracing the Cloud
As rightsholders reshaped their plans for the Tokyo Games, the OBS efforts around cloud-based services started to become more important. “They became a priority,” says Salamouris, “as they would allow rightsholders to do more remotely and operate from wherever they are located.”
OBS cloud-based services are built on the Alibaba cloud platform, and Salamouris says it would not be easy to do without their support. “You need specialized support to do this, and we are able to get Alibaba’s attention.”
The more popular cloud services include Content+, Content+ Extra, and Live Cloud, although there are several others that OBS has developed on Alibaba’s public cloud, either for its internal consumption or for delivery to rightsholding broadcasters (RHBs). All three have become increasingly important because of the pandemic.
“We’ll produce more than 9,000 hours of content, 5,500 hours of which is live competitions, ceremonies, and other scenes and content from the venues,” says Salamouris. “The rest is postproduced. The point is, how do we make that accessible to broadcasters?”
The two main ways for rightsholders to access all that content are Content+ and Content+ Extra, which are basically the same service but with different access rights. Content+ gives access to all the postproduced content, such as features, interviews, and highlights. A file-based system, it gives the user the ability to download an entire clip or part of it or even do some editing in the cloud.
“Content+ Extra is the same thing but with access to growing files for competitions as they are happening,” explains Salamouris. “You can browse it and clip whatever you want and download it while the session is still on. You can’t feed it to distribution, but you can build your own highlights once you select what high-resolution file [is] sent to you.”
For rightsholders looking to do cloud-based distribution, there is Live Cloud, which makes all the video and audio signals available via IP packets streamed over the public internet. The whole process is controlled by cloud-based applications built and made available by OBS.
“They can select whatever they want from our available 75 HD and 46 UHD distribution channels,” Salamouris explains, “and they can get it wherever they are in the world over the public internet. The amazing thing is that we have confirmed that we have four broadcasters that are using this to transmit UHD.”
The signals are available at 100 Mbps per UHD feed, exceeding even the compression specs commonly used for UHD distribution via satellite.
“It goes from one part of the world to another with no packet loss, no breaks, and with latency that is similar to satellite transmission,” he points out. “That is great news, because one of the big things about UHD is the cost of getting the signals back home. This is a very cost-effective way of getting as much UHD as you want.”
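To put that bitrate in perspective, a quick back-of-envelope calculation (ours, not OBS’s) shows what a 100 Mbps contribution feed implies in raw data volume:

```python
# Back-of-envelope: data volume of a 100 Mbps UHD contribution feed.
# The 100 Mbps figure comes from OBS; the rest is simple arithmetic.

BITRATE_MBPS = 100  # per-UHD-feed rate quoted for Live Cloud

def gigabytes_per_hour(bitrate_mbps: float) -> float:
    """Data volume in gigabytes for one hour of a constant-bitrate stream."""
    bits = bitrate_mbps * 1e6 * 3600   # bits transmitted in one hour
    return bits / 8 / 1e9              # bits -> bytes -> gigabytes

print(f"{gigabytes_per_hour(BITRATE_MBPS):.0f} GB per feed-hour")  # 45 GB
```

At 45 GB per feed-hour, the appeal of a managed cloud pipe over per-event satellite capacity is easy to see.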
Why IP Matters for UHD
The move to IP has been intense, interesting, but ultimately very successful, Salamouris says, because OBS was able to combine the move to IP with the move to UHD. Native coverage is UHD with HDR and wide color gamut (WCG), and an HD SDR version is derived from that UHD HDR production.
“We wanted to move to UHD,” he notes, “and we knew that we could not scale a standard quad-SDI workflow to the volume of content we wanted. Since we are using a substantial fleet of existing OB units as well as fly-away systems (31 OB vans and 22 fly-away systems, quite often with multi-feed outputs), it was impractical and unnecessary to impose the exact type of internal technology these systems could use. Many of those had migrated to IP, but the majority were still based on quad-SDI, ‘legacy broadcast’ technologies for their internal signal routing.
“We had no issue with that,” he continues, “as long as they were engineered to support our UHD workflow, including, of course, our expectations for capacity and resilience. Each production unit, however, had to deliver a double UHD HDR version and a double HD SDR signal in parallel paths. From this demarcation onwards, HD and UHD followed independent paths and were based on totally independent technology stacks. We used our legacy contribution and distribution systems for HD, but, for UHD, we moved fully to IP.”
The UHD contribution (that is, what it takes to move UHD content from the venues to the IBC) is a combination of technologies based on SMPTE ST 2022, which Salamouris says has several advantages over ST 2110 at this stage.
“Within the IBC,” he notes, “all the signal routing and distribution is based on an SDN infrastructure carrying ST 2110, as is all the signal monitoring. We also have PTP timing that has worked very well, and we’re surprised at how robust the whole thing has been when it comes to networking. We’re very, very happy to establish ST 2110 and PTP as our basic technology.”
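The PTP timing he credits comes down to an exchange of four timestamps between a grandmaster clock and each device. As a rough illustration (not OBS code), this sketch shows the standard IEEE 1588 offset-and-delay computation, which rests on the assumption of a symmetric network path:

```python
# Illustrative sketch of the core math behind PTP (IEEE 1588), the timing
# protocol an ST 2110 plant relies on. Not OBS code.
# t1: master sends Sync; t2: slave receives it;
# t3: slave sends Delay_Req; t4: master receives it.

def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """Return (slave clock offset, one-way path delay) in seconds,
    assuming the path is symmetric -- the key assumption PTP makes."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Example: slave clock runs 50 us ahead; true one-way delay is 100 us.
offset, delay = ptp_offset_and_delay(0.0, 150e-6, 200e-6, 250e-6)
```

Once each device knows its offset, it disciplines its local clock toward the grandmaster, which is what lets every ST 2110 stream in the facility stay frame-aligned.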
The production efforts for the Tokyo Olympics are massive, deploying more than 1,050 cameras. About 70% of those cameras are broadcast, CCU/BPU-supported native UHD; the remainder are SDI-output cameras, mostly in native UHD but a few also in 1080p (there are no 1080i camera sources).
“An important innovation from our side,” he says, “was the combined/common live workflow for UHD HDR and HD SDR. We very soon realized that we had only one option for introducing UHD in the Olympics: to build a unified workflow that would deliver both UHD and HD from the same higher-quality format, which of course could only be UHD in BT.2020, with HLG HDR and WCG. Of course, such a single workflow always needs to guarantee premium quality in HD, since this is the format that the great majority of the world’s broadcasters still use. UHD also had to be visually impeccable; otherwise, its introduction would not make sense at all.
“To achieve all these quite aggressive and challenging goals,” he continues, “we had to develop a unique workflow that deviates substantially from the more common approaches to producing UHD alongside HD for live sports. We ended up developing three types of our own HDR look-up tables. We realized that, because our own needs are very specific, we had to create our own conversion tables.”
Each of the look-up tables (LUTs) has a specific purpose. One takes existing SDR sources, such as archival material or specialty cameras, and places them into the HDR domain. A second table is specifically designed for graphics needs. And then a final table helps convert UHD to HD.
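The LUT principle behind those conversions can be sketched in code. The example below is illustrative only, not OBS’s proprietary tables: it bakes the standard HLG transfer curve from ITU-R BT.2100 (the HDR format used for the Games coverage) into a 1-D table, so per-pixel conversion becomes a fast lookup, which is the same mechanism converters in the trucks use in real time. The table size is an assumption.

```python
import math

# Illustration of the LUT principle only -- OBS's three LUTs (SDR->HDR,
# graphics, UHD HDR -> HD SDR) are its own and are not published.
# Constants below are the HLG OETF parameters from ITU-R BT.2100.
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_oetf(e: float) -> float:
    """BT.2100 HLG OETF: scene linear light [0,1] -> non-linear signal."""
    return math.sqrt(3 * e) if e <= 1 / 12 else A * math.log(12 * e - B) + C

# Bake the curve into a table so conversion is a lookup, not a formula
# evaluated per pixel -- the real-time trick behind hardware LUT boxes.
SIZE = 1024
LUT = [hlg_oetf(i / (SIZE - 1)) for i in range(SIZE)]

def apply_lut(e: float) -> float:
    """Nearest-entry lookup; production gear would interpolate."""
    return LUT[round(e * (SIZE - 1))]
```

Real conversions (SDR sources into HDR, or the HDR program down to SDR) use 3-D tables over full RGB triplets, but the bake-then-look-up mechanism is the same.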
“We incorporated those LUTs in converters in the trucks, or the trucks themselves have their own ability to program the LUTs into equipment like the vision mixer,” says Salamouris. “But it was a very long process with very extensive testing and a rigorous certification phase every time that we were deciding to use existing truck resources.”
The result is a single workflow for all 50+ production units.
“They all have exactly the same workflows regardless of which sport they are doing or what venue,” he explains. “That has proved to be nice, as we have had zero issues with our UHD HDR output and our HD SDR output. It’s the same picture but enhanced in resolution due to 4K, brightness highlights wherever they exist (this is due to HDR), and color fullness wherever it exists (and this is due to WCG), which is what it should be.”
Each truck also has a visual expert who works with the shading team and is also in contact with a centralized VQC in the IBC, where experts make sure that the results across all the sports are dialed in similarly.
Along with improvements on the video side has been the move to 5.1.4 discrete immersive sound, which adds four channels above the listener to provide a sense of height. According to Salamouris, 5.1 surround sound, despite being available for some time now, has not caught on with viewers because it requires placing dedicated speakers around a room. He believes soundbars could change the equation, especially with 5.1.4.
“The technology in soundbars has developed so much and they are so sophisticated that they are close in quality to a dedicated surround-sound system,” he explains. “We think it is now worthwhile to enhance audio production. It makes a difference with the sound space on top of you, especially in sports, where you want to feel like you are there.”
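For broadcasters that can take only 5.1, a 5.1.4 mix is typically folded down by mixing the four height channels into the ear-level bed. The sketch below is purely illustrative: the channel names and the common -3 dB fold-down coefficient are assumptions, not the Games’ actual downmix.

```python
import math

# Hedged sketch: folding a 5.1.4 frame down to plain 5.1 by mixing each
# height channel into its nearest ear-level channel. The -3 dB (0.707)
# coefficient is a common convention, not a documented OBS value.
FOLD = 1 / math.sqrt(2)  # -3 dB

def downmix_514_to_51(frame: dict) -> dict:
    """frame maps channel names to one sample each.
    5.1 bed: L, R, C, LFE, Ls, Rs; heights: TFL, TFR, TBL, TBR."""
    return {
        "L":   frame["L"]  + FOLD * frame["TFL"],
        "R":   frame["R"]  + FOLD * frame["TFR"],
        "C":   frame["C"],
        "LFE": frame["LFE"],
        "Ls":  frame["Ls"] + FOLD * frame["TBL"],
        "Rs":  frame["Rs"] + FOLD * frame["TBR"],
    }
```

The point of carrying the four extra channels all the way through production is that this fold-down can happen at the very last step, so viewers with height-capable soundbars get the full effect while everyone else still gets a clean 5.1.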
Distributing as many as 4,000 microphones across the venues, he notes, enables the audio mixers to get sounds from close to the action but also makes things more complicated.
“The sound engineer has to really be able to follow a live event,” he says. “At any moment, there could be something that would give the audio some punch. And, with fewer people in the stands, there is more-dramatic sound reproduction.”
Coming into the Games, OBS and the production teams at the venues were armed with audio recordings of specific crowds to fill the empty spaces. But there has been a presence, whether it is the production teams or fellow teammates, coaches, and media.
“Clearly,” says Salamouris, “it’s a different experience in the venue. But there is a presence, and you can hear the more nuanced sounds of the sport itself. Although there were a lot of preparations to use virtual fan sounds, it hasn’t been necessary. We had it all prepared, but, in the end, we’ve had a good environment in terms of the existing sound.”
A Smaller, Greener IBC
For more than a decade, OBS has been working hard to figure out how to make it easier for rightsholders to have a smaller onsite presence. The pandemic caused that to happen in terms of personnel, but the physical space is also smaller: around 40,000 sq. meters vs. more than 50,000 sq. meters in Rio in 2016.
“It has been engineered to make that happen,” Salamouris explains. “One important way is the consolidation of all the technical spaces in our CTAs, or Centralized Technical Areas. That has helped a lot with the overall efficiency of space needed but is even more important in providing efficiency in the cooling and the power system.”
Before development of the CTA, each broadcaster’s space within the IBC would include not only a production area but also a machine room. The machine rooms tended to be as small as possible, so that the production team had as much space as possible. But tightly confined racks of equipment require plenty of air-conditioning to keep the gear cool. And, with machine rooms located throughout the IBC, that meant powerful air-conditioning throughout the facility.
“IBCs were often very cold inside, and people would be forced to wear a jacket,” says Salamouris. “But here the conditions are far better because we essentially have data centers with very robust and specialized cooling, along with very specialized power distribution. Everybody is sharing the same space, and each broadcaster has their own cage.”
The CTAs and all the efforts by OBS to create cloud and other services are also designed to shorten the setup time for an IBC that is creating more content than ever and in more-complex ways. Over the past 20 years, the time to set up the IBC has remained static. Salamouris and OBS hope to change that.
“We are always given just some weeks or some days to set up and do our job,” he says. “That is why the cloud is important to us: you can set up a workflow at any time and test it for months before the [event]. You can commission it, switch it down, and then switch it up before the Games. That is a concept that works for us.”
The Tokyo Games still have a few days left (and then there are the Paralympics), but upwards of 80 OBS team members are already in Beijing, working on the 2022 Winter Games, which are six months away.
“They have already started building the IBC in Beijing,” says Salamouris, “and the size of the team is going to grow during the next week.”
Winter Games typically have two IBCs: one in the mountain cluster and one near the indoor venues. The Beijing Games, he says, will feature the largest mountain-cluster IBC yet.
“It will be more than 5,000 sq. meters because many broadcasters have decided to be there because the snow sports happen to be more popular with their audiences,” he says. “We will also have a big number of studios up there with some very nice views of the ski jump.”
Reflections on Tokyo
“Of course,” says Salamouris, “it is still very early to celebrate or even congratulate ourselves [on the Tokyo Games]. Every single moment of live broadcast under the pressure of a major event represents a big challenge; many things can go wrong, and you always need a good portion of luck to get through unscathed, regardless of all the best preparations and plans.”
He notes there is never an inappropriate moment to express gratitude to the OBS engineering and technical team that has worked so hard and for so long to make this big, complex technical project a reality. “The pandemic has imposed another layer of difficulty, quite unprecedented, but it proved unable to bend the determination and morale of our fantastic people, even if some of them had to quarantine when we returned to Tokyo to begin our technical install again in March.
“As always,” he continues, “it is a people’s achievement. We are constantly indebted for their love of the Olympics and their determination to always deliver the best, despite any adversities.”