Live From Tokyo Olympics: OBS Head of Engineering Isidoro Moreno on the Move to UHD, HDR, and IP
As Head of Engineering for OBS (Olympic Broadcasting Services), Isidoro Moreno knows his tech, and at the Tokyo 2020 Games he is drawing on all of his skills as well as those of his team. Why? Because the Tokyo Olympics are not only the first ever to be all-IP but also the first to be all-UHD, all-HDR, and all-immersive-audio with 5.1.4 channels. Moreno discussed the new developments and more with SVG at the IBC earlier today.
Transitioning to UHD, HDR and IP all at once is a pretty big lift. How has it been going and what have you learned?
We are learning a lot, and it’s an opportunity for us to streamline our workflows. Here in Tokyo the number of new projects we have started is huge, and the number of services we offer to rights holders has increased by about 50 percent since the Rio Olympics in 2016. And those new services can only be achieved by applying new technologies.
For instance, in the IP world we have increased the capacity between the venues and the IBC. In London there was just one video feed per fiber, and the infrastructure was extremely big because you needed a lot of lines. But now with IP we can create trunks and aggregate services on a single fiber. We also have redundancies we didn’t have in the past, as we can carry two copies of a service and guarantee against failure.
Also, the transition from HD to UHD is gradual [for rights holders]. We have to separate the UHD signal from the HD signal in order to stay consistent with the service level we had in the past. We cannot force everyone to change to a new format, so IP is useful in offering both HD and UHD services.
At the same time, we also upgraded our Internet services so that, for example, athletes in the mixed zones can reach their own country using the Internet.
I was speaking with Dave Mazza at NBC, and he said it is amazing that you were able to find enough UHD trucks and facilities to go all UHD. How did you do that?
It was one of the bigger challenges, and finding enough trucks to do UHD was one of the first studies we did. We wanted to figure out whether we could do 100%, 50%, or something in between, and at the beginning it did not seem possible to cover 100%. But we had contracts with companies that were about to build trucks, and we were counting on them being built for UHD.
At the Opening Ceremony we used a UHD flyaway system that was designed specifically to do the opening ceremonies, and it was based on SMPTE ST 2110 IP. There is an opportunity to develop new systems that can be used for future games.
Did the one-year delay give you a chance to change anything?
We went to the market looking to see what could help us fulfill our needs. In UHD we saw a mature market that offered us what we were looking for. We had to offer services like splits in the cameras, replays, super slo mos, and RF systems so we were investigating how to upgrade those to UHD.
So, out of the 1,049 cameras we are using, very few are natively 1080p. And one thing we have learned [in tests] is that using native 1080i equipment and upconverting to UHD did not create content at the level we wanted. But when you start with 1080p, go up to UHD, and then come down to 1080i, the result is much better.
One important decision we made, which was risky, was to break with the past practice of running experiments in parallel. For example, at the 1992 Barcelona Olympics the HD production was a completely parallel production, as we didn’t want to interact with or interfere with the main production.
But here we are doing a single production with maximum UHD quality. All the elements are fully native UHD or [upconvert] at a good level. But we wanted to have full athletic coverage with the same number of cameras we had in the past and that has been super helpful to broadcasters.
How about working in HDR?
Once you understand HDR, it’s a big improvement. When we were talking about UHD SDR it was more pixels and bigger resolution, but we didn’t explore the whole world of colorimetry and wide color gamut.
We studied how to make HDR compatible with up- and down-conversion, as our main product is 1080i SDR. So, we have three transforms that keep as much of the UHD color spectrum as possible within the Rec. 709 color space used for SDR.
With things like country flags in graphics, colorimetry is important. We want to maximize the quality and user experience in HDR without compromising 1080i with incorrect, shifted colors. So, we created our own set of lookup tables (LUTs) that broadcasters can use free of charge. We want to make our pictures compatible with the personalized pictures they create with their own cameras.
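OBS distributes its LUTs as files for broadcasters to load into their own conversion gear; purely as illustration, here is a minimal sketch of how any 3D LUT of this kind (for example, the lattice stored in a .cube file) is applied to normalized RGB pixels using trilinear interpolation. The lattice size, the identity LUT, and the NumPy-based implementation are assumptions for the example, not details from the interview.

```python
import numpy as np

def apply_3d_lut(pixels, lut):
    """Apply a 3D LUT (shape N x N x N x 3) to RGB pixels in [0, 1]
    using trilinear interpolation over the surrounding lattice cell."""
    n = lut.shape[0]
    scaled = np.clip(pixels, 0.0, 1.0) * (n - 1)
    lo = np.floor(scaled).astype(int)          # lower lattice corner per channel
    hi = np.minimum(lo + 1, n - 1)             # upper lattice corner per channel
    frac = scaled - lo                         # position inside the cell
    out = np.zeros(pixels.shape, dtype=float)
    # Sum weighted contributions from the 8 corners of the lattice cell.
    for corner in range(8):
        idx = [(hi if (corner >> axis) & 1 else lo)[..., axis] for axis in range(3)]
        w = np.ones(pixels.shape[:-1])
        for axis in range(3):
            f = frac[..., axis]
            w = w * (f if (corner >> axis) & 1 else 1.0 - f)
        out += w[..., None] * lut[idx[0], idx[1], idx[2]]
    return out

# A 17-point identity LUT (a common .cube lattice size) as a sanity check:
# applying it must return the input pixels unchanged.
n = 17
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity_lut = np.stack([r, g, b], axis=-1)

px = np.array([[0.25, 0.5, 0.75]])
assert np.allclose(apply_3d_lut(px, identity_lut), px)
```

A real HDR-to-SDR LUT would encode the tone-mapping and BT.2020-to-Rec. 709 gamut mapping in the lattice values; the application step shown above stays the same regardless of what the LUT contains.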
It’s been a complex process and we are collaborating with broadcasters to better understand their needs. We have worked with NBC, BBC, NHK and the rest and we are really happy that we are working in the same direction. At the end of the day that is our mission: not what OBS wants but to help the broadcasters.
The past year has seen a lot of interest in live production in the cloud. What is your take on the cloud and what it means to the future of production here at OBS?
We need to support remote production for broadcasters for many reasons. The IBC cannot keep growing, and broadcasters can now receive content back home that in the past was limited or, if it was produced at the Olympics, could not be transferred back. And broadcasters demanded it even more during the pandemic, as it was critical for people to stay home, and with social distancing a space that once held 500 people could no longer do so.
That forced us to start thinking ahead to solve issues of remote working and the cloud system from Alibaba allowed us to have a platform from where we could make services available to broadcasters. We have new services like archive, where we ingest 9,000 hours of content and make it available to broadcasters.
In the past, getting access to content was difficult because you had to have a machine remotely connected to our video server, you had to use your own pipe, and so on.
But now we are uploading low-resolution proxies in a mezzanine format, which is perfect for digital platforms. And we opened the archive worldwide via a web browser, so users can search for content in our Content+ platform. In the past we hosted only the clips we produced, but now it hosts the whole archive: live sessions, feature clips, musical clips; it’s all available in the cloud.
We are also testing a cloud-based VandA where rights holders can load an application, switch between streams, and then receive a UHD stream over the Internet at 90 Mbps. It has been quite stable, and it’s very interesting.
Last question. The Beijing Games are only six months away. I am assuming you won’t be looking to do any major changes.
We want to keep things in Beijing as much the same as possible, but obviously we will refine things if we need to. It’s a special situation: usually we have a year and a half, and now we have only months. We do have some things on our roadmap, but we can’t apply them in Beijing as it would be too risky.