Inter BEE 2017 Highlights 8K, 360 Tech
NHK offers peek at production unit for 2018 Winter Olympics coverage
The 2017 Inter BEE Conference in Japan highlighted technology developments on the show floor and also featured a mix of Japanese- and English-speaking panelists discussing the future of the industry.
Not surprisingly, NHK’s efforts around 8K were of interest around the show floor and in the sessions. Yuji Yamana, senior manager, Planning Division, Engineering Administration Department, cites 8K cameras offered by such manufacturers as Ikegami, Sony, Astro Design (which introduced an 8K camcorder at the show), and Hitachi as examples of manufacturing momentum. It’s the kind of momentum that, in a few years, will lead to increased competition in the marketplace, which will spur development of TV-production-equipment features and also drive costs down.
“We will start satellite broadcasting of 8K beginning on Dec. 1, 2018,” says Yamana. “Now we are preparing our equipment, productions, and transmission [facilities].”
NHK’s booth at the show was highlighted by a full-size 8K remote-production unit. NHK will have two of the units at the Winter Olympics in PyeongChang in February, and they will be the backbone of eight hours of live 8K/HDR coverage each day.
“We will have a total of four OB 8K trucks by the end of 2019,” he adds. “We also have three audio trucks and are building 8K studios inside NHK headquarters.”
The workflows are currently baseband because IP-based 8K transport is not yet reliable enough. But the next trucks are expected to carry a mix of IP-based and 12-Gbps SDI signal transport. In addition, compression formats like TICO and VC-2 are under consideration to lower the current 48-Gbps data rate per signal.
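A back-of-the-envelope check makes those numbers concrete. The parameters below are assumptions for illustration (7680×4320 at 60 fps, 4:2:2 10-bit sampling, a nominal 4:1 mezzanine compression ratio), not NHK's confirmed production spec:

```python
# Rough 8K bandwidth check (assumed parameters: 7680x4320, 60 fps,
# 4:2:2 chroma subsampling, 10-bit samples).
WIDTH, HEIGHT, FPS = 7680, 4320, 60
BITS_PER_PIXEL = 20          # 4:2:2 at 10 bits: 10 (luma) + 10 (shared chroma)

raw_gbps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL / 1e9
print(f"raw video payload: {raw_gbps:.1f} Gbps")        # ~39.8 Gbps

quad_12g_sdi = 4 * 12        # quad-link 12-Gbps SDI, as mentioned above
print(f"quad-link 12G-SDI capacity: {quad_12g_sdi} Gbps")

# A lightly compressed mezzanine codec (an assumed ~4:1 ratio, in the
# range targeted by TICO/VC-2-class codecs) would fit a single 12G link:
print(f"at 4:1 compression: {raw_gbps / 4:.1f} Gbps")   # ~10.0 Gbps
```

The raw payload sits just under the 48-Gbps quad-link ceiling, which is why even modest mezzanine compression is attractive for moving 8K onto fewer links.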
One of the exciting developments on the floor was a demonstration at the Canon booth of a product with the working title Free Viewpoint Video System. The system is the manufacturer’s effort to create an alternative to the Intel freeD system. Both systems require more than 30 cameras placed around a stadium or arena to capture the action on the field from 360 degrees. The goal is to capture enough information for an operator to create a replay showing what happened from all angles via a virtual camera that spins around the action.
The Canon approach is different from Intel’s, placing a greater emphasis on creating 360 replays that have moving video rather than a freeze frame.
“The system uses 30 cameras around the stadium to shoot the game simultaneously, and synchronization is very important,” says Tsuyoshi Wakazono, PhD, assistant manager, Canon SV Business Development Project. “Once we have those images, we create 3D models of the players and objects, and then we can paste the images around those frames.”
The result is a full-motion–video replay that allows a virtual camera to sweep around and dive anywhere into the field of play. The demonstration at the booth featured shots from within an actual soccer game that seemed to have been captured by a camera operator running around the pitch with a Steadicam, weaving in and out of players, and ultimately having the ball hit straight into the camera.
The technology is still very much in its early days, and rendering out the replays takes a day, but the Canon team is hard at work looking to reduce that turnaround time to make the technology more practical.
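As a toy illustration of the virtual-camera idea only (this is not Canon's actual system; all names and numbers here are hypothetical), the sweep of a free-viewpoint replay can be thought of as a camera path orbiting the action. In the real pipeline, the view at each position would be rendered from textured 3D player models built from the synchronized camera feeds:

```python
import math

def orbit_path(center, radius, steps):
    """Virtual-camera positions on a circle around `center` (a toy stand-in
    for the sweeping replay path; real systems render each view from
    3D models reconstructed from ~30 synchronized cameras)."""
    cx, cy = center
    return [
        (cx + radius * math.cos(2 * math.pi * i / steps),
         cy + radius * math.sin(2 * math.pi * i / steps))
        for i in range(steps)
    ]

# Eight stops on a 25-m orbit around the center of the pitch
path = orbit_path(center=(0.0, 0.0), radius=25.0, steps=8)
for x, y in path:
    print(f"virtual camera at ({x:6.1f}, {y:6.1f})")
```

The hard parts the Canon team describes, synchronization and 3D reconstruction, all happen before such a path is rendered, which is why the turnaround time is currently measured in days.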
A Focus on Storage and Global Delivery of Multiple Versions
Among the highlights off the floor was a presentation by BBC Head of Production Standards Andy Quested on the Digital Production Partnership (DPP) and the use of IMF to streamline storage of programming destined for delivery around the globe via broadcast, cinema, or streaming.
“You need a high-quality mastering format that can be extended beyond a three- to five-year window,” he explained. “Different versions of the same content are our future, and, as a public service, we are under pressure to produce more and more content for less and less money. And the audience expects content delivery on any device.”
He noted that the 50th-anniversary episode of Doctor Who was delivered in 80 editorial and technical versions around the world. There were six core versions: three in 2D at 1080p at different frame rates and three in 3D HD, each delivered in five languages, plus various large-bandwidth cinema versions. Because the episode ran 77 minutes, the storage required by all those formats was substantial: the TV versions required 4 TB, the cinema versions 67 TB, for a total of 71 TB.
“And 90% of the content is repeated. There had to be a better way to maximize efficiency,” Quested said. “So the BBC, along with DPP and the EBU, looked at creating an Interoperable Master Format, or IMF.”
IMF is a business-to-business, high-quality mastering format in which a program is divided into components (opening credits, the show itself, end credits, etc.). An XML document then brings those components together into a “composition playlist” (CPL) that makes it easier to deliver more-personalized content to a TV, cinema, or streaming distribution partner.
“The CPL looks like a nonlinear-editing project,” Quested added, “but those systems are proprietary while IMF is a set of standards devoted to mastering and versioning content.”
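A loose sketch of the idea (this is not the SMPTE IMF schema; the component names and filenames below are hypothetical) is that a CPL-like playlist is just an ordered list of references into a pool of shared components, so each delivery version stores only what is unique to it:

```python
# Pool of stored components; the main programme is shared by every version.
# (Illustrative names only, not a real IMF package layout.)
components = {
    "opening_credits": "titles_en.mxf",
    "programme":       "episode_main.mxf",   # the 90%-repeated content
    "end_credits":     "credits_en.mxf",
}

# A CPL-like structure: an ordered list of component references
# describing one delivery version.
uk_tv_version = ["opening_credits", "programme", "end_credits"]

# Resolving the playlist yields the media sequence for that version.
playlist = [components[name] for name in uk_tv_version]
print(playlist)
```

A French cinema version would reference the same `programme` component with different credits, which is how the repeated 90% of the content is stored only once.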
The impact on storage requirements is dramatic. For a program like Doctor Who, the overall storage needs drop from 71 TB to just under 3 TB.
“That is a saving of 96% and is why IMF is an incredibly powerful tool,” he said. “Our trial pilot for IMF was the series Planet Earth II.”
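The arithmetic behind those figures checks out:

```python
# Storage figures quoted for the Doctor Who anniversary episode:
# 4 TB of TV versions + 67 TB of cinema versions, versus just under
# 3 TB once shared media is stored only once via IMF.
tv_tb, cinema_tb = 4, 67
total_tb = tv_tb + cinema_tb
imf_tb = 3

saving = (total_tb - imf_tb) / total_tb
print(f"total: {total_tb} TB, saving with IMF: {saving:.0%}")  # → 96%
```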
The first Planet Earth series was created in 2005, an era when tape-based HD workflows were the only way to create deliverables. Tape-to-tape copying was a nightmare, according to Quested. Ten years later, when Planet Earth II was created, IMF and file-based workflows took the place of more-laborious methods.
“When the transmission version [was created], there wasn’t a tape in sight,” said Quested. “And we experimented with a new tool called NEST IMF that exported different versions on demand. Also, if we sent a program and there was a mistake or a shot had to be replaced, all we needed to do was send the replacement media and the XML file. It’s a powerful delivery tool.”