SMPTE 2017 Conference Addresses Practical Concerns Surrounding SMPTE 2110 Standard, UHD
Annual gathering of entertainment engineering pros explores the recently approved standard, 8K acquisition and processing, and much more
With temperatures soaring outside, the Society of Motion Picture and Television Engineers (SMPTE) tackled the hottest topics facing the entertainment industry today — most notably, the recent approval of the first standards within the SMPTE 2110 suite for professional media over managed IP networks. Advances in UHD acquisition and processing; content management, storage, and security; virtual, augmented, and mixed reality; and many other topics also had their moment in the sun during this week’s annual Technical Conference and Exhibition in Hollywood.
“We are very excited about this,” said SMPTE President Matthew Goldman in his opening remarks on Tuesday. “The first few parts of the SMPTE 2110 standards suite have been approved. Going forward, it’s going to change the industry in a way that we’ve seen previously [with the transition from] analog to digital and from tapes to files. … Many of the media industry’s leading vendors will show attendees that carrying video, audio, and the ancillary data over IP networks is not only possible but practical and realizable, thanks to the SMPTE 2110 standards.”
An in-depth discussion of the current state of the standards suite took center stage during the Media Infrastructure session on Tuesday. Imagine Communications’ Leigh Whitcomb presented an aptly timed paper titled “Is SMPTE ST 2110 the New Standards Superpower?” to a standing-room-only crowd, followed by Evertz Microsystems’ Paul E. Briscoe, who discussed the issues of synchronization and timing in SMPTE 2110.
“[SMPTE] 2110 meets the requirements that we need to move forward,” said Whitcomb. “It’s an open standard; the standard is very, very close to being fully published; [and] we have a high degree of interoperability. I think 2110 will be the next super-power standard that we’ll have for a long, long time to come.”
Two sessions focused specifically on advances in UHD, the first delving into display technology and the second examining acquisition and processing. Among the presentations in Tuesday’s Advances in Display Technology session was an inside look at overcoming the engineering challenges of broadcasting live UHD content from the International Space Station, presented by Rodney P. Grubbs of NASA’s Marshall Space Flight Center and Sandy George of Science Applications International Corp. (SAIC). Wednesday’s UHD, Bigger, Faster, Better — About Acquisition and Processing session explored recent technological developments in UHD and the impact of high dynamic range, extended color space, high frame rates, and more.
“UHD TV really [has come] a long way,” said session Chair Dr. Hans Hoffman, head of media fundamentals and production technology for the European Broadcasting Union, in his introductory remarks. “Years ago, we were talking about the very fundamental research topics. Today, what you’re going to see in this session is, we’ve reached our practical, operational questions. We see it’s coming out of the innovation curve and coming into the operational curve, and that is good to see: a new television system entering the market, being accepted in the whole end-to-end chain, which, at the end of the day, provides a new experience to the viewers at home.”
Interestingly, on Wednesday, the second of the two UHD, Bigger, Faster, Better sessions focused on advances in 8K and presented very different viewpoints on the technology’s future. NHK’s Tomohiro Nakamura, Takahiro Yamasaki, Ryohei Funatsu, and Hiroshi Shimamoto collaborated on a paper titled “An 8K Full-Resolution 60-Hz/120-Hz Multiformat Portable Camera System,” detailing the Japanese broadcaster’s current work with Super Hi-Vision and HDR for live coverage of such sports as soccer, baseball, and sumo wrestling. Conversely, a paper titled “Beyond 4K: Can We Actually Tell Stories in Motion Pictures and Television in 8K?” by Creat3’s Pierre Hugues Routhier laid out the limitations of 8K from a cinematography perspective.
“For a medium or a medium-wide shot,” he explained, “even at 120 frames per second in 8K, if you’re walking or something is moving faster than one-tenth of a normal walking pace, it’s going to be out of focus on most shots. That means your camera needs to be locked off, and you need to have a very static environment. It’s beautiful in 8K, but it’s very hard to tell a story with a locked-off camera and people not [moving].”
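Routhier’s point can be illustrated with some back-of-the-envelope arithmetic. The sketch below is not from his paper; every number in it is an assumption chosen for illustration (a roughly 5 m frame width for a medium shot, a 1.4 m/s walking speed, and a 180-degree shutter at 120 fps), but it shows why subject motion that is harmless at HD resolutions starts to smear across multiple pixels at 8K.

```python
# Hypothetical motion-blur estimate for an 8K medium shot.
# All scene values below are illustrative assumptions, not figures
# from Routhier's presentation.

H_RES_PX = 7680        # 8K horizontal pixel count
FRAME_WIDTH_M = 5.0    # assumed real-world width framed by a medium shot
WALK_SPEED_M_S = 1.4   # assumed normal walking speed
EXPOSURE_S = 1 / 240   # 120 fps with a 180-degree shutter

def blur_px(speed_m_s: float) -> float:
    """Pixels a subject smears across during a single exposure."""
    return speed_m_s * EXPOSURE_S / FRAME_WIDTH_M * H_RES_PX

print(f"full walking pace: {blur_px(WALK_SPEED_M_S):.1f} px of blur")
print(f"one-tenth pace:    {blur_px(WALK_SPEED_M_S / 10):.1f} px of blur")
```

Under these assumptions, a subject at full walking pace smears across roughly nine pixels per exposure, while one-tenth of walking pace stays near the single-pixel limit, which is consistent with the constraint Routhier describes: either the scene stays nearly static or the per-pixel sharpness that justifies 8K is lost.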
Sessions on creating, transmitting, and displaying high-quality content also offered an in-depth look at content security and how to maintain positive control of media. Callum Hughes of Amazon Studios discussed security within a digital-asset–management (DAM) system, arguing that, while a cloud-based approach has many benefits, it must be treated no differently from an on-premises solution. During the session on stream privacy, Ericsson’s Raj Nair described mechanisms for guaranteeing stream privacy in both OTT and live/linear adaptive-bitrate (ABR) workflows.
“Frankly, there is no real difference to secure your content in the cloud [vs.] your data center,” said Hughes. “The big misconception is that the cloud is like a panacea that’s going to solve all your problems and that everything that you’re kept awake at night thinking about will magically go away. That’s not true. You have to consider the cloud to be just like your data center.”
Conversations around advances in immersive storytelling — virtual reality, augmented reality, mixed reality, and 360 video — took place throughout the four-day conference. But an Opening Keynote by VR/AR/MR expert Andrew Shulkind on the future of immersive storytelling — both its obstacles and its opportunities — defined for attendees the potential impact that these technologies will have on Hollywood.
“We have a responsibility to bring our pedigree and our influence to this nascent market,” he said. “What film, television, and advertising have done so well — pulling at the heartstrings and making us care about individual stories — immersive media can do even better. Making successful, interactive, quality programming is central to creating the kind of empathy that can change the world. This doesn’t mean every movie is now in 360 and you have to program every script with a branch of narratives in every direction. But it means that storytellers, directors, writers, content makers, advertisers have more-effective tools than ever before to motivate viewers to connect with characters and have alternative [experiences].”
The SMPTE 2017 Annual Technical Conference and Exhibition drew more than 2,500 registered attendees to Hollywood and featured 70 expert presentations and 105 exhibitors in two nearly sold-out exhibit halls. Prior to the conference’s official start on Tuesday, the SMPTE 2017 Symposium provided a day-long look at “Artificial Intelligence (AI) and Machine Learning (ML) in Digital Media Creation: The Promise, the Reality, and the (Scary?) Future,” co-chaired by SMPTE Fellow Michelle Munson, co-founder of Aspera, and Yvonne Thomas, product manager at Arvato Systems.