The Elephant in the Room: 3D at NAB 2010
As I roamed the exhibits at the NAB show this month, I kept wondering what other year it seemed most like. And I was not alone.
There were plenty of important issues covered at the show, from citizen journalism to internet-connected TV. And then there was the elephant in the room.
It would be a lie to say that 3D technologies could be found at every booth on the show floor. But it was probably the case that there was 3D in at least every aisle. There was so much 3D that it tended to diminish all other news.
In acquisition technology, for example, LED lighting was near ubiquitous, with focusable instruments, such as the Litepanels Sola, sometimes painfully bright. Panasonic and Sony both showed models of future inexpensive video cameras with large-format imagers, and Aaton joined the range of those offering “digital magazines” for film cameras. In small formats, GoPro’s Hero is a complete HD camcorder weighing just three ounces.
In storage technology, Cache-A, For-A, IBM, and Sony all demonstrated with new offerings that tape is not dead. Meanwhile, iVDR removable-hard-drive storage could be seen in several new products, and Canon introduced new camcorders based on Compact Flash cards.
Cinedeck looks like a viewfinder but includes built-in storage and editing capability. NextoDI’s NVS 2525 can copy either P2 or SxS cards.
In processing, Dan Carew’s Indie 2.0 blog said of Blackmagic Design’s DaVinci Resolve 7.0, “this best-in-class color correction software was formerly US$250,000 (for software and hardware) and is now available in a Mac software-only version for US$995” (http://indie2zero.com/2010/04/16/what-i-liked-and-saw-at-nab-2010/). Immersive Media’s 11-camera spherical views can now be stitched and streamed live. NewTek’s TriCaster TCXD850 can deal with 22 inputs and virtual sets. And, though you might not yet be able to figure out why you’d want this capability, Snell’s Kahuna 360 production switcher can deal with up to 16 shows at once.
In wireless distribution, there was VµbIQ’s 60 GHz uncompressed transmitter on a chip and Streambox’s Avenir for bonding up to four cellular modems to create a 20 Mbps channel. In wired, there was Pleora’s EtherCast palm-sized bidirectional ASI-IP gateways. And, in technologies that could be applied to either, there were Fraunhofer’s codec with a latency of just one macroblock line and a Harris-LG/Zenith proposal for expanding ATSC mobile transmission to full-channel use.
In presentation, there was a reference picture monitor from Dolby (seen in almost its final form at the HPA Tech Retreat). Several booths had OLED monitors, from 7-inch at Sony to 15-inch at TVLogic. Wohler’s Presto router has an LCD video display on each button. And Ostendo’s CDM43 is a curved monitor with a 30:9 aspect ratio.
That barely scratches the surface of the non-3D news from NAB. And then there was 3D.
Even All-Mobile Video’s Epic 3D production truck, parked in Sony’s exhibit, wore 3D glasses. But it was the glasses on visitors to the truck that proved more instructive.
Sony provided RealD circularly polarized glasses to visitors for looking at everything from relatively small monitors to a giant outdoor-type LED display. As soon as those visitors entered the control room of AMV’s Epic 3D truck and donned their glasses, however, they saw ghosting — crosstalk between the two eye views. AMV staff were prepared for the shocked looks. “Sit down,” they said. “There’s a narrow vertical angle, and you have to be head-on to the monitors.” Sure enough, that solved the problem — at least for those who could sit.
Another potential 3D problem was mentioned in the two-day 3D Digital Cinema Summit before the show opened. If 3D is shot for a small screen and blown up to cinema size, it can cause eye divergence. 3ality’s camera rigs indicate when this might happen, but it happened anyway on at least one cinema-sized screen at NAB, leading to some audience queasiness.
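The divergence risk comes down to simple arithmetic. On-screen disparity (the horizontal offset between left- and right-eye images) scales with screen width, and once background disparity exceeds the typical adult interocular distance of roughly 65 mm, the eyes would have to rotate outward to fuse the image. The sketch below is a hypothetical illustration of that scaling, not 3ality’s actual rig calculation; the disparity fraction and screen sizes are assumed for the example.

```python
# Hypothetical sketch: why 3D framed for a small screen can force eye
# divergence when blown up to cinema size. Physical on-screen disparity
# scales linearly with screen width; positive disparity wider than the
# interocular distance (~65 mm) requires the eyes to diverge.

INTEROCULAR_MM = 65.0  # typical adult eye separation

def max_disparity_mm(disparity_fraction: float, screen_width_mm: float) -> float:
    """Physical on-screen disparity for a given fraction of screen width."""
    return disparity_fraction * screen_width_mm

# Suppose the stereographer allowed background disparity of 1% of
# screen width -- comfortable on a living-room TV.
fraction = 0.01

tv_width_mm = 1018.0       # ~46-inch 16:9 TV
cinema_width_mm = 12000.0  # ~12-meter cinema screen

tv_disp = max_disparity_mm(fraction, tv_width_mm)
cinema_disp = max_disparity_mm(fraction, cinema_width_mm)

print(f"TV disparity: {tv_disp:.1f} mm (diverges: {tv_disp > INTEROCULAR_MM})")
print(f"Cinema disparity: {cinema_disp:.1f} mm (diverges: {cinema_disp > INTEROCULAR_MM})")
```

The same 1% disparity that yields a harmless 10 mm on a living-room TV becomes 120 mm on a 12-meter screen, nearly double the interocular distance, which is why the queasiness appeared only on the cinema-sized screen.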
Buzz Hays of the Sony 3D Technology Center says making 3D is easy, but making good 3D is hard. There was a lot of 3D at NAB, including both easy and hard, good and bad.
It was hard to count the number of side-by-side and beam-splitter dual-camera rigs at the show, but, in addition to those, there were integrated (one-piece) 3D cameras and camcorders, in various stages of readiness, from 17 different brands, both on and off the show floor. It seems that all of them were said to be “the first.”
Much could be learned about 3D at the two-day Digital Cinema Summit before the show opened. It began with Sony’s Pete Lude showing that an ordinary 2D picture can seem 3D when viewed with just one eye, leading a later speaker (me) to quip that watching with an eye patch, therefore, is an inexpensive way to get 3DTV.
3ality’s Steve Schklair followed Lude with an on-screen, live demonstration-tutorial on the effects of different 3D rig settings: height, rotation, lens interaxial, convergence, etc. He was followed by directors, stereographers, and trainers of 3D-convergence operators, among others.
Although 3D would seem to require more equipment (two cameras and lenses plus a stereo rig at each location) and more personnel (a convergence operator per camera in addition to a stereographer), there is seemingly one saving grace. According to Schklair and others, 3D can get away with fewer cameras and less cutting than 2D.
The same thing was said of HD, however, in its early days. Sure enough, when I worked on one show in 1989, we used just four HD cameras feeding the HD truck and twice as many non-HD cameras feeding the non-HD truck. In the early days, it was common practice to do separate HD and SD productions. Today, of course, one HD production feeds all, and it typically uses as many cameras and as rapid cutting as an SD show.
Atop a tower of Fujinon’s NAB booth, Pace showed something that recognizes the current economics of 3D. With virtually no 3DTV audience, it’s hard to justify separate 3D productions, but, with such major players as ESPN, DirecTV, Discovery, and Sky involved in 3D, the elephant cannot be ignored, either. So the Pace Shadow system places a 3D rig atop the long lens of a typical 2D sports camera. Furthermore, it interconnects the controls (in a variety of selectable ways) so that the operator of the 2D camera need not be concerned about shooting 3D: one camera position, one operator, different 2D and 3D outputs.
Screen Subtitling came up with similarly clever solutions to the problem of 3D graphics. Unless text is closer to the viewer (in 3D depth) than the portion of the image that it is obscuring, it can be uncomfortable to read.
Traditionally, subtitles are at the bottom of a screen, where 3D objects are closest to the viewer. Raise the graphics to the top, and they might work in the screen plane.
Then there’s the issue of putting the graphics on the screen. With left- and right-eye views, it might seem that two keying systems are required. But with much 3D being distributed in a side-by-side format, a single keyer can place 3D graphics directly into the side-by-side feed.
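The single-keyer trick works because the two eye views already share one frame: keying the same graphic into each half, with a horizontal shift between the copies to set apparent depth, is just an ordinary 2D key over the combined image. Here is a minimal sketch of the idea (hypothetical, not any vendor’s implementation; the function name and toy frame sizes are assumptions for illustration):

```python
# Minimal sketch of keying a caption into a side-by-side 3D frame with
# a single compositing pass. The left half of the frame is the left-eye
# view and the right half is the right-eye view; shifting the right-eye
# copy leftward (positive disparity) places the caption in front of the
# screen plane so it reads comfortably over closer scene objects.
import numpy as np

def key_sbs_caption(frame: np.ndarray, caption: np.ndarray,
                    x: int, y: int, disparity: int) -> np.ndarray:
    """Burn `caption` into both halves of a side-by-side frame."""
    out = frame.copy()
    h, w = frame.shape[:2]
    half = w // 2
    ch, cw = caption.shape[:2]
    # Left-eye half: caption at (x, y).
    out[y:y + ch, x:x + cw] = caption
    # Right-eye half: same caption, shifted left by the chosen disparity.
    rx = half + x - disparity
    out[y:y + ch, rx:rx + cw] = caption
    return out

# Toy 8x16 "frame" (two squeezed 8x8 eye views) and a 2x3 white caption.
frame = np.zeros((8, 16), dtype=np.uint8)
caption = np.full((2, 3), 255, dtype=np.uint8)
keyed = key_sbs_caption(frame, caption, x=2, y=5, disparity=1)
```

A real keyer would blend with an alpha channel rather than overwrite pixels, but the structural point is the same: one key operation serves both eyes.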
There was much more 3D at the show, in every field of video technology (and perhaps even audio). In acquisition, for example, aside from integrated cameras, 3D mounts, and even individual cameras designed specifically for 3D (like Sony’s HDC-P1), there were also 3D lens adaptors, precision-matched lenses, precision lens controls, and even relay optics intended to allow wider cameras to be placed closer together, as in this picture shot by Eric Cheng of WetPixel.com: http://wetpixel.com/i.php/full/2010-nab-show-report-las-vegas/
At the other end of the 3D chain, there were both plasma and LCD autostereoscopic (no-glasses) displays using both lenticular and parallax-barrier technology, small OLED displays with active-shutter glasses and giant LED screens with passive circularly polarized glasses. There were LCD and plasma screens (up to 152-inch at Panasonic) and DLP rear-projectors using active-shutter glasses, and both LCD and laser projection using passive polarized glasses.
There were dual-panel displays with beam splitters, and displays intended to be viewed through long strips of fixed polarized materials (to accommodate all viewers’ heights). There were many anaglyph displays in the three-different primary-and-complement color combinations. There were 3D viewfinders using glasses and others with displays for each eye.
Japan’s Burton showed a laser-plasma display that creates 3D images in mid-air. Normally, they’re viewed through laser-protection goggles, as in the image at the top right of this post. But as a safety measure, they were shown instead inside an amber tube at NAB.
In storage, it seems that everyone who had anything that could record images had a version that could do so in 3D. Even Convergent Design’s tiny Nano was available in a 3D version. The Abekas Mira is an eight-channel digital production server — or it’s a four-channel 3D digital production server. Want an uncompressed 3D field recorder? Keisoku Giken’s UDR-D100 was just one such product at the show.
In processing, just about every form of editing and processing had a 3D version. Monogram showed a touch-screen 3D “truck-in-a-box” production system. Belgium’s Imec research lab even showed licensable technology for stereoscopic virtual cameras.
There was a range of equipment and services for converting 2D to 3D either in real time or not, automatically and with human assistance. And there was a large range of processing equipment designed to fix 3D problems, such as camera rotation and height variation.
Sony’s MPE200 is one such device, with a U.S. list price of $38,000. The MPES3D01/01 software to run it, however, is another $22,500. With the least-expensive 3D camera at the show (Minoru 3D) retailing for under $60 at amazon.com, it might be said that 3D is cheap, but good 3D costs.
There was 3D test equipment from many manufacturers. There was high-speed 3D (Antelope/Vision Research). There was 3D coax (Belden 1694D, complete with anaglyph color coding). Ryerson University is doing eye-tracking research on what viewers look at in 3D and whether it’s different from HD and 4K.
So why was I wondering what year it was? NAB shows have featured many technologies that never went anywhere. We still await voice-recognition production switchers, for example, and voice-recognition captioning. But those have generally been shown by only one company or a small number of exhibitors.
Digital video effects were among the fastest technologies to penetrate the industry. First shown at NAB in 1973, they were commonly seen in homes by the end of the decade.
Then there was HDTV. Its penetration after NAB introduction took much longer, even if dated only from 1989, when an entire exhibition hall was devoted to the subject (there were many earlier NAB displays). Estimates vary, but U.S. household penetration of HDTV 21 years later seems to be in the vicinity of half.
At least HDTV did eventually penetrate U.S. households. Visitors to NAB conventions in the early 1980s could see aisle after aisle of exhibits claiming compatibility with one or both competing standards for teletext. One standard was being broadcast on CBS and NBC; the other on TBS. There were professional and consumer equipment manufacturers and services offering support. Based on the quantity and diversity of promotion at NAB, it was hard to imagine that teletext would not take off in the U.S.
So, will 3DTV emulate digital effects, HDTV, U.S. teletext, or none of the above? Time will tell.