SMPTE Attendees Get Inside Look at 3D Olympics Challenge
SMPTE attendees at the 2012 Technical Conference and Exhibition this week in Los Angeles were provided an overview of the technical challenges overcome during the 3D coverage of the London Olympic Games. The herculean effort brought together various OB and technology providers and stands as the high-water mark for the amount of 3D content created at a single sports event.
Jim Defilippis, 3D Olympics technical director, who played a key role in the 3D production of the Games, provided an overview of a production that pushed the boundaries of 3D technology with the help of three production partners: Telegenic, Alfacam, and Euro Media Group. It pushed those boundaries despite a late start.
“We had eight months to sort out the 3D project,” he said. “The concept was to create a channel that had up to 18 hours a day of 3D programming and an hour daily of wrap-up.”
The main hub of the 3D operations was a stacked cabin just outside the International Broadcast Center. There, content was brought in from the venues and cut in four edit suites based on the Avid Media Composer 6 editing system. Full transmission, playout, archive, and quality-control systems were also located in the cabin.
During the Games, a wide variety of events were covered in 3D, including the Opening and Closing Ceremonies, gymnastics, basketball, track and field, aquatics, and white-water events like slalom canoe and kayaking. The Telegenic OB units were located at the Olympic Stadium for the Ceremonies and track and field and at Lee Valley, where the white-water events were held. Alfacam was at North Greenwich Arena, where the gymnastics events were held, and Euro Media Group was at aquatics.
A total of 12 3D camera systems were used for the events, including four side-by-side 3D rigs with 22×7.6 lenses, five handheld Panasonic AG-3DP1 units, and three mirror-based rigs. Specialty cameras included a technocrane, a polecam, and an underwater camera at aquatics. 2D images were also converted to 3D via a system that shifted them horizontally to create left- and right-eye views.
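The principle behind that kind of 2D-to-3D conversion can be sketched in a few lines: shifting a single frame in opposite horizontal directions yields a crude left/right pair with uniform parallax. This is an illustrative sketch only, not the actual converter used in London; the function name and shift parameter are hypothetical.

```python
import numpy as np

def fake_stereo_pair(frame: np.ndarray, shift_px: int = 8):
    """Synthesize a left/right-eye pair from one 2D frame by shifting
    it horizontally in opposite directions (uniform parallax).
    frame: H x W x C image array; shift_px: half the total disparity.
    (Hypothetical helper for illustration, not the London system.)"""
    left = np.roll(frame, shift_px, axis=1)    # shift right -> left-eye view
    right = np.roll(frame, -shift_px, axis=1)  # shift left  -> right-eye view
    # Blank the wrapped-around columns so no content leaks across edges.
    left[:, :shift_px] = 0
    right[:, -shift_px:] = 0
    return left, right

# Usage: a tiny dummy frame standing in for a video field
frame = np.arange(6 * 8 * 3, dtype=np.uint8).reshape(6, 8, 3)
l, r = fake_stereo_pair(frame, shift_px=2)
```

Because the shift is uniform, every object lands at the same apparent depth; real converters vary the shift per region, which is why such systems remained a supplement rather than a replacement for native 3D rigs.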
The late start caused some issues in terms of camera placement because the 2D camera plan was already fairly well set.
“In some cases, the 2D cameras would get into the images,” Defilippis noted, adding, “The big problem with a live 3D event is that the referees or a coach can move into the field of play and you would have no other option but to cut to a different camera angle.”
Although 3D production processes have made many gains in the past two years, there are still issues to be grappled with. Zoom differences between the left and right cameras, nonsynchronous images, focus mismatches, CMOS rolling shutter, and vertical disparity between the left and right cameras all challenged the quality-control teams.
“Lens flare was also a big issue,” added Defilippis of a problem that may never be solved, because the left and right lenses flare differently given their different positions relative to the light source.
The biggest challenge was avoiding left-eye/right-eye mix-ups: heavy reliance on a postproduction process provided ample opportunities to swap the two 3D signals.
But steps were taken to head off problems before they occurred. The P2 cards, for example, were color-coded and labeled before being placed in the camera, and camera operators were told to record 10 seconds of color bars and then check the bars before recording.
“The reason we would do that is, if they could not play back the color bars, it was because the cards were reversed,” Defilippis explained.
The Panasonic AG-3DP1 camcorders played an important part in the production, allowing for athlete interviews and quick deployment to provide additional coverage angles.
The camera offers three modes of operation: a near mode for objects 1.1 to 3.4 meters away with 7X zoom, a mode covering 1.7 meters to infinity with 7X zoom, and a mode covering 1.7 meters to infinity with 17X zoom.
“The camcorder also has a 3D assist mode with things like parallax alert. But there is no 3D display, so there is no way to look at 3D material in the field,” says Defilippis. “But one advance is that it can automatically correct for vertical disparity of the two lenses. It worked very well when the cameramen followed the proper procedure.”
The 3D editing process proved particularly challenging for the team. Three different media wrappers, for example, were used for 3D-content files: P2 material wrapped in MXF OP ATOM at 100 Mbps, content on EVS servers in MXF OP at 100 Mbps, and content on the Avid Media Composer 6 editing systems wrapped as AVC-I at 100 Mbps.
“It was a manual process to convert the file wrappers, and AVC-I is very processor-intensive and requires long transfer and render times [of three times real time],” says Defilippis. “And the transfer status from EVS was only available on the IPDirector, and the transfer status from the Avid systems was only available on the Media Composer.”
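Those figures make the workflow cost easy to estimate: at a constant 100 Mbps essence rate, a clip's storage size scales linearly with duration, and the quoted 3x-real-time AVC-I render factor does the same for render time. A back-of-envelope sketch (the function and its defaults are illustrative, not an actual production tool):

```python
def clip_budget(duration_min: float, bitrate_mbps: float = 100.0,
                render_factor: float = 3.0):
    """Rough size and render time for a clip, using the 100 Mbps
    essence rate and the roughly 3x-real-time AVC-I render figure
    quoted above. (Hypothetical helper for illustration.)"""
    # bits = rate (bit/s) * seconds; divide by 8 for bytes, 1e9 for GB
    size_gb = bitrate_mbps * 1e6 * duration_min * 60 / 8 / 1e9
    render_min = duration_min * render_factor
    return size_gb, render_min

# Usage: a 10-minute highlights clip
size_gb, render_min = clip_budget(10)  # → (7.5, 30.0)
```

So even a modest 10-minute package meant moving about 7.5 GB and waiting roughly half an hour per render, which is why the manual wrapper conversions added up.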
Quality-control tools on the editing systems also represented a technology gap. A secondary quality-control position staffed by the media managers caught issues with the help of Panasonic 3D LCD monitors, but, even then, errors could be hard to verbalize because 3D-error terminology is not well established among industry professionals.
“Rendering out the left- and right-eye [signals] properly is also tricky, as users can easily get confused and either mix up the left and right signals or render out two left- and two right-eye signals,” adds Defilippis. “Once on the EVS servers, the files had to be grouped and then identified to play out as a stereo pair.”
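The grouping step Defilippis describes is essentially a pairing-and-validation pass: every clip must have exactly one left-eye and one right-eye file before it can play out as a stereo pair. A minimal sketch, assuming a hypothetical naming convention (`<base>_L.mxf` / `<base>_R.mxf` — not the actual Olympics scheme):

```python
import re

# Hypothetical naming convention for illustration:
# clips are named "<event>_<clipid>_L.mxf" / "<event>_<clipid>_R.mxf".
CLIP_RE = re.compile(r"^(?P<base>.+)_(?P<eye>[LR])\.mxf$")

def group_stereo_pairs(filenames):
    """Group clip files into (left, right) pairs and report orphans,
    catching the 'two lefts / two rights' mistake before playout."""
    eyes = {}
    for name in filenames:
        m = CLIP_RE.match(name)
        if not m:
            continue  # ignore non-clip files
        eyes.setdefault(m.group("base"), {})[m.group("eye")] = name
    pairs, orphans = {}, []
    for base, found in eyes.items():
        if "L" in found and "R" in found:
            pairs[base] = (found["L"], found["R"])
        else:
            orphans.extend(found.values())  # missing or duplicated eye
    return pairs, orphans

# Usage: one complete pair, one clip missing its right eye
pairs, orphans = group_stereo_pairs(
    ["gym_001_L.mxf", "gym_001_R.mxf", "gym_002_L.mxf"])
```

A check like this runs before ingest; anything in `orphans` is flagged for the quality-control position rather than silently played out with two identical eyes.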
With the London 3D Olympics behind the team, the question now is whether the 2014 Winter Olympics will provide another opportunity to refine the production process. That decision, it appears, remains in the hands of a consumer marketplace that has yet to warm up fully to the in-home 3D-viewing experience.