Live From Super Bowl LVII: Fox Sports Set To Take Skycam AR to New Heights
Racelogic, Sony help move AR for aerial camera beyond optical tracking
Anyone looking for augmented reality to take another leap will want to tune into Super Bowl LVII. After two years of development by Fox Sports, Sony, and Racelogic, a Skycam tracking system unveiled on Sunday qualifies as a genuine game-changer: it moves AR for the aerial camera beyond optical tracking.
“We’ve been trying to get a better tracking system for Skycam,” explains Zac Fields, SVP, graphic technology and integration, Fox Sports. “[Skycam] currently needs to be looking down at the field so it can optically track [its location]. That limits the speed of the movements but also the flexibility because you can’t have the camera look up and lose the ground.”
The system that will debut at the Super Bowl uses Racelogic RF beacons placed around the field to track the Skycam's position. An RF receiver is mounted beneath the Sony camera on the Skycam, which also carries an Inertial Measurement Unit (IMU). The IMU captures camera pan, tilt, and roll; that data, combined with zoom- and focus-lens data sent directly from the Sony camera, travels as a single feed to the Unreal Engine, which renders the AR elements created by Silver Spoon.
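The merging step described above can be pictured as packing three sensor sources into one pose packet per frame. The sketch below is purely illustrative: the field names, units, and packet layout are assumptions, since the actual Fox Sports/Racelogic/Sony format is not public.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """One frame of tracking data merged into a single feed.

    Illustrative only: field names and units are assumptions,
    not the broadcast system's real packet format.
    """
    x: float       # position from RF-beacon tracking, meters
    y: float
    z: float
    pan: float     # orientation from the IMU, degrees
    tilt: float
    roll: float
    zoom: float    # lens data reported by the camera, mm
    focus: float   # normalized focus position

def merge_feed(rf_position, imu_orientation, lens_data) -> CameraPose:
    """Combine the three sources into the single packet a render
    engine (standing in for Unreal here) would consume."""
    x, y, z = rf_position
    pan, tilt, roll = imu_orientation
    zoom, focus = lens_data
    return CameraPose(x, y, z, pan, tilt, roll, zoom, focus)

# Example frame: camera high above midfield, looking down at the turf.
pose = merge_feed((60.0, 26.5, 40.0), (180.0, -35.0, 0.0), (24.0, 0.8))
print(pose.pan, pose.zoom)  # 180.0 24.0
```

Because position comes from RF rather than from the camera seeing the ground, the packet stays valid even when the Skycam tilts up, which is exactly the flexibility Fields describes.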
“We’ll have a big open at the top of the show that will set the mood,” says Fields.
Racelogic’s system was originally built to track race cars to within a centimeter. The two years of development went into adapting the technology to track a camera instead and ensuring it could work in a space as large as a football stadium.
“Then we had to figure out how to merge that with zoom and focus and pan and tilt,” says Fields. “We did two tests at Allegiant Stadium as they were nice enough to accommodate us so we could work through some things.”
Augmented reality will also be a big part of the Super Bowl LVII pregame show, including a first: AR from a techno-crane that can extend 75 ft., along with AR capabilities on a 9-ft. techno-crane on the pregame stage and a 26-ft. techno-crane on the demo field.
Explains Fields, “Those three are calibrated with Stype. We will also have a fourth camera, the Flycam. That one will use [a] TrackMen optical-tracking–based solution, and that data gets fed to AR through Silver Spoon, which can do the rendering in the Unreal Engine. It will be nice because the elements we’ve built for outside will be a little more cohesive.”
One cool application to look for during pregame will be tactical analysis on the demo field. “There will be virtual Xs and Os showing play diagrams for the talent,” says Fields. “We built our own application for that.”
He adds that one of the challenges coming into the game was to figure out how to get more bumper and sales elements in and move away from putting logos in monitors or lower-third graphics. “It’s a nice added production value to put it in AR, and sometimes it can more seamlessly integrate into things. The talent has gotten used to it, so you’ll see a ton of it throughout the show.”
Another graphics enhancement will be courtesy of SMT and its 3D Top Fonts and Telestration technology.
“We’re trying to get quicker turnaround for replays and packages of enhanced elements,” says Fields. “SMT is taking the Next Gen Stats data in real time, and they’ve created a tool that [allows the announcers to] select players and see a preview trail for a replay of where [the players are] going to run. It enhances the replays with data to help the viewer know what they’re looking at.”
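The preview-trail idea Fields describes amounts to drawing a player's upcoming path ahead of the live replay frame, from the timestamped per-player positions Next Gen Stats provides. The sketch below is a hypothetical illustration of that concept, not SMT's actual tool: the `(time, x, y)` sample format and yard units are assumptions.

```python
# Hypothetical sketch of a replay "preview trail": given timestamped
# (t_seconds, x_yards, y_yards) player-position samples, return the
# path up to a chosen moment so it can be drawn ahead of the replay.
def preview_trail(samples, end_time):
    """Return the (x, y) points at or before end_time, forming the
    trail that shows viewers where the player is going to run."""
    return [(x, y) for t, x, y in samples if t <= end_time]

# A route sampled at 0.5-second intervals (illustrative data).
route = [(0.0, 10, 25), (0.5, 14, 25), (1.0, 18, 27), (1.5, 21, 31)]
print(preview_trail(route, 1.0))  # [(10, 25), (14, 25), (18, 27)]
```

Announcers selecting a player would, in this picture, simply choose whose sample stream feeds the trail; the enhancement is the data overlay, not any change to the replay video itself.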