How Racelogic’s RF Beacons Took AR to New Heights at Super Bowl LVII
Aerial camera combined AirPixel tracking data with lens and focus data into single feed
A company in the sports-video–production industry with a name like Racelogic seems destined to have made its mark in the world of motorsports. So, when the company made news at this year’s Super Bowl with Fox Sports (as well as at last year’s Final Four with what was then Turner Sports), the first mystery to solve was its connection to racing.
More on that in a bit, but first a quick refresher on the company’s role at the Super Bowl and the Final Four. With AR graphics increasingly used in sports productions, one limitation is that, because a traditional tracking system relies on optical tracking, the camera must always have the field (or court/floor) within the frame so that the system can compute where the camera is physically located. As a result, AR elements can be inserted only into shots from cameras pointing downward.
That is where Racelogic and its AirPixel system fit in. The company places a receiver on the camera system and installs RF beacons around the shooting environment. The RF beacons keep track of where the camera is relative to the field; the camera no longer needs to keep the field/court/floor within the shot. And insertion of AR graphics in camera shots looking straight ahead or even at an upward angle is now possible.
For Racelogic, getting to this point has been a 30-year journey that began in 1992 when Managing Director Julian Thomas created high-tech electronic gearbox-control systems for race cars in Buckingham, UK (where the company is still based). The technology provided a bit too much of an edge in the racing world, and Thomas and Racelogic had to pivot. He saw an opportunity in the need for automotive designers to get better test data.
“The automotive companies,” he explains, “would strap a fifth, drag wheel behind the car to measure speed and braking distance. When GPS became accurate and usable, I was one of the first to strap a GPS onto a car during testing. It was highly accurate and could measure everything.”
When autonomous-car development began, another opportunity came up as development moved from outside to indoors. Indoor testing allowed the testers to create artificial fog and other weather conditions without having to wait for Mother Nature. Being indoors, however, meant that GPS could no longer be used.
“We designed and developed an indoor positioning system,” says Thomas, “and now that is used on their indoor testing facilities. When COVID hit, the automotive industry went off a cliff, and then Ferry Bult[, business development manager, AirPixel product line, Racelogic] and I had a chat about using it in the film and TV industry. We took our beacon technology and made it more accurate so we could use it to match up augmented graphics with the position on the pitch: the AirPixel system.”
The RF-beacon technology works very similarly to GPS: each beacon is the equivalent of a GPS satellite, and the beacons communicate with the receiver over ultra-wideband. Based on its communication with the beacons, the receiver calculates the location of the camera with great precision.
“We communicate with only one beacon at a time,” says Thomas. “It’s like a polling system but thousands of times per second; we can work out within 1 or 2 cm where the receiver is. And we have a very accurate inertial measurement unit built into the receiver that measures the pan, tilt, and roll of the camera.”
At Super Bowl LVII, the receiver was mounted beneath the Sony camera in the Skycam system flying over the field. The camera combined AirPixel’s tracking data with lens and focus data and sent a single data feed directly to the Unreal Engine–based Pixotope platform, which rendered elements created by Silver Spoon.
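The “single data feed” idea — position, orientation, and lens state bundled together for every frame — can be pictured with a small packet sketch. The field names and binary layout below are hypothetical assumptions for illustration, not the actual AirPixel or Pixotope wire format (real camera-tracking feeds often follow protocols such as FreeD):

```python
# Hypothetical per-frame tracking packet: position + orientation + lens data
# in one structure. Field names and layout are illustrative, not the actual
# AirPixel or Pixotope wire protocol.
import struct
from dataclasses import dataclass

@dataclass
class TrackingPacket:
    x: float          # camera position, metres
    y: float
    z: float
    pan: float        # orientation from the IMU, degrees
    tilt: float
    roll: float
    focal_mm: float   # lens focal length
    focus_m: float    # focus distance

    FMT = "<8f"  # little-endian, eight 32-bit floats

    def pack(self) -> bytes:
        return struct.pack(self.FMT, self.x, self.y, self.z,
                           self.pan, self.tilt, self.roll,
                           self.focal_mm, self.focus_m)

    @classmethod
    def unpack(cls, data: bytes) -> "TrackingPacket":
        return cls(*struct.unpack(cls.FMT, data))
```

Bundling tracking and lens data into one timestamped packet is what lets a renderer like Pixotope match the virtual camera’s perspective, depth of field, and focus to the real lens frame by frame.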
The technology has also found believers in Hollywood, which is increasingly shooting in massive virtual studios with towering bluescreens. Disney’s 2022 Pinocchio, for example, was shot with bluescreens more than 60 ft. high. In film production, the AirPixel system lets the director and the production team roll an AR element into a blue- or green-screen studio to see how it will look and make changes.
“They had this big blue room,” notes Thomas. “There was no reference for anything, and the director wanted to be able to see where elements like the sea or fairground were in a shot. That’s where we cut our teeth as well as allowing him to see in real time the graphics in the background.”
The first live broadcast that Racelogic was involved with was the 2022 Final Four.
“Lee Estroff[, director, technical operations, WBD Sports,] took the risk,” says Bult. “That was a hard one for us, but we learned a lot, and it went very well. And then we did a couple of tests with Fox Sports.”
The AirPixel system can also communicate with up to five cameras at once (that number will soon be increased to seven), enabling production of multiple camera angles of the same AR object.
“The main goal is combining camera tracking with AR, not tracking cars,” says Bult. “We could do other things, but we’re looking at camera tracking and advanced AR for things like film visual effects and now sports.”
Working in broadcast, especially live sports, is more challenging than working in film because mistakes cannot be fixed in post. Today, the company has offices not only in Buckingham but also in Detroit, Germany, and France, with more in the works.
“Live sports and production are more demanding,” says Bult, “as you aren’t on a film set, where you can control the environment. It’s a lot harder, but the potential in the broadcast market is a lot bigger.”