Sail Grand Prix Season-Ending Race in San Francisco Points to Power of Data, IP
All production is handled by a broadcast facility in the UK
As Sail Grand Prix concludes its 2021-22 season this weekend in San Francisco, it continues to prove just how quickly new remote-production workflows can evolve. Onsite presence is minimal, with camera signals, audio signals, and heaps of data sent from the boats to the shore via RF and then via SMPTE ST 2110 transport halfway around the world to a production team at Timeline’s Ealing Broadcast Centre in the UK.
“It seems a long time ago that we started this season in Bermuda, and the world’s a little bit different than it was a year ago,” says Warren Jones, chief technology officer, SailGP. “It was tough for us to move around back then and do what we needed to do. And our remote production was essential for us.”
Jones says the production team was lucky because there were no legacy systems, workflows, or contracts to deal with. Eight national teams compete on identical F50 foiling catamarans capable of reaching speeds of up to 60 mph. Both the boats and the athletes who sail them carry sensors that collect and stream reams of data during every competition.
“We had a blank sheet of paper, so it was how do we want to do it,” he recalls. “Do we want to do what we did two years ago for the America’s Cup? Or do we want to look forward and have something that is future-proof for the next 10 or 15 years?”
The result is a workflow in which most of the people, including umpires, are in London at a control room at Timeline’s facility. SMPTE ST 2110 is a core technical piece, enabling the team to transport signals to Timeline from anywhere in the world and also have a return path for monitoring the output from the race site.
“We send the finished world feed back to the race site, where it is shown in the hospitality areas, our media center, and then the big screens,” says Jones. “Nothing is produced onsite.”
Except for cameras, microphones, and Riedel Bolero intercoms, all the equipment is located in London, including the augmented-reality graphics engines, which play a big part in visualizing all the data. And a great deal of data is captured, via pressure sensors, gyroscopes, GPS, and other technologies.
“We have 30,000 data points per boat, and, at the end of the day, we’ll have something like 40 billion data points,” says Jones of a data throughput that exceeds 15,000 messages every 500 ms. “We’ll know everything, including the stress on the foils, the hulls, pressure on the trampoline. We’re lucky to have a partner like Oracle and the ways that they handle data. When you see the database and how many lines of code is in there, it’s amazing.”
All the data goes via RF from the boats to the shore and via fiber into Oracle servers in London. That is the basis for nearly everything that helps visually tell the story of the race via graphics and other elements. And all of the data is transmitted within 180 ms.
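The figures Jones cites can be sanity-checked with simple arithmetic. The sketch below is purely illustrative, using only the numbers quoted above (30,000 data points per boat, more than 15,000 messages every 500 ms, a 180 ms transit budget); the variable names are invented for the example.

```python
# Back-of-the-envelope check of the data rates quoted in the article.
# All figures come from SailGP's stated numbers; the arithmetic is
# illustrative only.

BOATS = 8
POINTS_PER_BOAT = 30_000          # distinct sensor data points per F50
MESSAGES_PER_WINDOW = 15_000      # messages in each 500 ms window
WINDOW_S = 0.5
MAX_LATENCY_S = 0.180             # boat-to-London transit budget

messages_per_second = MESSAGES_PER_WINDOW / WINDOW_S   # fleet message rate
total_channels = BOATS * POINTS_PER_BOAT               # fleet-wide data points

print(f"{messages_per_second:,.0f} messages/s across the fleet")
print(f"{total_channels:,} sensor data points fleet-wide")
print(f"each message must arrive within {MAX_LATENCY_S * 1000:.0f} ms")
```

At that sustained rate, roughly 30,000 messages per second, the quoted total of tens of billions of data points over a race day is plausible once each message carries multiple sensor readings.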
“There is a program called Oracle Stream Analytics where we have predetermined patterns to define about a thousand different metrics that we use,” Jones explains. “We use a Kafka Bridge to get the metrics into our augmented-reality graphics, our 2D graphics, our sales app, and any third party or anywhere else we want to display that information.”
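The pattern Jones describes is a publish-subscribe fan-out: one stream of computed metrics feeds the AR graphics, the 2D graphics, the app, and third parties simultaneously. SailGP's actual pipeline runs on Oracle Stream Analytics and a Kafka bridge, which cannot be reproduced here; the sketch below mimics only the fan-out idea in plain Python, with no broker, and every class, topic, and field name is hypothetical.

```python
import json
from collections import defaultdict

class MetricBridge:
    """Toy stand-in for the Kafka-style bridge described in the article:
    metrics arrive on named topics and fan out to every subscriber
    (AR graphics, 2D graphics, the app, third parties)."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Register a consumer callback for a topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Serialize once, then deliver a fresh copy to each subscriber,
        # as a broker would.
        message = json.dumps(payload)
        for handler in self._subscribers[topic]:
            handler(json.loads(message))

# Hypothetical consumers standing in for the real graphics engines and app.
ar_frames, app_updates = [], []

bridge = MetricBridge()
bridge.subscribe("boat.speed", ar_frames.append)
bridge.subscribe("boat.speed", app_updates.append)

# A metric of the kind the analytics layer might emit (names invented).
bridge.publish("boat.speed", {"boat": "AUS", "speed_kts": 52.1})

print(len(ar_frames), len(app_updates))  # both consumers received the metric
```

The design point is that producers never know who consumes a metric, which is what lets the same stream drive broadcast graphics, the app, and any third-party display without changes upstream.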
Joining the data on the journey to Timeline are video signals from two cameras on each boat, a camera in a helicopter, cameras on three chase boats, and two ENG cameras.
“All those cameras go via our network to London, where the show is produced,” says Jones. “Then we distribute them from London to our partners at Sky, CBS, Canal Plus, and Fox Sports in Australia.”