FutureSPORT Summit Reaches for Clouds
Applying “cloud-computing” concepts to remote sports production is not new: broadcasters have been pushing and pulling content between remote venues and the broadcast center via fiber connections for some time. But, as the capabilities expand, so do the opportunities for new workflows and new cost savings. Some of those opportunities were discussed during a panel session at the FutureSPORT Summit in New York City last week.
“It’s become more and more established the past two years as trucks and remote production facilities have become directly integrated into the broadcast center,” said Jay Deutsch, senior director, projects and systems architecture, EVS. “The production trucks of the future will need to address these kinds of needs so that they can enable remotely connected workflows. And networks are starting to rely on this mechanism to get the content back during the production instead of a day later.”
Bruce Goldfeder, VP of engineering, CBS Sports, discussed some of those changes, noting that technologies from companies like Brevity and Broadcast Fusion have allowed promotions that run during an NFL game to be delivered via Level 3 circuits providing a 100-Mbps connection to the broadcast center. That removes the costly process of having promos messengered from the CBS Broadcast Center in New York to NFL stadiums, saving tens of thousands of dollars each NFL season.
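The economics of that 100-Mbps circuit are easy to sanity-check with back-of-envelope arithmetic. The sketch below estimates the wire time for a typical promo; the file size, encode bitrate, and link-efficiency figures are illustrative assumptions, not numbers from the panel.

```python
# Back-of-envelope transfer time for a promo file over a 100-Mbps circuit.
# File size and efficiency figures are illustrative assumptions.

def transfer_time_seconds(file_size_gb, link_mbps, efficiency=0.8):
    """Estimate wall-clock transfer time.

    file_size_gb: payload size in gigabytes (decimal units)
    link_mbps:    nominal link speed in megabits per second
    efficiency:   fraction of nominal bandwidth actually achieved
                  (protocol overhead, contention)
    """
    bits = file_size_gb * 8 * 1000**3           # GB -> bits
    effective_bps = link_mbps * 1000**2 * efficiency
    return bits / effective_bps

# A 30-second HD promo encoded at ~50 Mbps comes to roughly 0.19 GB.
promo_gb = 30 * 50 / 8 / 1000    # seconds * Mbps -> megabytes -> GB
print(f"{transfer_time_seconds(promo_gb, 100):.0f} s")  # ~19 s on the wire
```

Under those assumptions the file clears the circuit in well under a minute, versus the hours (and courier fees) of physically shipping media from New York to a stadium.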
“We’ve probably saved 20%-30% of our shipping costs, but we have increased our productivity by leaps and bounds,” said Goldfeder. “And then, during the games, the melts can be sent back to the broadcast center as the game is going on so that we can shut down the trucks an hour after the game.”
The challenge across the board for these new developments is available bandwidth. Some facilities, like professional stadiums and arenas, are fairly well fibered up with plenty of available bits. But, for college venues, golf courses, and other locations, other technologies are required to help “accelerate” the file delivery.
ONE CONNXT, for example, delivers high-quality SD or HD media from point A to point B over the open Internet, handling more than 4,000 events a year from college campuses. CTO Paul Dingwitz explained how waiting on files to be moved can impede workflows.
“Tier-two and -three colleges have struggled with the economics of getting high-quality content out without spending a lot of money,” he said. And then there are the struggles with the IT department on a college campus, a gatekeeper that can bring any sports production to its knees.
“We operate on the public Internet so we don’t have to fight with the IT department,” he said. “So it’s about finding vendors that mitigate those sorts of challenges.”
And then there is the need for simplified workflows.
“You need to turn the system on, walk away, and it works without the need for three engineers,” added Dingwitz. “Our system can snap into existing workflows without a big impact or extra personnel.”
Epoch CEO David Barton pointed out that having too much manual engineering involved also cuts down on efficient use of bandwidth.
“The last thing you need is more bottlenecks, and there are computer and network resources lying around like sand on a beach,” he explained. “So you need to have something or someone that directs all that [data] traffic and makes it all work together.”
So what’s next for cloud-based sports production? Deutsch predicted that one of the big next steps would allow staff at the broadcast center to reach into a truck and pull out a needed video resource as opposed to simply waiting for what the truck sends out.
“Production teams want what they want when they want it, and we are starting to catch up and do that and whet their appetite for products that are going to directly benefit them,” he said.
Deutsch predicted new products that accelerate delivery and reduce latency, which will take some additional coding and technology development.
“Once those tools exist, there will be much less engineering time required,” he said, adding, “but it has to be easily deployed as trucks only have a couple of hours to get up and running.”
Goldfeder said future developments in truck builds that he and the team at CBS have seen are encouraging for those looking for more and more cloud-based production.
The move to 10-Gbps infrastructure, for example, could allow a lo-res proxy layer of content to reside inside the truck and be accessed from elsewhere. An editor located across the country from the truck could assemble five or six shots via those lo-res proxies and then have the high-resolution content sent to them without anyone inside the production truck even knowing the process is taking place.
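The proxy workflow Goldfeder describes boils down to a simple mapping: the remote editor cuts against lightweight proxies, and only the trimmed high-resolution spans are pulled from the truck. The sketch below illustrates that idea; all clip IDs, paths, and function names are hypothetical, not from any vendor's actual API.

```python
# Minimal sketch of a proxy-edit / hi-res conform workflow, assuming a shared
# clip ID links each lo-res proxy to its high-res master. Names are illustrative.

from dataclasses import dataclass

@dataclass
class Cut:
    clip_id: str      # ID shared between proxy and hi-res media
    tc_in: float      # in-point, seconds
    tc_out: float     # out-point, seconds

def build_conform_list(edit, hires_index):
    """Resolve an edit assembled on lo-res proxies into the hi-res
    segments that must be pulled from the truck's storage."""
    pulls = []
    for cut in edit:
        if cut.clip_id not in hires_index:
            raise KeyError(f"no hi-res media registered for {cut.clip_id}")
        pulls.append((hires_index[cut.clip_id], cut.tc_in, cut.tc_out))
    return pulls

# Editor assembles two shots from proxies; only these trimmed spans move.
edit = [Cut("cam3_td", 12.0, 18.5), Cut("cam1_wide", 40.0, 44.0)]
hires_index = {"cam3_td": "truck01/clip/0042", "cam1_wide": "truck01/clip/0017"}
print(build_conform_list(edit, hires_index))
```

The design point is that the high-bandwidth transfer is deferred and scoped to exactly the frames used in the cut, which is why a 10-Gbps backbone inside the truck makes the workflow invisible to the crew on site.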
“That is possible now, but it is convoluted and clunky,” added Deutsch.
But the future, it appears, is bright.
Said Goldfeder, “The new trucks are taking everything we have learned and moving it forward so it will be a snap to get things done.”