SAMS Forum: With Growth of Centralized Replay, Leagues Look To Leverage Content Explosion
In an effort to get the officiating call right on every play, all four major leagues have built massive infrastructures to ingest as many angles as possible from every venue. Although this has benefited viewers and improved officiating accuracy, it has also created a massive amount of content for leagues and broadcasters to manage and store. The issue — discussed at SVG’s Sports Asset Management & Storage Forum last month — becomes how leagues, broadcasters, and teams can retain important iso feeds and multiple angles of key moments without stretching their facilities too thin.
NHL Replay Expansion Added Pressure on Storage
The NHL was the first of the four major U.S. sports leagues to adopt a centralized replay center: its “Situation Room” in Toronto brought back 14 feeds from every venue for every game last season. The feeds comprise two program feeds and 12 camera isos, including GoPro HeroCast cross-bar cameras embedded in each net and blue-line cameras added at every venue prior to the start of the Stanley Cup Playoffs for coaches’ challenges of offside calls on goal-scoring plays. The NHL is now looking to upgrade the blue-line cameras to 1080p for next season to enable crop-and-zoom capabilities. The bulk of these feeds are ingested into Toronto’s Sony Hawk-Eye official-review system and stored until the next game and then disposed of. However, the league is looking to retain this content long term.
“There has been a lot of debate as to why am I saving goal-post cameras that are looking at a goal line and a goalie’s skate 90% of the time,” said NHL SVP of Technology Grant Nodine. “I think we’ve made the conclusion that we are going to save all that stuff in the archive. Tape is cheap, so it makes sense just in case. But it has been a challenge to get our heads around doing that.”
NBA Replay Center Continues To Evolve
The NBA’s two-year-old Replay Center in Secaucus, NJ, takes in nine cameras from each of the league’s 29 venues via a dual 10-Gbps network (HD-quality video at 250 Mbps per stream). All the video is time-coded at each venue and recorded into a nearline disk array for officiating needs. The high- and low-resolution video is moved to separate nearline storage for immediate retrieval should the NBA need to restore a specific piece of content. It is then moved into long-term storage, where it is available for lo-res–proxy viewing via a web-based application.
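As a rough sense of scale, the ingest bandwidth implied by those figures can be sketched as follows. The camera count and per-stream bitrate come from the article; the concurrent-game count is a hypothetical busy-night assumption for illustration only.

```python
# Back-of-the-envelope ingest bandwidth for a centralized replay facility,
# using the figures above: nine cameras per venue at 250 Mbps each.
CAMERAS_PER_VENUE = 9
MBPS_PER_STREAM = 250      # HD-quality video, per the article
CONCURRENT_GAMES = 12      # hypothetical busy-night maximum (assumption)

per_venue_gbps = CAMERAS_PER_VENUE * MBPS_PER_STREAM / 1000
total_gbps = per_venue_gbps * CONCURRENT_GAMES

print(f"Per venue: {per_venue_gbps:.2f} Gbps")      # 2.25 Gbps
print(f"League-wide peak: {total_gbps:.1f} Gbps")   # 27.0 Gbps
```

Even this toy math shows why a dual 10-Gbps backbone is not overprovisioned: a single venue’s nine isos alone consume more than a fifth of one link.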
Today, the NBA Replay Center is about much more than just official replay: the league is looking to further distribute content. Prior to last season, the NBA built another production room, where an operator monitors every home and away game and provides custom in-arena feeds for international broadcast partners to use during commercial breaks. The league is providing this feed to 215 broadcast partners worldwide as well as to digital outlets.
“It’s quite incredible what we’re doing from a distribution standpoint,” said Chris Halton, VP, media and distribution technology, NBA. “We have all the feeds, so we thought, what can we do with this to make a new product? Instead of just taking content in, we are feeding it back out to the network. We are pushing [feeds] to the middle of our core network, where we have two different peering locations. That allows partners to consume those IP feeds right in the middle of the network. That is making our ability to distribute our content internationally a lot easier.”
You Can’t Find Content Without Metadata
Metadata is as important as your content. That was the overwhelming message throughout this year’s SAMS Forum. As a result, sports-media organizations increasingly emphasize accurate and detailed logging.
“Across the board, we are seeing a revolution in logging,” said Nima Malekmanesh, product marketing manager/senior engineer, DreamCatcher, Evertz. “[In the past], it was always your replay operators doing the logging, and logging wasn’t really important. I think everyone is realizing that logging and metadata are as important as your content: if you have your content and you can’t access it and there is no way of distributing it out properly to your users, that content is useless. Metadata is the unifying layer across all your different archives — immediate, mid-tier, and deep — and the only element that talks across all these layers is the metadata.”
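The idea of metadata as the one layer that “talks across” immediate, mid-tier, and deep archives can be sketched with a toy record: the same entry points at every copy of a clip, so retrieval never has to know the storage topology. All field names, paths, and the helper function here are hypothetical illustrations, not any vendor’s schema.

```python
# Toy metadata record spanning archive tiers. One record, many locations.
clip = {
    "clip_id": "gm1042-cam3-q3",
    "tags": ["goal", "coach-challenge"],
    "tiers": {                      # fastest tier first
        "immediate": "nearline://array1/gm1042/cam3.mxf",
        "mid": "lto://vault7/gm1042/cam3.mxf",
        "deep": None,               # not yet written to deep archive
    },
}

def best_copy(record):
    """Return the fastest tier that actually holds a copy of the clip."""
    for tier in ("immediate", "mid", "deep"):
        path = record["tiers"].get(tier)
        if path:
            return tier, path
    return None

print(best_copy(clip))  # ('immediate', 'nearline://array1/gm1042/cam3.mxf')
```

The design point matches Malekmanesh’s argument: when content migrates between tiers, only the record’s location fields change; every consumer of the metadata keeps working unmodified.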
The NBA’s logging process is meticulous to say the least: Halton joked that “in any given clip, we know if the mascot is on or off the court and if it was a blow-up costume or furry costume.” However, this offseason, the NBA is rethinking its logging workflows in order to make the most pertinent content available immediately to serve today’s up-to-the-second social-mediasphere. For example, Halton said the league plans to monitor live Twitter analytics during each evening’s games to develop a heat map of trending topics and games to more efficiently produce and clip high-value content for distribution to social media and mobile apps.
“We can’t wait for something to be completely logged [before] our system can do something with it. We get closed captioning, we know commercial breaks, we have statistics. The entire game is logged, so let’s take that data and utilize it [automatically],” said Halton. “We want it fast, first, and accurate upfront to get it out to those social tiers and mobile. Then we can peel back the onion [for] the longer term’s deeper level of logging but not doing everything before everyone can take advantage of it. That’s what we’re working on this summer.”
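The prioritization Halton describes — using live social analytics to decide which games to clip first — can be sketched as a simple frequency ranking. The data, tags, and function name below are hypothetical stand-ins for whatever Twitter-analytics feed the league actually consumes.

```python
# Toy stand-in for a social "heat map": count mentions per game tag and
# clip the hottest games first. All game tags here are made up.
from collections import Counter

def game_heat(mentions):
    """Rank games by mention count in a stream of trending-topic tags."""
    return Counter(mentions)

# Hypothetical stream of game tags pulled from trending-topic mentions.
mentions = ["GSWvsCLE", "GSWvsCLE", "LALvsBOS", "GSWvsCLE", "LALvsBOS"]
heat = game_heat(mentions)

# Produce clips for the hottest games first.
priority = [game for game, _ in heat.most_common()]
print(priority)  # ['GSWvsCLE', 'LALvsBOS']
```

A real pipeline would weight this with the closed-captioning, commercial-break, and statistics data Halton mentions, but the core idea is the same: rank first, log deeply later.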
Nodine also noted that, while organizations’ logging procedures are becoming more sophisticated, the various levels of metadata created by different departments further complicate matters.
“You also have a lot of people within the league that are using the video for different purposes and are logging it in a way that we are not necessarily capturing right now [but will be using] going forward,” said Nodine. “That’s the kind of logging that has a limited audience because, essentially, you’re creating a private log for a small group of people: certain metadata is available only to certain users within your organization. That is where the explosion in metadata starts to happen and where the synchronization becomes a big deal.”
Is Cloud an Option?
As more isos and alternative angles are stored, more onsite storage will be needed to house this explosion of content. As a result, deciding when to use cloud-based storage becomes key.
“I think cloud is brilliant for disaster recovery of proxies and things like managing survivability of your metadata,” said Nodine. “But putting your entire archive there is a bit silly right now. It makes sense if you have two years’ worth of content but no sense if you have 75. That is just where it is right now. Also, HSM tools like [Oracle’s] DIVA are very mature pieces of software, and anything they are doing in the cloud … is very raw right now. I think, until that shapes itself up more, no one is comfortable with that.”
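The economics behind Nodine’s proxies-yes, archive-no position come down to bitrate. A minimal sketch, using assumed bitrates (not league figures), shows how far apart the footprints sit:

```python
# Why cloud proxies are tractable but a full-res archive is not:
# footprint scales with bitrate. Both bitrates below are assumptions.
def tb_per_1000_hours(mbps):
    """Storage for 1,000 hours of video at a given bitrate, in TB."""
    bytes_per_hour = mbps / 8 * 3600 * 1_000_000   # Mbps -> bytes/hour
    return bytes_per_hour * 1000 / 1e12            # 1,000 hours, in TB

full_res = tb_per_1000_hours(100)   # mezzanine-quality archive (assumed)
proxy = tb_per_1000_hours(2)        # lo-res browse proxy (assumed)

print(f"Full-res: {full_res:.1f} TB, proxy: {proxy:.1f} TB")
```

At a 50:1 bitrate ratio, decades of proxies fit where only a season or two of full-resolution masters would, which is why disaster recovery of proxies and metadata is the cloud use case both executives endorse.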
Halton added, “It depends on the use case. If you are talking about being the official recordkeeper and archivist of a major sports and entertainment company, you have to think about owning the quality of the archive, storing that in the highest bitrate you possibly can. Cloud storage at that level gets costly and time-consuming – especially when you’re trying to turn things around at that high a level. I personally see [cloud] as something that might evolve over time but right now is something more for finished product for end-user consumption on mobile devices, desktops, and so forth — which we pretty much all do already.”