Case Study: RSN Group Revamps Media Asset Management Workflow With Primestream
A regional group of four production and distribution centers across the US, set up to provide A-level regional sports coverage, continues to push new technological boundaries, even with its regional sports budget. The four networks reach more than 13 million households across 22 states and own exclusive rights to produce and distribute live events from more than 25 sports teams and conferences.
As part of a move to future-proof its production infrastructure, the organization decided to implement a new media asset management (MAM) workflow and create a high-availability virtual environment with individual clouds for each region; it was essential that it had the flexibility and agility to respond to changing market dynamics. The virtualized environments would also enable the creation of “replication partners” for data in the event of a catastrophe in one of the regions. All of this had to be achieved on a regional sports budget that is tighter than ever.
The four facilities have traditionally had very different production needs and operated autonomously, with legacy hierarchical folder structures in place for many years. Any move to a new infrastructure and workflow had to be managed in a way that made the transition as easy as possible for engineering and production staff.
As the sports broadcaster’s Chief Technology Integrator (CTI) explains, “We wanted to deploy tools into all regions in small digestible chunks that would allow users to adjust their workflow slowly.”
To accomplish this, the organization decided to build a back-end consisting of a Windows infrastructure and IBM hardware to be deployed across all regions, with efforts to replicate each system as closely as possible.
An IBM TS4500 with Spectrum Archive was eventually chosen for the project. In addition, Primestream’s FORK asset management platform was chosen following intensive evaluation of a number of different MAM solutions available on the market.
“This gave us a GPFS file system on top of the LTFS, but we needed a vendor who was willing to make those solutions work with their systems,” adds the CTI. “This proved to be a very tall order, but Primestream told us they were willing to accept these challenges.”
The entire architecture at each of the four facilities sits on a virtual datacenter, supporting everything from production servers and the RFUs to the E2P and the Primestream FORK infrastructure. The facilities running MOS have virtualized their news-maker gateways and news-bridge, with IIS on the news-bridge server hosting the proxy files that appear in the Octopus newsroom system.
Primestream’s FORK system handles tagging, managing, and baseband ingest and playout of clips, plus MOS-driven playlists run through the Octopus NRCS. As the sports broadcaster’s Chief Engineer explains, “FORK also includes action tasks that can save users dramatic amounts of time by simply selecting a repeatable task from a drop-down list, such as ‘send to FTP’.”
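To make the idea of a repeatable action task concrete, the sketch below shows what a generic “send to FTP” step amounts to, using Python’s standard ftplib. This is an illustrative analogue rather than FORK’s actual implementation; the server name, credentials, folder layout, and file names are all hypothetical.

```python
from ftplib import FTP
from pathlib import Path

def remote_path_for(clip: Path, show: str) -> str:
    """Map a local clip to its destination on the FTP server.

    Hypothetical folder layout: /incoming/<show>/<filename>
    """
    return f"/incoming/{show}/{clip.name}"

def send_to_ftp(clip: Path, show: str, host: str, user: str, password: str) -> None:
    """One repeatable 'action task': upload a finished clip to an FTP drop box."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        with clip.open("rb") as f:
            ftp.storbinary(f"STOR {remote_path_for(clip, show)}", f)

# Example call (hypothetical server and paths):
# send_to_ftp(Path("postgame_highlights.mxf"), "postgame", "ftp.example.com", "ops", "secret")
```

The value of packaging a step like this behind a drop-down entry is that operators repeat it consistently without retyping server details or destination paths.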
“Our previous media asset management system was not reliable in getting assets checked in or identifying the assets. Primestream allows us to find stuff as we need it with relative consistency.”
In addition to its deployment of FORK, the organization also chose to add Primestream’s FORK Xchange, a browser-based tool that allows users to access published FORK resources. Xchange was deployed initially in one area, with plans to roll the system out across the rest of the group.
As the CTI explains, the benefits of adding Xchange to the mix are manifold: “It eliminates the thick client requirements of FORK and allows everyone from truck personnel to producers in hotel rooms to browse, log, and download files. It also gives them a file-based upload tool, meaning something like a Starbucks WiFi can be turned into a feeding point on the road, which is a massive advantage for studio-produced pre- and post-game shows because they can get clips as files on the road rather than through a linear baseband feed.”
One of the “home” shows offers a perfect example of how FORK has delivered workflow efficiencies. For baseball broadcasts, the organization has four live feeds: one coming from the OB truck, two slo-mo feeds that are ingested separately, and a fourth feed for the post-game show. For the post-game feed, there is content from outside the clubhouse, including locker room sound at the conclusion of the game.
For content the organization does not want to keep forever, FORK allows staff to monitor it while it is being ingested, setting markers and creating sound clips as they go. Most of the time, the clips are brought in whole to Adobe Premiere, cut up, and exported back into FORK for playback during pre- or post-game shows.
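The marker step during ingest can be pictured as attaching in/out points to the growing recording. Below is a minimal, generic sketch of that bookkeeping, not FORK’s actual data model, with timecodes assumed to be in seconds from the start of ingest and all marker labels hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Marker:
    """An in/out point pair set by an operator while the feed is still ingesting."""
    label: str
    t_in: float   # seconds from start of ingest
    t_out: float

def subclip_durations(markers: list[Marker]) -> dict[str, float]:
    """Duration of each marked region, e.g. a locker-room sound bite."""
    return {m.label: round(m.t_out - m.t_in, 3) for m in markers}

# Hypothetical markers set during a post-game feed:
markers = [
    Marker("walk-off interview", 10500.0, 10620.0),
    Marker("clubhouse celebration", 10700.0, 10745.5),
]
print(subclip_durations(markers))
```

Because the markers are only metadata, an editor can later pull the whole recording into Premiere and cut against these regions without re-ingesting anything.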
Summing up the difference that FORK has made to the workflows, the CTI concludes, “It is night and day. The system we had previous to FORK was laborious to have to maneuver through and around. Immediately with FORK, our workflows were much improved.”