Sports Asset Management: The Future of File-Based QC
Ingesting large files properly was one of the main themes discussed throughout SVG’s Sports Asset Management event in Charlotte on July 19. Yet even as major organizations look to cut budgets and manpower, standards still need to be met, from putting out a quality product to satisfying federally mandated requirements.
“I think all of our main goals would be to have an automated QC process,” said Mark Haden, VP of Engineering and IT at MLB Network. “Less hands on, less manual adjustments, less people means less budget and faster content flow and that is really important for sports and news organizations.”
While that perfect system may still be years away from being a reality, some of the top technology companies are continuing to make advancements to further streamline the QC process.
According to managing director Hank Frecon, RadiantGrid has been fielding requests to simplify the constant bouncing of files around a production facility, adding that his company was looking to transfer that process into a “more holistic architecture so that it’s a true parallelization.”
Harris’ Robert Millis agreed that a media organization needs to refrain from moving large files frequently, avoiding what the panel described as “ping pong.”
“You’ve got to worry about moving files around, even intra-facility as you’re going to lose some bits in them; some of that valuable metadata that’s not too protected,” said Millis, senior product and project manager for compressed systems and servers at Harris Broadcast Communications.
Frecon acknowledged that, even with a file-based QC workflow that lets the user autocorrect files for metadata, loudness, or field-level issues, final quality analysis is a process that needs to be shared in order to ensure a fair assessment.
“It’s not appropriate really for us to use our own quality measurement tools to say ‘we think it’s the best’ because we’re a little biased on that,” laughed Frecon. “So rather we have been working to adopt tools from Harris, from [Interra] Baton, from Manzanilla and ultimately we want to bring those tools inside our processing architecture so that they’re really running in line with the application. So that way, customers can really try to tune in and if you know that you have a good loudness processor, you’re not wasting advanced cycles on processing loudness and instead you’re turning that attention to video or something along those lines.”
Aside from the desire to produce a top-notch viewing experience, loudness levels must also be met in order to comply with the federally mandated guidelines outlined in ITU-R BS.1770 and the CALM (Commercial Advertisement Loudness Mitigation) Act, which is designed to protect television viewers from commercials that are significantly louder than the programs they air with.
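The measurement methodology behind these rules can be sketched in code. The following is a simplified, ungated loudness calculation in the spirit of ITU-R BS.1770 (the full standard adds 400 ms block gating in later revisions, which is omitted here): each channel is passed through the standard’s two K-weighting biquad filters, the mean-square power is summed across channels, and the result is expressed in LKFS. The biquad coefficients below are the published 48 kHz values from the recommendation; the function names and structure are this sketch’s own, not from any particular product mentioned in the article.

```python
import math

# ITU-R BS.1770 K-weighting coefficients for 48 kHz audio.
# Stage 1: high-frequency shelving boost; stage 2: high-pass (RLB weighting).
SHELF_B = (1.53512485958697, -2.69169618940638, 1.19839281085285)
SHELF_A = (-1.69065929318241, 0.73248077421585)
HIPASS_B = (1.0, -2.0, 1.0)
HIPASS_A = (-1.99004745483398, 0.99007225036621)


def biquad(samples, b, a):
    """Direct Form I biquad:
    y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]."""
    b0, b1, b2 = b
    a1, a2 = a
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out


def loudness_lkfs(channels):
    """Ungated loudness of a clip in LKFS.

    `channels` is a list of per-channel sample lists at 48 kHz.
    Front channels carry a weight of 1.0 in BS.1770; surround
    channel weighting (1.41) is omitted in this simplified sketch.
    """
    total = 0.0
    for ch in channels:
        kw = biquad(biquad(ch, SHELF_B, SHELF_A), HIPASS_B, HIPASS_A)
        total += sum(y * y for y in kw) / len(kw)  # mean-square power
    return -0.691 + 10.0 * math.log10(total)
```

As a sanity check, a full-scale 997 Hz sine tone should read roughly -3 LKFS under this measurement, which matches the reference behavior described in the standard.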
DaySequerra president David Day discussed what media companies and QC providers need to watch for under the CALM Act.
“ITU-1770, interestingly enough, used radio programming as the standard to determine what television audio should be measured as,” said Day. “So what does 1770 not do well? Simple things like dialogue with no music behind it and low frequency content.
“I’m not here to argue against 1770. A flawed standard that you know what its faults are is better than no standard at all and that’s what we have right now.”