SVG Sit-Down: Wheatstone’s Dee McVicker on AES70
The recently published standard addresses control of diverse elements in networked audio
Published in January, standard AES70 arrives on the professional-audio scene as a companion to AES67. The latter offered an interoperability solution for the increasingly diverse networked-audio landscape; AES70 is here to add a layer of interoperable control to that thicket of formats. SVG sat down with Dee McVicker, spokesperson for Wheatstone, which was part of the AES X210 task force responsible for developing AES70, to discuss what AES70 means for remote broadcast production and beyond.
What does AES70 mean for broadcast audio in the field? Networked audio and MADI coexist in remote production, but does AES70 have the potential to change that balance?
AES70 could be useful to sports broadcasters who want to move production to the home studio, because it is expected to give them a standard layer of control and logic between the studio network and the devices and elements out in the field. MADI, as you know, doesn’t provide for command and control. It is certainly useful for moving around a lot of uncompressed audio signals, and there is still a lot of MADI gear out there. But, in order to use AES70-compatible gear or software with MADI, broadcasters will need some sort of IP connectivity if they want to run control for those channels. MADI and networked audio will continue to coexist, but maybe not in the way that they do today.
What does AES70 bring to the larger broadcast-audio environment? What kind of functionality?
It has the potential to broaden the usefulness of audio networking. AES70 is based on the Open Control Architecture (OCA), which is essentially a library of specific control functions, such as on/off, level control, and so on. Manufacturers will have a common set of commands to facilitate control between a third-party device and their networks, and that’s huge in terms of network expandability.
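That idea of a shared control vocabulary can be sketched in a few lines. The sketch below is illustrative only: the class and method names are modeled loosely on OCA's object-oriented style (OCA does define standard control classes for functions like mute and gain), but these are not actual AES70 class definitions or wire-protocol calls.

```python
# Illustrative sketch of the AES70/OCA idea: every vendor exposes the same
# standard control objects, so one controller can drive any device without
# vendor-specific code. Names here are hypothetical, modeled on OCA's style.

class OcaMute:
    """Standard on/off control object."""
    def __init__(self):
        self.muted = False

    def set_mute(self, state: bool):
        self.muted = state

class OcaGain:
    """Standard level-control object, gain in dB."""
    def __init__(self, gain_db: float = 0.0):
        self.gain_db = gain_db

    def set_gain(self, gain_db: float):
        self.gain_db = gain_db

class Device:
    """Any vendor's device: it simply registers standard control objects by role."""
    def __init__(self, name: str):
        self.name = name
        self.objects = {}

    def add(self, role: str, obj):
        self.objects[role] = obj

# A third-party controller addresses objects by role and calls the standard
# methods -- it never needs to know whose hardware is on the other end.
console = Device("VendorA-Console")
console.add("input1/gain", OcaGain())
console.add("input1/mute", OcaMute())

console.objects["input1/gain"].set_gain(-6.0)
console.objects["input1/mute"].set_mute(True)

print(console.objects["input1/gain"].gain_db)  # -6.0
print(console.objects["input1/mute"].muted)    # True
```

The point of the sketch is the shape of the design: the common vocabulary lives in the standard classes, so network expandability comes from adding objects, not from writing per-vendor glue.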
How will it fit into the existing broadcast-audio workflow?
Most existing audio networks are comprehensive, end-to-end environments born out of the need to be all things to broadcasters. For example, the WheatNet-IP audio network is a complete studio environment with control surfaces, software, control panels, widget GUIs, audio controllers, and all the elements needed to run a broadcast facility today as a single integrated system — plus all the logic needed to control those elements. At this point in the evolution of audio networks, AES70 will provide add-on capability to these full networks. It will provide a path for interoperable control between existing network elements and third-party devices not supported by the WheatNet-IP audio network ACI, for example. Inevitably, that can only broaden and simplify workflow.
Is AES70 a logical extension of AES67, which provides a transport standard that all audio-network manufacturers could use? Could it accelerate the uptake of audio networking in broadcast?
They each serve a very specific purpose. Whereas AES67 is about transporting audio signals between networks, AES70 is about the basic control needed for adding non-network elements to an existing network. I’m not sure I’d credit either of these standards for accelerating the uptake of audio networking per se, but they certainly do make the road to IP smoother for broadcasters who are making that transition from HD-SDI to IP.
What are the next steps for AES70? Will manufacturers need to have devices that integrate it tested for compliance?
It will take some time before AES70 can be integrated into the studio environment. As with all standards, we’ll want to test it out in real-world scenarios before we build it into our products.
You’ve analogized AES70 to MIDI. Can you elaborate on that a bit? Given the music-production backgrounds of so many professionals in broadcast sports, it’s a comparison that will resonate.
Wheatstone happens to have more than its share of talented musicians on staff, so we naturally thought of MIDI as an analogy to this new standard. Actually, my understanding is that MIDI control functions were a driving force behind OCA development in the first place. MIDI turns out to be a very accurate analogy, too, because it is essentially a common library of values and functions that the music industry uses so instruments and other devices can control one another to trigger or synthesize sound. Like AES70, MIDI gives a standard set of values — in this case, the values needed for conveying a note’s timing, musical pitch, length, volume, and so forth.
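The MIDI comparison is easy to make concrete. A standard MIDI Note On message is just one status byte plus two data bytes drawn from MIDI's fixed vocabulary of values — the same "common library" role that AES70 plays for device control:

```python
# A standard MIDI Note On message: status byte 0x9n (n = channel 0-15),
# followed by a note number and a velocity, each 0-127.

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a MIDI Note On message."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    status = 0x90 | channel  # 0x90-0x9F = Note On for channels 1-16
    return bytes([status, note, velocity])

msg = note_on(channel=0, note=60, velocity=100)  # middle C, moderately loud
print(msg.hex())  # "903c64"
```

Any MIDI instrument from any manufacturer interprets those three bytes the same way — which is exactly the kind of cross-vendor common ground AES70 aims to give audio-network control.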
Now that we have format compatibility and control of that transport, what’s the next related task that AES will be looking at?
It’s hard to say, but I understand the AES67 standards group has been working with the VSF TR-03 and ASPEN groups to coordinate with upcoming standards for IP video. Ultimately, all of this will need to converge as we change our thinking from audio, video, and data to content.