
SVG Tech Insight: Remote Production: Why Synchronization Matters for Live Sports

This fall, SVG is presenting a series of White Papers covering the latest advancements and trends in sports-production technology. The full series of SVG’s Tech Insight White Papers can be found in the SVG Fall SportsTech Journal.

Introduction

Nothing engages viewers like live video. Whether it’s for television or corporate communications, live video can make people feel as if they are truly taking part in the event, no matter where they are watching from.

For producers with tight budget constraints, planning a live event involves tough trade-offs between the cost of deploying production staff on site and the cost of transmitting video. Traditionally, live production of remote events requires an onsite crew of camera operators, sound engineers, and a technical director. Adopting a remote production model can cut production costs and logistical complexity by easing the burden of deploying expensive resources: the equipment required to capture, process, and produce at a remote venue, and the field crew needed to set up, operate, and manage it. Creating greater efficiencies allows broadcasters to produce more events and deploy their best resources more effectively.

Although costs can be significantly reduced by managing live production workflows from a main master control room (MCR), a model sometimes referred to as remote integration, or REMI, the additional bandwidth typically required for transmitting multiple contribution video feeds over satellite or a dedicated network can negate the savings of having a centralized live production facility.

In this white paper, we will explore how broadcasters can leverage the latest video streaming technologies to satisfy the demands of remote production workflows without the traditional costs and logistical complexities.

The Challenge of Synchronizing Multiple Camera Streams Over IP

While broadcasting live events, the use of multiple cameras allows for a more engaging and dynamic viewer experience. For remote locations such as sports stadiums or concert venues, a producer needs to be able to seamlessly switch between live video feeds depending on what angle is most suitable at a given time. Typically, a single audio stream is used, as sudden changes in audio are very noticeable and can be distracting. If the video is not synchronized, switching between cameras can result in issues such as input lag and lip sync delay.

At the live production facility, decoders receiving the live feeds need to be kept in sync so that a producer can immediately include any of the sources within their live playout workflow. One way to help mitigate multi-camera and audio sync issues is by multiplexing camera feeds over satellite uplink, although this can be a costly solution. Another option is to use a dedicated private network that can provide a stable level of latency and therefore the ability to manually sync video and audio feeds, although this is not always possible from remote locations.

Remote production workflow with synchronized video streams.

Streaming over the internet is a more cost-effective and flexible approach; however, bandwidth availability is difficult to predict and can change at any given moment. Being able to synchronize remote contribution streams over the internet resolves the dilemma between managing costs and ensuring broadcast quality.

Keeping live video and audio in sync while streaming over IP networks can be a considerable challenge, especially when dealing with an unpredictable network like the internet, where round-trip times and bandwidth availability can continually fluctuate.

To ensure that all video and audio streams are in sync with each other, broadcast and network engineers need to spend considerable time manually adjusting the timing of each video decoder output. Typically, this is done using a test pattern device to calibrate audio channels with live video sources. This approach requires coordination between people at both the remote location and the MCR and can be very time consuming. The more cameras and audio channels involved, the more complicated it becomes to synchronize everything, and the more time is needed before going on air. Although this approach can be made to work with the right tools, there is a simpler and faster way.
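In essence, this manual calibration amounts to delaying every feed to match the slowest path. A minimal sketch of the arithmetic, using hypothetical per-feed latency measurements (the function name and numbers are illustrative, not part of any product):

```python
def manual_delay_offsets(latencies_ms: dict[str, float]) -> dict[str, float]:
    """Compute the extra delay each decoder output needs so that all
    feeds align with the slowest (highest-latency) path."""
    slowest = max(latencies_ms.values())
    return {feed: slowest - latency for feed, latency in latencies_ms.items()}

# Hypothetical measured end-to-end latencies for three contribution feeds:
offsets = manual_delay_offsets({"cam1": 120.0, "cam2": 185.0, "audio": 95.0})
# {'cam1': 65.0, 'cam2': 0.0, 'audio': 90.0} -- cam2 is slowest, so it gets no added delay
```

The arithmetic itself is trivial; the operational cost lies in measuring those latencies accurately with test patterns, and in re-measuring whenever network conditions shift.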

The Stream Sync Solution

Haivision’s Stream Sync solution automates and simplifies real-time frame alignment. Stream Sync is supported by the Makito X Series of video encoders and decoders, including the new Makito X4 encoder and decoder for 4K or quad-HD video. These Haivision devices are configured to stream multiple channels of live event video that are kept in sync, accurate to within a single frame. Stream Sync works by continuously monitoring the end-to-end transit time, and dynamically adjusting the internal decoder buffers to compensate.

Stream Sync enables broadcast engineers and producers to capture multiple live video and audio streams from a remote venue and keep them all in sync for immediate use. Makito X and X4 video decoders ensure that live feeds are synchronized so that downstream production equipment will not experience issues when switching between video and audio sources.

Stream Sync continuously monitors the characteristics of the streams and the network and applies the exact amount of buffering required to ensure smooth and synchronized playout across multiple feeds. This is done in real-time based on timestamps embedded in each stream from the remote Makito X or X4 encoders. For live production, this means that any camera can be used with any audio track, with no noticeable video hits or loss of lip-sync.
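The general principle of timestamp-driven buffering can be sketched as follows. This is a simplified illustration, not Haivision's implementation: each decoder holds every frame until its capture timestamp plus a shared latency budget has elapsed, so buffers honoring the same budget release matching frames together. The class name and delay values are hypothetical.

```python
from collections import deque

class AlignedDecoderBuffer:
    """Simplified per-stream jitter buffer. Frames are held until a shared
    playout deadline (capture timestamp + fixed end-to-end delay) passes.
    Because all encoders stamp frames against the same clock, buffers with
    the same target delay release corresponding frames simultaneously."""

    def __init__(self, target_delay: float):
        self.target_delay = target_delay  # end-to-end latency budget, seconds
        self.frames = deque()             # (capture_timestamp, payload) pairs

    def push(self, capture_ts: float, payload: bytes) -> None:
        self.frames.append((capture_ts, payload))

    def pop_ready(self, now: float) -> list[bytes]:
        # Release every frame whose playout deadline has been reached.
        out = []
        while self.frames and self.frames[0][0] + self.target_delay <= now:
            out.append(self.frames.popleft()[1])
        return out
```

A real system additionally measures network transit time to size the delay budget dynamically, which is the part Stream Sync automates.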

How Stream Sync works in a remote production workflow.

For Stream Sync to work, cameras need to be genlocked and the Makito X and X4 video encoders synchronized to an NTP server designed for broadcast applications. This can be easily configured through the video encoder GUI, which provides a way to specify the NTP server used and ensure that all outgoing streams are time-stamped in sync.
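The encoder GUI handles the NTP configuration itself; as background on what the time exchange looks like on the wire, here is a minimal sketch of building an SNTP (RFC 4330) client request and reading the server's transmit timestamp. The function names are illustrative only:

```python
import struct

# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01)
NTP_EPOCH_OFFSET = 2208988800

def build_sntp_request() -> bytes:
    # First byte: LI=0, Version=4, Mode=3 (client) -> 0b00100011 = 0x23;
    # the remaining 47 bytes of the 48-byte header are zero for a basic request.
    return b"\x23" + 47 * b"\x00"

def parse_transmit_time(packet: bytes) -> float:
    # The Transmit Timestamp occupies bytes 40-47 of the reply:
    # a 32-bit seconds field plus a 32-bit fractional-seconds field.
    secs, frac = struct.unpack("!II", packet[40:48])
    return secs - NTP_EPOCH_OFFSET + frac / 2**32
```

With every encoder disciplined to the same server, timestamps embedded in outgoing streams share a common reference, which is what allows downstream decoders to align frames.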

Stream Sync not only benefits broadcasters but also provides companies, nonprofits, and government agencies with a way to deliver broadcast-quality live event coverage. Corporate training, executive communications, and webcasts can all be more engaging using multiple cameras, with none of the distractions of out-of-sync video and audio streams.

With the Makito X4 encoder and decoder pair, four HD video streams can be kept in sync with Stream Sync using only one device on location and another in the production studio.

Conclusion

New technologies for remote production workflows over the internet, such as Haivision’s Stream Sync, are allowing broadcasters to cover a wider range of live events, including sports and news gathering, without the costly overhead of deploying production teams and OB trucks to each site, or transporting video over satellite or dedicated networks. Being able to sync remote video streams over the public internet is more cost-effective and flexible than using satellite or private managed networks. It enables any type of broadcaster to live stream events with multiple camera angles from any location with broadband internet access.
