SVG Sit-Down: The Switch CEO Scott Beers on Cumulus and the Future of At-Home Production

New service envisions shared infrastructure for a wide variety of clients

The Switch made a splash at NAB 2017 when it introduced Cumulus, its cloud-based, real-time production service that promises to bring the shared-services model to at-home production. Cumulus leverages The Switch’s FiveNines transmission network to deliver live productions from the stadium using production-control surfaces (such as robotic camera controls and graphics systems) that can be located anywhere in the world.

In the Cumulus demo at NAB 2017, a SkyCam Wildcat Aerial Camera system at Dick’s Sporting Goods Park in Denver was controlled from the show floor. The demo also featured SMT’s Camera Tracker virtual graphics and Brainstorm Teletransporter, incorporated live sources from The Switch studios in Los Angeles and London, and integrated a live VR camera provided by C360 into both the linear-TV production demo and, simultaneously, an over-the-top feed. Then, last week, ESPN became the first to test Cumulus and SkyCam’s SkyCommander system on a live sports telecast during the NCAA Division I Women’s Lacrosse Championship at Gillette Stadium in Foxboro, MA.

SVG sat down with The Switch Chairman/CEO Scott Beers to discuss why he believes the time is right for Cumulus, how he thinks the market will react, the target client for the platform, and why he sees Cumulus as the future of his company.

At-home production has been a huge trend in live sports over the past few years. What events led to the launch of Cumulus?
We have been more heavily involved in at-home and home-run production than any other provider, and we’ve been doing it for a while now. We had to prove the reliability of that model first, and I think we have done that.

Scott Beers: “The customer wins in cost because they do not need to send people onsite; that is typical home-run economics.”

We have been doing many different types of at-home productions. There are some that take all the camera [feeds] back and do a full REMI [remote-integration] production at [the broadcast center]. Others don’t take all the camera [feeds] back; they just leave EVS operators or the Avid editors at home and use our data pipes to connect them. Or we’re home-running a pregame and postgame show but doing the game traditionally. So there are different iterations of it.

Also, a number of our other clients love the [at-home] concept, but their plant can’t handle all the feeds coming in.

Out of all that came Cumulus. We want to give our customers the ability to home-run, but the feeds never actually get to their facility, which saves on the infrastructure costs of their facility and gives them a lot more flexibility.

How did visitors react to the Cumulus demo at NAB 2017?
We demonstrated how all the different facets of a production could travel across the network. We were pleasantly surprised by how successful it was. We took every single facet we could think of that would be needed onsite [at Dick’s Sporting Goods Park in Denver] in a typical production — even a SkyCam, SMT graphics, and a C360 virtual-reality camera — and tried to replicate them elsewhere.

I think anyone who walked by our booth saw the traffic and the overwhelmingly positive response. There was definitely a lot of interest right from the start of the show.

What are the primary benefits of Cumulus for live-sports broadcasters?
Where does the customer win in this? The customer wins in cost because they do not need to send people onsite; that is typical home-run economics. Customers also get a higher level of control because they can use the same producers, directors, and production people for multiple shows.

They do not need to commit capital to all the equipment as it changes. That will be our responsibility because it will all sit inside the cloud. And then, you get the economics of the sharing. We have so many clients that will do a game or two a week, and that equipment or truck is out of commission until the next game. Our idea is to put the equipment in the core and have many levels of surfaces that run off of it, so that customers can share that equipment.

Where do you see an opening for Cumulus in the live-sports-production market: the high-end or the low-end games?
I think what people are focused on right now [for at-home production] are their D- and E-level games for economic reasons. But we also provide big data pipes on the NFL, NBA, and MLB A games, as well as for PGA and NASCAR. I see the business coming together: we’re coming from the A games down, and we’re coming from the E games up. I think what everybody was taken aback by at NAB was that this can work for anything.

How do you see Cumulus being used on high-profile A-game–type events?
Let’s theoretically look at an event like the PGA Championship, for example. At the PGA Championship, it used to be that you would have just the linear feed come out for CBS. Now you have a featured group, plus featured three holes, plus multiple shows on Turner, plus [international rightsholders] like Sky and NHK. They are all trying to customize the show off different camera angles on the course. If I can run the camera angles back to our core, those camera angles can be shared across all platforms and all clients, [who] can take those camera angles and cut the shows [they] want: if NHK wants to cut the show to focus on the Japanese players or if Sky wants to follow the European players, they could do it very easily. [And they] don’t need all of those individual production trucks; they could all run off the camera feeds coming in.

Why do you believe that now is the right time to launch Cumulus?
The timing for Cumulus is perfect in my book for a number of reasons.

First, you had to prove to the industry that at-home production worked. We’ve done that over the last couple of years. I think, [according to what was shown at NAB 2017], everybody believes in the concept now. We’re going that way whether we like it or not.

Second, you have an onslaught of over-the-top customers coming into the market: the Twitters, the Googles, the Facebooks, and on down the line. These companies aren’t in the live-production business, but I don’t think anybody here doubts they’re going there. Now, do they have infrastructure? No. Have they bought equipment? No. And I don’t think they even want to know that side of the business. So Cumulus can be a big help for those guys.

Third, we are in a transition right now. Most [broadcasters] can’t handle 4K or 1080 HDR in their plants, but they’re getting pressure to [create that content], which is extremely costly. Nobody really knows if we’re going to go 4K or 1080p/HDR. So we could be doing this for them, and they don’t have to invest capital in infrastructure, which has been slowing down the migration to 4K and HDR. If you want to do big events in 1080p HDR or 4K, you can just do your normal [720p or 1080i] show for the main feed, and then we can take the 4K or the 1080p HDR and [deliver] it to DirecTV or over-the-top. You don’t have to change your entire workflow and infrastructure.

What would the business model look like for an average Cumulus customer? Would they have flexible and scalable use of Cumulus control rooms, or would they commit to a certain number of shows?
There are three ways you can go here.

First, you can walk into one of our centers where we’ll have control rooms — in Los Angeles, New York, London, and Miami — and produce the show.

Second, you can set up a control room in several hours, as we did at NAB. We just need a basic list of needs from the customer, and then we can go in, lay out all the surfaces, displays, and controls, and run it back to your facility.

Lastly, there is what I’m temporarily calling “reverse Cumulus.” [For a client,] like PGA TOUR or college basketball, [with] several hundred events, we are going to build mobile trailers to be onsite that will house the tubs of gear but will not house the [control] surfaces, and the people will not sit in the trailers. The trailers will have a big I/O panel on the side where they punch down their mics, their cameras, et cetera. All of the hardware [will be] in there, but the surfaces will sit at the client’s facility.

How do you expect the market to react to Cumulus, and what do you expect in the next 12 months?
I believe this is The Switch’s direction moving forward. We have assembled a genuine dream team of talent to work on this, including Glenn Adamo [principal, Ivanhoe Media & Entertainment], Fred Beck [president/director, engineering, Beck TV], and Darrell Wenhardt [president, CBT Systems], and we’ve come up with something that is unique in the industry. We have built the right tools and designed it the right way to create an attractive economic reason for customers to make the migration over to Cumulus.

Unlike many of the other carriers, our network is purpose-built for television. We don’t have contention problems. I think the most important part here is that our network, more than anybody else’s, can handle this. Because it is a mesh-protected DTM [dynamic synchronous transfer mode] network that isn’t IP, it is not going to have the latency or contention that others would have. I don’t see any competitors who could lay this over their network; I think it would cause some real headaches. The power of our network is what makes this possible, and the layer we are building on top of it to support unique production workflows and business cases is The Switch Cumulus.

We are ready to do this today, and we can customize [solutions] depending on what people want. We are already seeing a lot of interest, and I expect that only to grow.
