SVG@NAB Perspectives: Burst CEO McBride on the Future of User-Generated Content for Sports
It has been a busy spring for mobile-video startup Burst, which received equity investments last month from major media players Sinclair and Horizon Media. Although the investments served as a validation of sorts — Burst technology enables broadcasters and media companies to showcase social media and user-generated content (UGC) — the company has worked with a variety of major sports broadcasters for more than a year, including Fox Sports, NESN, TSN, and the Breeders' Cup.
The Burst platform focuses on the capture, curation, broadcast, and monetization of UGC and controlled video content, whether it’s sourced from reporters/stringers/social teams or from fans watching the game in the stands, on TV, or on a mobile device.
This week at NAB 2016, Burst is exhibiting at StudioXperience in Booth SL2425 and in Avid’s partner pavilion. SVG sat down with Burst CEO/co-founder Bryant McBride to discuss the Boston-based startup’s future following the Sinclair and Horizon investments, how the platform is changing the way broadcasters interact with their fans, the exponential proliferation of user-generated video in recent years, and how user data and analytics benefit sports-content owners.
How does the Burst platform work for live sports?
Burst is a mobile video platform that allows broadcasters to engage with fans in a different way. Say a broadcaster like Fox or NESN has 30 cameras at a stadium feeding into a truck outside. Essentially, Burst harnesses the capacity of the 2 million, 3 million, or 4 million cameras in people’s pockets as they’re watching the broadcast. We take that huge capture capacity, we ingest it, we algorithm it, we curate it, and then we work very closely with some of the best in the business — like EVS, Avid, Evertz, and other playout systems — to integrate that content. There is always gold somewhere in those 4,000 or 5,000 videos that get submitted in a game, and we find that gold algorithmically using AI and machine learning. It is ingested directly into the OB truck or studio workflow in a way that respects the production workflow and the tight timelines that go into the production. We worked hard to pull every bit of friction out of the process to ingest those videos and, in doing so, we’re allowing the broadcaster to make television interactive.
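The pipeline McBride outlines — ingest thousands of fan-submitted clips, score them algorithmically, and hand only the "gold" to the broadcast workflow — can be sketched roughly as follows. This is purely an illustrative sketch; Burst's actual implementation is not public, and every name here is invented.

```python
import random

def ingest(submissions):
    """Normalize raw fan uploads into clip records (hypothetical schema)."""
    return [{"id": i, "quality": s} for i, s in enumerate(submissions)]

def curate(clips, threshold=0.9):
    """Keep only clips whose algorithmic quality score clears the bar.
    A real system would use ML-based ranking; a fixed threshold stands in here."""
    return [c for c in clips if c["quality"] >= threshold]

def hand_off(clips):
    """Stand-in for pushing curated clips into an EVS/Avid-style playout system."""
    return [f"playout:{c['id']}" for c in clips]

# Simulate 5,000 submissions in a game, each with a score between 0 and 1;
# only a small fraction is broadcast-worthy.
random.seed(7)
submissions = [random.random() for _ in range(5000)]
gold = curate(ingest(submissions))
queue = hand_off(gold)
print(len(queue) < len(submissions))  # curation filters heavily
```

The point of the sketch is the shape of the flow, not the scoring: massive capture capacity in, a thin curated stream out, delivered in a form the existing production workflow can consume.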
What are some examples of how Burst has been used in sports-specific applications?
We thought what Fox Sports Australia did was brilliant. They had two calls to action on TV, two calls to action on social media, and two on the [stadium videoboard] at the Sydney Derby asking fans to submit video. What they got back from a goal with three minutes left in the game was nine different perspectives of that goal from all over the stadium.
And then they put it in the cloud, where we are curating the videos algorithmically. Anytime there are more than five pieces of media in that container — or a bubble, as we call it — we instantly create a highlight reel. So they put those nine different perspectives in with the highlights from the linear feed, and out came this montage that was brilliant. It showed the fans’ perspective, it showed the 30 [broadcast] cameras’ perspective, and it was something that only they could do. They got UGC, they got the linear feed, they put it together, they put a sponsor in front of it and then put it on TV and on social and on the videoboard and on their website — and made money from it almost instantly.
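The "bubble" rule McBride describes amounts to a threshold trigger on a container of clips. A minimal sketch, assuming nothing about Burst's real API (the class and field names below are invented for illustration):

```python
BUBBLE_THRESHOLD = 5  # per the interview: more than five pieces of media triggers a reel

class Bubble:
    """A container of media clips tied to one moment (e.g., a goal)."""
    def __init__(self, event_name):
        self.event_name = event_name
        self.clips = []
        self.reels = []

    def add_clip(self, clip):
        self.clips.append(clip)
        # Create a highlight reel the moment the threshold is crossed.
        if len(self.clips) > BUBBLE_THRESHOLD and not self.reels:
            self.reels.append(self.make_reel())

    def make_reel(self):
        # A real system would rank and trim clips algorithmically;
        # here we simply bundle them in arrival order.
        return {"event": self.event_name, "clips": list(self.clips)}

bubble = Bubble("Sydney Derby goal")
for i in range(9):  # nine fan perspectives of the same goal
    bubble.add_clip(f"fan_clip_{i}")

print(len(bubble.reels))  # a reel was created automatically
```

In the Sydney Derby example, the linear-feed highlights and the nine fan clips would land in the same bubble, so the montage is generated without manual assembly.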
Why have you made partnering with traditional broadcast-technology vendors, such as EVS, such a priority thus far?
We are a TV company. We just happen to work closely with social-media companies because they are becoming TV companies as well. If you judge TV by the traditional metric of eyeballs, Facebook’s live-streaming efforts and Periscope are just different flavors chasing the same eyeballs. So we are positioning ourselves in this quickly evolving and shifting landscape. We’re long on the mobile device as a capture device for video and for photos. More videos and photos were taken in the last year than in the history of time. That’s not going away. However, we’re also long on TV. TV isn’t dying; TV is morphing. When you look around the halls at NAB, you see this industry is booming. Integrating into the existing systems, in my mind, is mandatory, and it’s critical to gain trust so that, when it does change further, you’ve built a rapport. In the end, our goal is to help [content owners] serve video with speed and precision to whichever endpoint the consumer wants, and we need great partners to make that happen.
How do you foresee social live-streaming platforms like Facebook Live and Periscope impacting your business and sports-related user-generated content as a whole?
People are consuming video in record numbers on all devices, and that’s good for everyone. I liken Periscope, Facebook Live, and those platforms to building cars. We’re all building cars — the one thing that we have in common is that we all have an engine — all for different purposes. On one end of the spectrum, you never know what you’re going to get on a live Periscope feed, which is a little scary, and broadcasters and digital-media leaders need control for regulatory reasons. On the other end, TV has to be entertaining. So we want to be able to package content and serve engaging content but also provide our [customers] a sense of control.
How must national and regional sports networks adapt to the growing consumption of their content on multiple devices and constant social-media engagement?
We see the need for deeper personalization, deeper contextual offerings that anticipate what people want and give them just what they want. Even when we went from a four- to a 500-channel world, we went overboard, right? Of those 500 channels, 35 were profitable, and only about 12 were viewed per household on a regular basis. We’re now in a billion-channel world: all the Facebook feeds and Snapchat stories are channels, and they are very personalized and very individual, so you had better be bringing something really interesting. Traditional TV is not going away. People are going to tune into their RSN, and they’re going to watch the game, but that RSN has got to do something that is above and beyond [what the network] worked years to build. A year from now, you’ll see us roll out across the country with our lead investor, Sinclair Media, and provide some opportunities that we fully expect to stretch some minds.
How can Burst be used as a live-production tool or ENG tool to cover action on the field?
Say there is a high school football game on Friday night, and a local broadcast station sends a reporter who is there with a template on Burst. It connects directly to channel 7. He or she shoots an establishing video, a coach interview, the star-player interview, some game action, the halftime score on the scoreboard, and a little more game action highlighting the touchdowns, tagging each clip quickly. By the end of the night, those dozen videos have been turned into highlight reels, curated within five minutes after the game, and that gets pushed to channel 7 to do with what it wants. We can do that today. [For] all of the 50 games being shown in the Boston area, through the data, we can just identify touchdowns, and all the touchdowns will come together. You will see the top 10 highlights, the top 10 plays in that market, sponsored by Gatorade or PowerAde or whoever. It’s a sponsorable element that broadcast shows can use on a traditional linear feed and also push to their social destinations, in a targeted e-mail, or through their app.
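The market-wide "top 10 plays" roundup McBride describes reduces to a tag-based aggregation across games. A hedged sketch under invented assumptions — the field names and the engagement-score ranking are illustrative, not Burst's actual schema or ranking method:

```python
def top_plays(clips, tag="touchdown", market="Boston", limit=10):
    """Collect clips carrying a given tag from all games in a market,
    rank them (here: by a hypothetical engagement score), and return the top N."""
    tagged = [c for c in clips if c["market"] == market and tag in c["tags"]]
    tagged.sort(key=lambda c: c["score"], reverse=True)
    return tagged[:limit]

# Simulate one tagged touchdown clip from each of the 50 games in the market.
clips = [
    {"game": f"game_{g}", "market": "Boston", "tags": ["touchdown"], "score": g * 1.5}
    for g in range(50)
]

reel = top_plays(clips)
print(len(reel))        # ten clips for the sponsored reel
print(reel[0]["game"])  # highest-scoring clip leads
```

Because the reporter tags clips at capture time, the aggregation needs no manual review: the tags are the index, and the same ranked output can feed the linear show, social posts, e-mail, and the app.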