Minute Video Platform Drives Engagement via AI, Automation, Crowdsourcing

Crowdsourcing what fans are talking about on social media helps get the most out of streamed content in real time

Got a minute? That is the guiding philosophy for Minute, a company that, since 2013, has combined crowdsourcing what fans are talking about on social media with computer vision to automatically create clips designed to drive online users to engage more deeply with content, whether that means tuning in to a live broadcast, watching highlights, or taking other actions.

Minute’s Amit Golan says his company’s video platform can drive viewer engagement of live feeds.

“We combine those two things together to help determine what are the most interesting moments that people would like to see,” says Minute CEO/founder Amit Golan. “We aren’t showing a highlight but a teaser. If there is an amazing goal by Ronaldo, we will take the goal, chop up the highlight, and create a montage that is a short preview of the longer clip. Then, the viewer clicks and is taken to the full clip or live stream.”

Minute has been in growth mode, doubling in size over the past 12 months, and now has offices in New York City and Tel Aviv, with other locations to be added soon. It was founded in 2013 out of a desire to solve some key issues facing the streaming-video market, not the least of which was profitability for those doing the streaming. Minute has a SaaS model that involves either a flat fee or a revenue share with clients; the goal is to be flexible when it comes to getting clients on board.

“At the end of the day,” says Golan, “we want to help maximize the content, as today the ROI on video content is not in the positive. But users are most interested in consuming video. And that is where we began four years ago, asking how we can increase engagement.”

Minute began as a video-discovery app that connected users with the content most relevant to them, and that technology remains the primary method for crowdsourcing what users want. What distinguishes Minute’s platform is the massive quantity of data acquired through extensive A/B testing during that earlier incarnation. Using that data, Minute developed its proprietary AI technology, which extracts the most compelling moments from any video and automatically creates a five-second video teaser to be used in place of a static thumbnail on a webpage.
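Minute has not published implementation details, but the teaser-selection step it describes can be sketched roughly: given per-second “interest” scores for a video (however they are produced), pick the contiguous five-second window with the highest total score and serve it as the preview instead of a static thumbnail. The function and variable names below are illustrative assumptions, not Minute’s actual code.

# Hypothetical sketch: choose the five-second window with the highest
# combined interest score to serve as an auto-generated teaser.
def best_teaser_window(scores, window=5):
    """scores: list of per-second interest scores for one video."""
    if len(scores) < window:
        return 0, len(scores)  # clip is shorter than the teaser length
    best_start, best_total = 0, sum(scores[:window])
    running = best_total
    for start in range(1, len(scores) - window + 1):
        # slide the window one second: add the new second, drop the oldest
        running += scores[start + window - 1] - scores[start - 1]
        if running > best_total:
            best_start, best_total = start, running
    return best_start, best_start + window

# Example with made-up scores: the spike around second 4 wins,
# so seconds 3-8 would be cut out as the teaser.
scores = [0.1, 0.2, 0.1, 0.1, 0.9, 0.8, 0.7, 0.2, 0.1, 0.1]
print(best_teaser_window(scores))  # -> (3, 8)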

“We wanted to understand the psychology behind the users and learn what [impels] them to click and watch more,” Golan says. “We had a lot of assumptions and theories, but we created new video technology that can analyze the video and detect the most interesting moments in the video. Once you have that, you can do all kinds of things: create a summary of the game, teasers for the game, and other things.”

Using video clips to drive tune-in is one of Minute’s main goals, and the approach was deployed by a prominent global broadcaster for last year’s World Cup. At the core of the technology is an algorithm that calculates and integrates key parameters from numerous metrics and video layers into five-second auto-preview video trailers. Working from a dashboard-style control panel, publishers can publish videos more quickly and replace static clips, which age fast in a world where what is happening right now matters most. They can also see click-through rates and video-engagement statistics.
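The article does not say which parameters the algorithm weighs, but “integrating key parameters from numerous metrics and video layers” into a single per-second score can be pictured as a simple weighted combination. The layer names and weights below are assumptions for illustration only.

# Hypothetical sketch: fuse several per-second signal layers into one
# interest score per second; the layers and weights are assumptions.
WEIGHTS = {"motion": 0.3, "crowd_audio": 0.3, "social_mentions": 0.4}

def fuse_layers(layers, weights=WEIGHTS):
    """layers: dict mapping layer name -> list of per-second values in [0, 1]."""
    length = min(len(values) for values in layers.values())
    return [
        sum(weights[name] * layers[name][t] for name in weights)
        for t in range(length)
    ]

The fused scores could then feed a window-selection step like the one shown above, while the dashboard’s click-through rate is simply clicks divided by impressions for each published teaser.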

“Viewers can tell if something isn’t a real-time teaser, and they want to see something that is happening right now as they are much more interested in the current results,” Golan notes. “We have developed most of our tools in-house, but we also use other tools to help improve the system all the time.”

The system works by accessing the live stream from the client and analyzing it in real time. That analysis uses machine learning and AI to identify events within the live stream and combines those detections with other signals, such as social-media activity, to create smarter, more relevant clips.

“Every minute, we can generate a teaser clip from the previous minute and then stream it to wherever the user wants,” says Golan. “For example, viewers can enter a site and see a clip of the last minute or two and then be redirected to the live broadcast. Teasers can be extremely impactful.”
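Golan’s description suggests a rolling, minute-by-minute cadence. A minimal scheduling sketch might look like the following, where stream, extract_clip, and publish_teaser are hypothetical stand-ins for the ingest, analysis, and delivery pieces the article describes, not real Minute APIs.

import time

CLIP_SECONDS = 60  # look back over the previous minute, per Golan's description

def run_teaser_loop(stream, extract_clip, publish_teaser):
    """Every minute, cut a teaser from the last minute of the live stream."""
    while stream.is_live():
        segment = stream.last_seconds(CLIP_SECONDS)   # previous minute of video
        teaser = extract_clip(segment)                # pick the key moment
        publish_teaser(teaser, link_to=stream.url)    # clip links back to the live feed
        time.sleep(CLIP_SECONDS)                      # repeat every minute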

With AI systems like IBM Watson continuing to gain in popularity, Minute’s value proposition is designed to add another level to what those systems do: subjectivity.

“Machine learning can allow you to find objective things, but what we do is add in the subjective,” he explains. “What Watson and others do is amazing and incredible work, but they are looking for what machine learning can do.”

That is where the crowdsourcing comes in: objective information about a clip is tied to what is resonating with viewers and fans on social media. And sometimes the moment that resonates is not the obvious one.

“The most interesting moments in soccer are fights,” Golan observes. “An editor might say, ‘This is what I think the audience would like to see,’ but we bring the actual data so that viewers can see the four or five seconds they really want. We are always working on that data to improve all of the things we do, and we are always beta testing.”

The Minute platform also automatically tags content and creates logs that make it searchable by text, an important feature for production teams looking to create additional video elements.
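The article does not describe the log format, but text search over auto-generated tags could work along the lines of a simple inverted index from tags to timestamps. Everything below, including the TagLog class and the sample tags, is an illustrative assumption.

from collections import defaultdict

# Hypothetical sketch: an inverted index from auto-generated tags to the
# timestamps (in seconds) where they occur, so producers can search by text.
class TagLog:
    def __init__(self):
        self._index = defaultdict(list)

    def tag(self, timestamp, *tags):
        for t in tags:
            self._index[t.lower()].append(timestamp)

    def search(self, query):
        """Return timestamps whose tags contain every word in the query."""
        words = query.lower().split()
        hits = [set(self._index.get(w, [])) for w in words]
        return sorted(set.intersection(*hits)) if hits else []

log = TagLog()
log.tag(2712, "goal", "ronaldo")
log.tag(3105, "ronaldo", "foul")
print(log.search("ronaldo goal"))  # -> [2712]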

“With crowdsourcing, we know how users are interacting with video and what they think about it,” he says. “We use that information to create better consumption by analyzing the text, audio, and natural language that surrounds the video.”
