SVG Sit-Down: EVS’s Sébastien Verlaine on the Move to Web-Based Services

The broadcast-gear manufacturer focuses on an integrated hardware/software-based ecosystem

Sébastien Verlaine, head of marketing and communications, EVS, and the rest of the EVS international team were at IBC with an ecosystem of products and services, both hardware- and software-based, that are more integrated and connected than ever. He sat down with SVG during the show to discuss new offerings, the role of AI and automated production, and much more.

EVS’s Sébastien Verlaine: “It has always been our intention to help the people in the production team to focus more on the creativity and less on the cumbersome tasks.”

What is the key messaging for EVS this year?
Our VIA MAP, which is our Media Asset Platform, is an evolution in terms of our offerings. As you know, we’ve gone from hardware-product–oriented offerings to integrating more and more products, even with third-party systems, so that we could start offering solutions like LiveCeption and MediaCeption. And now we’ve gotten to the point where there’s even more integration between the different solutions and we’re starting to arrive at an ecosystem which we started to talk about a little bit at NAB.

But the word ecosystem can be interpreted in so many ways, and that’s why we felt that it was time to brand it a little bit so that it has more structure and more meaning in the medium to long term. That’s why we focused on the ecosystem around asset management.

How is that changing the way you approach products and services?
We’re arriving at something that’s more web-based and uses different applications that are more and more transversal and connecting the dots. That’s why we felt it was time to introduce what we call the VIA MAP as we bridge the gap between the production and the creation side with the distribution. In some cases, we’re reviewing the way we would develop in terms of silos and are presenting certain features that can be applied across the solutions.

For example, face recognition uses artificial intelligence that can also be used in our MediaCeption line for content management. And it can be used in the MediaHub for content exchange. There are fewer and fewer boundaries between products, and the MAP platform responds to those changes. This is something that certain customers are already taking advantage of, but it will continue to evolve with other platforms in the next few years.

A big challenge is that more and more people in the rights chain and the content-creation chain want access to content.
It’s amazing, and there is more content, and that is why AI-based features are important to allow content to be searched and retrieved more easily. Facial recognition is one, including VIP or celebrity recognition that uses databases in the cloud. That accelerates the process of finding, retrieving, repurposing, and delivering, and that is the whole purpose of that technology.

And then, there is speech-to-text AI, where transcripts are generated by AI and also translated into different languages. Finally, there is generative AI like ChatGPT, which allows media managers to make requests in a different way as they look to work with and deliver content.

AI is clearly a focus for a lot of companies at IBC. How do you see it becoming part of your product roadmap?
We have an Innovation Lab, which is a team of data scientists and computer-vision engineers that have been working on artificial intelligence and other technologies for over 10 years. We first started commercializing the integration of AI with our Xeebra video-assisted–referee product for offside-line technology and the calibration of the field. That used to take quite a lot of time, and, if something were to happen during the event, you might not be able to use the offside line efficiently. But now that’s done with artificial intelligence, so it speeds up the process, and, in the end, it’s even more reliable.

The second step was XtraMotion, which we introduced a couple of years ago for frame interpolation and being able to generate super-slow-motion content from any camera or video source, whether it’s an archive or a live feed.

But we want to be able to continue to develop certain features that will, on one hand, help facilitate the search and distribution of content and, at the same time, help creatives come up with some captivating content for their live storytelling. That has always been our intention: to help the people in the production team to focus more on the creativity and less on the cumbersome tasks.

Here at IBC, we are showing three features based on AI around replays. One is the cinematic effect: you can focus on a player and blur the background to make it more dramatic. Or we can do the opposite: take out the blur and make it a nice clear image. And there’s the zooming effect: you can use AI to take a wide shot and focus on a specific area and keep the image quality with the push of a button.
