IBC 2017

IBC Reflections: Adobe’s Laura Williams Argilla on the Future of VR/360 Postproduction, Power of Collaborative Workflows

Also, investment in AI aims to make information discoverable and tedious tasks speedier

Adobe used IBC 2017 to tease features and upgrades that will debut in Premiere Pro CC 2018. New tools previewed for VR/360-video production included Adobe Immersive Environment and the integration of recently acquired Mettle Skybox plug-ins. In addition, Adobe continues to boost collaborative workflows in Creative Cloud, adding the ability to have multiple projects open simultaneously, the continued maturation of its Team Projects cloud-based collaboration service, and the introduction of a Shared Projects onsite collaboration service. Adobe is fully aboard the artificial-intelligence bandwagon, adding features to Creative Cloud platforms powered by Adobe Sensei, the company’s AI and machine-learning framework.

SVG sat down with Laura Williams Argilla, director, product management, Adobe Creative Cloud Video, to discuss the forthcoming Premiere Pro CC features, what else users can expect when the latest version is rolled out next year, and how multi-site collaborative workflows are dramatically changing the way sports-production teams function.

Laura Williams Argilla, director, product management, Adobe Creative Cloud Video

What is the focus for Adobe in terms of the Creative Cloud?
Our theme at IBC this year is “Creating Time,” because, as we talk to our professional users and the industry at large, people are being asked to create more and more media in less and less time. While we have not quite figured out how to add more hours to the day, we can help solve these problems by providing new creative tools that allow [users] to do things that used to be expensive and time-consuming. So, in this release, we focused a lot on making things more seamless and more productive in innovative ways.

As VR and 360-video production continues to grow, what is Adobe adding into Premier — and the Creative Cloud platform overall — to serve these needs?
Thus far, VR has been a technology where the workflows aren’t quite there yet. In the past, [Creative Cloud] has been used as kind of a center for VR creation, so we’ve invested a lot in [VR functionality] during this [development] cycle.

Our acquisition of Mettle Skybox [VR/360 plug-ins] is now incorporated, and most of those features are integrated directly into the applications so that they behave the way customers expect Premiere to behave. That will make it a really seamless process for users: if you are a 2D editor using Premiere but you’re working on a VR project, it feels native to you; it doesn’t feel like it’s a completely foreign beast. And we think removing that friction point helps “create time.”

One of the other really exciting things that we’re showing is the Adobe Immersive Environment. This is one of my favorite things because, when you work with VR, most editors are viewing it on a flat screen and you’re seeing a “magic window” of what you would see through the [VR] headset but it’s not truly immersive. So editors end up having to put on a headset, see how it looks, take it off, work more on the project, and then repeat. It’s disorienting and easily distracting.

With the Adobe Immersive Environment, we’ve started to represent the editing environment inside the headset. It’s not full editing controls because we don’t think people want to edit in the headset, but it gives you visual cues about where you are in your project and allows you to start having that deeper connection with what you’re editing, not just what the experience is. That starts to make the whole process faster and make those decisions better.

We are really excited to show that on the floor because it’s the first of its kind, and I think it’s something people are really going to enjoy using. It transforms the way people edit VR content.

What else is new in the Adobe Premiere Pro reveal at IBC 2017?
We’ve made a ton of new investments in Premiere. One that people have been most excited about in this reveal is the ability to have multiple projects open simultaneously. Since the 1.0 version, Premiere has handled one open project at a time. And, while we’ve had methods to bring in content and sequences from other projects, it wasn’t as elegant as people were looking for. In this release, we’ve added the capacity to have multiple projects open at the same time, which means you can sift through your sequences and bins and get your content exactly the way you want it more quickly.

Team Projects also continues to improve and become an even more solid offering, and the “beta” tag will be shed sometime very soon, which is obviously very exciting for us. Some of our beta customers have been college sports organizations, and we see a real future there. It gives you more visibility into who is editing with you, and it improves the version-control [function] to make it more visible right in the UI.

We are also adding the ability to do Shared Projects. So, instead of Team Projects, where it’s cloud-based editing, these are Shared Projects within an onsite environment. Shared Projects allows you to lock a project when you’re working on it, then unlock it to notify someone else they can work on that project. This is how we see people working in serial collaboration, specifically in Hollywood; it’s a big feature request there, so we’re really excited to roll that out.

In the half decade Creative Cloud has been around, Adobe has dramatically boosted production teams’ multi-site collaborative-workflow capabilities. How are you seeing collaborative workflows evolve in the sports-production market specifically?
It really comes down to how your organization works, and, let’s face it, sports organizations are often large and have to create a lot of content from different [locations]. Some organizations aren’t comfortable having cloud-connected systems like Team Projects, and, for those, the serial collaboration of Shared Projects will work beautifully. But, for most people in sports, I think the elegance and fluidity of Team Projects is a huge win. The ability to have collaborators using not only Premiere but also After Effects and Prelude makes media production faster and more seamless. With notifications about who has completed their work and what’s been done in each copy, it can make even remote teams work more effectively together. And, when turnaround times are tight, like they always are in sports, that can really be the difference.

One of the big themes at IBC this year is the rise of artificial intelligence and machine learning. How is Adobe leveraging its Sensei Platform to integrate AI into Creative Cloud?
Adobe’s making a big investment in using artificial intelligence to make information discoverable, make tedious tasks easier, improve the user experience, and give users the power to do more in less time. That is playing out here at the show in a couple of different ways.

We’ve added an auto-ducking feature in Audition, which automatically lowers the volume of your audio bed wherever there is voiceover. One of the other places [Sensei] is showing up is in Character Animator, a tool that allows you to quickly and easily create 2D animation. The Simpsons used it recently on an episode where Homer Simpson was answering live questions from the audience at the end. We have added the ability to automatically respond to face-tracking information to get mouth shapes and lip sync correct.