SVG Tech Insight: Immersive Media Experiences With Intel True View Delivers New Reality for Sports
This fall SVG will be presenting a series of White Papers covering the latest advancements and trends in sports-production technology. The full series of SVG’s Tech Insight White Papers can be found in the SVG Fall SportsTech Journal.
With advancements in volumetric media technology, Intel Sports is paving the way to deliver unique and compelling immersive media experiences to sports fans. By harnessing the power of volumetric video, Intel Sports provides leagues, teams, and broadcasters with new storytelling capabilities to engage fans through interactivity, personalization, and unbounded perspectives of the game.
Stadiums filled with fans sitting side by side, watching their favorite players and following the play up close, were once commonplace, but that fan experience is now vulnerable due to the coronavirus pandemic. With fans at home, social platforms have seen higher engagement, and demand for streaming content has increased as fans look to stay informed and connected while they eagerly await sports’ full return.
Intel Sports, with its Intel True View Platform, enables fans anywhere to experience the game from a new perspective and allows broadcasters and teams to run safe, large-scale remote productions, meeting the unexpected challenges of an unprecedented time and evolving fan behavior.
Experiences enabled by volumetric capture
Immersive media is a form of media that includes non-traditional media formats such as 360-degree video, virtual reality (VR), augmented reality (AR), mixed reality (MR), and other emerging technology platforms.
The Intel True View Platform enables sports fans to choose where and how they want to consume the next generation of sports content, including options like three degrees of freedom (3DoF), three degrees of freedom with some limited head movement (3DoF+), or full six degrees of freedom (6DoF) immersive media experiences.
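The practical difference between these tiers is which components of the viewer's pose may change. A minimal sketch (the class and function names here are illustrative, not Intel's API): rotation is always free, while translation is discarded for 3DoF, confined to a small head-movement box for 3DoF+, and fully free for 6DoF.

```python
from dataclasses import dataclass

@dataclass
class ViewerPose:
    # Rotation in degrees, translation in meters
    yaw: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

def clamp_pose(pose: ViewerPose, mode: str, head_box: float = 0.5) -> ViewerPose:
    """Constrain a requested viewer pose to what the chosen mode allows."""
    if mode == "3dof":
        # Rotation only: any requested translation is discarded
        return ViewerPose(pose.yaw, pose.pitch, pose.roll)
    if mode == "3dof+":
        # Rotation plus limited head movement inside a +/- head_box meter box
        clamp = lambda v: max(-head_box, min(head_box, v))
        return ViewerPose(pose.yaw, pose.pitch, pose.roll,
                          clamp(pose.x), clamp(pose.y), clamp(pose.z))
    return pose  # 6DoF: move freely through the volumetric scene

requested = ViewerPose(yaw=90.0, pitch=10.0, roll=0.0, x=3.0, y=0.2, z=-4.0)
print(clamp_pose(requested, "3dof"))
print(clamp_pose(requested, "3dof+"))
print(clamp_pose(requested, "6dof"))
```

The same requested pose produces three different permitted poses, which is exactly what lets one volumetric model serve all three experience tiers.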
Powered by Intel True View technology, a multitude of immersive media experiences can be created using a volumetric model of action happening on the field.
- Enhanced storytelling with virtual cameras: Virtual cameras can follow the ball or players of interest and view the action from any point of view, create virtual sky-cams from any location on the field, or watch the action from a referee’s perspective.
- Create AR experiences: Mobile devices (phones and tablets) can project the on-field action onto a tabletop and allow a user to navigate around the game action.
- Create VR experiences: VR experiences can be created for users ranging from 3DoF to 6DoF, placing users virtually in the middle of the action.
- Enable influencers to tell new stories: Influencers (players, commentators, analysts, etc.) can view the game with unlimited camera angles and unique perspectives to tell new stories to their fan base.
To create these experiences, Intel Sports uses a combination of cloud technologies and traditional storytelling techniques in a remote setting away from the venue.
Execution, Creation, and Delivery of Immersive Media Experiences
The Intel True View Platform and workflow start with a venue capture system composed of a camera array built into the perimeter of the stadium, as shown in Figure 1. High-resolution cameras are angled to capture the entire field of play, and the camera array is connected by fiber to dedicated on-site Intel Xeon-based servers. The data is then uploaded to a media processing pipeline that stores, synchronizes, analyzes, and processes the data in the cloud.
The core immersive media processing and experiences pipeline components, shown in the Intel True View Platform end-to-end workflow in Figure 2, are all hosted in the cloud. These cloud workloads generate massive amounts of volumetric data (up to 200 terabytes of raw data per match) in the form of voxels, which capture height, width, depth, and the related attributes needed for point-cloud formation. The point cloud is then rendered into high-fidelity 3D video in the form of virtual camera streams.
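The voxel-to-point-cloud step can be pictured with a toy model (this is an illustration of the general technique, not Intel's actual data format): each occupied voxel contributes one point, positioned by its grid indices scaled by the voxel size and carrying attributes such as color.

```python
import numpy as np

def voxels_to_point_cloud(occupancy: np.ndarray,
                          colors: np.ndarray,
                          voxel_size: float = 0.01) -> np.ndarray:
    """Convert an occupancy grid plus per-voxel colors into an (N, 6) point cloud.

    occupancy: (H, W, D) boolean grid of which voxels are filled
    colors:    (H, W, D, 3) per-voxel RGB attributes
    Returns rows of [x, y, z, r, g, b], one per occupied voxel.
    """
    idx = np.argwhere(occupancy)        # (N, 3) integer indices of filled voxels
    xyz = (idx + 0.5) * voxel_size      # voxel centers in meters
    rgb = colors[occupancy]             # (N, 3) attributes for those voxels
    return np.hstack([xyz, rgb])

# Tiny 4x4x4 grid with two occupied voxels
occ = np.zeros((4, 4, 4), dtype=bool)
occ[0, 0, 0] = occ[3, 2, 1] = True
col = np.zeros((4, 4, 4, 3))
col[0, 0, 0] = (255, 0, 0)   # red voxel
col[3, 2, 1] = (0, 255, 0)   # green voxel

cloud = voxels_to_point_cloud(occ, col)
print(cloud.shape)  # (2, 6)
```

At match scale this same mapping runs over billions of voxels per frame, which is where the hundreds-of-terabytes figure comes from.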
In the remote production site, a variety of production tools allow creative producers, broadcasters, and the Intel Sports production team to create volumetric content focused on the most action-packed or analysis-worthy parts of the game. The virtual camera tool allows a producer to define and place a variety of virtual cameras throughout the field, including stationary, rail, and tracking cameras. Tracking cameras can follow objects such as the ball or players during the live game, while traditional production tools continue to allow producers to curate and enhance content with graphics, audio, and telestration, as is common practice when creating highly produced content.
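The geometry behind these camera types is straightforward. A minimal sketch (function names and field coordinates are my own, for illustration): a rail camera interpolates its position along a fixed path, and a tracking camera computes the yaw and pitch that aim it at a moving target such as the ball.

```python
import math

def rail_position(start, end, t):
    """Position on a rail camera's path, t in [0, 1] (linear interpolation)."""
    return tuple(s + t * (e - s) for s, e in zip(start, end))

def look_at_yaw_pitch(cam, target):
    """Yaw and pitch (degrees) that point a camera at a target position."""
    dx, dy, dz = (t - c for t, c in zip(target, cam))
    yaw = math.degrees(math.atan2(dx, dz))
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return yaw, pitch

ball = (50.0, 1.0, 30.0)   # tracked object position, field coordinates in meters
cam = rail_position((0.0, 10.0, 0.0), (100.0, 10.0, 0.0), 0.5)  # mid-rail
yaw, pitch = look_at_yaw_pitch(cam, ball)
print(cam, round(yaw, 1), round(pitch, 1))
```

A stationary camera is the degenerate case (fixed position, fixed or tracked orientation); re-evaluating these two functions every frame against live tracking data is what makes the virtual sky-cam or referee's-eye view possible without any physical rig.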
Back in the cloud, the video renderer creates and renders the series of images based on the virtual cameras defined by the virtual camera tool operator. These virtual camera streams are then sent to the video encoder software, which converts the uncompressed videos into a compressed digital video format. The system is designed to support the most common industry-standard video codecs (H.264/AVC, H.265/HEVC, and M-JPEG) along with the AAC audio codec, in order to support a wide range of client platforms and devices. The stream packager takes the encoded videos, packages them into consumable bitstreams, and prepares the content for live streaming using the Transport Stream (TS) and MP4 file formats for HLS (HTTP Live Streaming) distribution.
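The packager's HLS output boils down to a playlist that lists encoded segments with their durations. A simplified sketch of that final artifact (segment names and durations are invented for illustration; a production packager would also emit a master playlist with multiple bitrate variants):

```python
def make_hls_playlist(segments, target_duration=6):
    """Build an HLS media playlist (.m3u8) for a list of encoded segments.

    segments: list of (filename, duration_seconds) tuples, in play order.
    """
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",   # max segment duration
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for name, duration in segments:
        lines.append(f"#EXTINF:{duration:.3f},")       # this segment's duration
        lines.append(name)
    lines.append("#EXT-X-ENDLIST")  # marks a finished (VOD) stream; omitted for live
    return "\n".join(lines)

playlist = make_hls_playlist([("replay_000.ts", 6.0), ("replay_001.ts", 4.2)])
print(playlist)
```

For a live event the packager appends new `#EXTINF` entries as segments are encoded and withholds `#EXT-X-ENDLIST`, which is how clients know to keep polling the playlist.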
To enable business partners to control which video streams are output to immersive media streaming applications, the platform provides a content packaging tool and a content management system (CMS). The content packaging tool is part of the final leg of creative content creation, where content can be selected, organized, and assembled into a final game package. The CMS controls when content is made available to clients: it stores the location of the content distribution network (CDN) streams and manages the output streams shown to end users. The CDN serves as the distribution component of the pipeline, taking game packages and streaming the content to client players.
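The CMS's gating role can be sketched as a small catalog that maps each game package to its CDN location and only resolves that location once a partner-configured publish time has passed (class and field names here are illustrative, not Intel's schema):

```python
from datetime import datetime, timezone

class ContentCatalog:
    """Toy CMS: stores CDN stream locations and gates when they become visible."""

    def __init__(self):
        self._entries = {}  # package_id -> (cdn_url, publish_at)

    def register(self, package_id, cdn_url, publish_at):
        self._entries[package_id] = (cdn_url, publish_at)

    def resolve(self, package_id, now=None):
        """Return the CDN stream URL, or None if the package is not yet published."""
        now = now or datetime.now(timezone.utc)
        cdn_url, publish_at = self._entries[package_id]
        return cdn_url if now >= publish_at else None

cms = ContentCatalog()
cms.register("game42_highlight",
             "https://cdn.example.com/game42/master.m3u8",
             publish_at=datetime(2020, 10, 1, tzinfo=timezone.utc))

# Before publish time: hidden from client applications
print(cms.resolve("game42_highlight", now=datetime(2020, 9, 30, tzinfo=timezone.utc)))
# After publish time: the CDN stream is exposed
print(cms.resolve("game42_highlight", now=datetime(2020, 10, 2, tzinfo=timezone.utc)))
```

The same lookup is also the natural place to enforce per-partner entitlements, since the CMS already sits between the CDN and every client request for a stream location.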
On the client side, the stream de-packager reads the consumable bitstreams and converts them into a format the decoder can understand. The client player’s video decoder then decompresses the video into a series of images, and the video renderer presents those images sequentially as what is seen on an end user’s device. Finally, the client application can be designed to provide a variety of experiences enabled by SDKs.
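The client chain described above is a straight pipeline of three stages. A schematic sketch, with stand-in stage functions rather than a real player implementation (the framing byte and labels are invented for illustration):

```python
def depackage(bitstream: bytes) -> list:
    """Split the consumable bitstream back into encoded access units."""
    return bitstream.split(b"|")  # toy framing; real TS/MP4 parsing is far richer

def decode(access_units: list) -> list:
    """Decompress each access unit into an 'image' (here, just a label)."""
    return [au.decode() + "_frame" for au in access_units]

def render(frames: list) -> list:
    """Present frames in sequence on the end user's device."""
    return [f"displayed {f}" for f in frames]

stream = b"au0|au1|au2"
print(render(decode(depackage(stream))))
# -> ['displayed au0_frame', 'displayed au1_frame', 'displayed au2_frame']
```

Keeping the stages separate is what lets an SDK swap the final render stage (flat video, AR tabletop, or VR headset) without touching de-packaging or decoding.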
Leading a new media format
As fans hunger for the return of their favorite teams and players, the demand for sports and sports-related content has never been greater. The Intel True View Platform not only meets the needs of fans and storytellers during the coronavirus pandemic, but also sets the stage for a new reality of how sports content will be consumed now and in the future. Applying the latest technical advances in volumetric content creation and deep knowledge of operating and deploying large-scale systems, the platform enables fans to be virtually present in the stadium or on the field, alongside their favorite teams and players. Broadcasters, teams, and production crews can perform large-scale production and deliver unique content and storytelling via virtual cameras capturing perspectives that physical cameras cannot, all safely, without needing to send camera crews into the venue.
The Intel True View Platform empowers producers and content creators to focus on unique creative storytelling with new immersive media experiences that put their fans right in the center of the game.