Epic Games Releases ‘Hype Chamber’ Sample for Unreal Engine
From wowing sports fans in stadiums to powering fantastical arenas for fully virtual events, real-time technology is increasingly being used to enhance broadcast and live event content. That makes it the perfect time to announce the release of a new Unreal Engine sample for broadcast and live events — The Hype Chamber.
Developed as part of a reimagining of the Rocket League Championship Series (RLCS), the sample illustrates how to design, develop and play out numerous animation elements for an esports show using advanced Blueprint and data table workflows. By diving under the hood, artists will learn how to switch 3D models, textures, materials and lighting, all through a single Blueprint controller. The sample includes several motion graphics animations that have been designed to be played out live or as pre-rendered content.
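The sample itself implements this with Blueprints and Unreal data tables, but the core idea — a single controller reading one row of a data table and swapping every themed element at once — can be sketched in plain Python. All names here (row fields, asset paths, `apply_theme`) are hypothetical stand-ins, not names from the actual sample:

```python
from dataclasses import dataclass

# Hypothetical stand-in for one row of an Unreal data table:
# each team maps to the asset references the controller should apply.
@dataclass(frozen=True)
class TeamThemeRow:
    primary_color: str    # hex color applied to materials and lighting
    logo_texture: str     # texture asset reference for the LED walls
    hero_mesh: str        # 3D model variant displayed in the chamber

# The "data table": one row per team, editable from a central location.
THEME_TABLE = {
    "TeamA": TeamThemeRow("#FF6A00", "T_TeamA_Logo", "SM_Octane_TeamA"),
    "TeamB": TeamThemeRow("#0055FF", "T_TeamB_Logo", "SM_Octane_TeamB"),
}

def apply_theme(team_id: str) -> dict:
    """Single controller entry point: look up the team's row and
    return every scene parameter that should change together."""
    row = THEME_TABLE[team_id]
    return {
        "material_tint": row.primary_color,
        "led_wall_texture": row.logo_texture,
        "hero_mesh": row.hero_mesh,
    }

print(apply_theme("TeamA"))
```

The point of the pattern is that adding a new team means adding one row, not re-authoring models, textures, materials and lighting individually — which is what lets the sample drive so many combinations from one controller.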
Back in 2020, Rocket League developer Psyonix announced a new format for the 10th season of RLCS, known as RLCS X. This format did away with league play in favor of teams earning points through three regional splits, all culminating in three seasonal majors.
With the tournament growing in scope year over year, Psyonix had started to face some complex challenges. That drove the team to experiment with a real-time approach to create the broadcasts for the new season.
“Working with Lamborghini, Ford, Verizon, Pelé, X Games and others, we didn’t want to just slap a logo on the broadcast and call it a day,” says Cory Lanier, esports product manager at Psyonix. “We rebuilt each broadcast package for all of these shows and were running into workflow issues on creating new assets about every two weeks.”
In addition to having various sponsors and themes to work with, the open format of the league means there are usually new teams competing weekly. In traditional pipelines, this would either mean significant additional design and rendering work each time a new team joined, or newly added teams would receive less bespoke treatment than the established ones.
The answer: a flexible, real-time broadcast graphics package brought to life through a collaboration between Psyonix, the Unreal Engine team, Capacity Studios and ESL Gaming.
“With the way we’ve set up the Hype Chamber in Unreal Engine, we can quickly swap out logos and modify color palettes, and instantly have new high-quality assets ready to go for that weekend’s broadcast,” says Ellerey Gave, executive creative director at Capacity Studios.
This has made all the difference to Psyonix, enabling them to handle complexity in a way that wasn’t previously possible. “Now, because of how we’ve templated out everything, anyone on my team—graphic designer or not—can go in, download the build, transition between over 1,000 combinations of assets, and create high-fidelity broadcasts,” says Lanier. “This is huge for us, because we’d change out colors and assets constantly last year—so doing this all in real time as opposed to hard-baking in every single graphic element we create is super efficient.”
The Hype Chamber started out as a motion graphics package that was used to introduce the RLCS teams and matches. A virtual space created within Unreal Engine, the concept was designed as a launching point to reimagine what a sports broadcast might look like for a digital-first audience.
Very quickly, the Rocket League Esports team wanted to take the concept of the Hype Chamber and build it for real—enabling the teams to play their matches inside the virtual environment. This led to the development of a physical studio space that uses real LED screens fed by outputs from the Unreal Engine scene to recreate the environment on stage. Teams are stationed on either side in front of a life-sized in-game Octane, which is often sporting a team skin, with the hallway out to the field in between.
With a virtual Hype Chamber created in Unreal Engine and a physical set built, the team just had to marry the two, says Benji Thiem, creative director at Capacity Studios. “Since we developed an entire scene that exists in 360 degrees, we were able to map the portions of the space we wanted to feature onto a set of LED screens, creating a dynamic backdrop for the live event, which already had much of the functionality for team customization built in,” he says. “We further expanded on this package by including custom graphics, as well as a toolkit of video loops that could drive other smaller screens in the space.”
The side-by-side curved screens are skinned with their respective team colors, creating an environmental takeover of team branding. This element flows right into the start of the game, with the camera rotating 180 degrees around to fly out the tunnel that leads to the arena.
The Hype Chamber is also fully decked out with elements specific to the team being highlighted in the broadcast, together with LED wall-based text interstitials and victory callouts, as well as a suite of backgrounds for screen-level content to carry the aesthetic throughout.
One of the coolest deliverables the Hype Chamber enables is branded sponsor moments, says Gave.
“Rather than the typical sponsor logo sitting over a generic background, which is prevalent across all sports broadcasts, we’re able to flood the LED walls with sponsor colors and skin the iconic Rocket League Octane car with sponsor decals,” he explains. “Or, when it’s an automotive sponsor, we’re able to swap the actual feature vehicles into the Hype Chamber scene which, either way, creates an extremely impactful co-branded moment for all involved, without breaking the high-quality, immersive flow of the broadcast.”
What does all this mean for aficionados of RLCS (and esports audiences in general)? For Jasveer Sidhu, art director at Capacity Studios, the new studio brings fans and the game they love so much closer together. “The existing paradigm for esports broadcasts uses a back-and-forth between in-game footage and more traditional 2D screens, and live action can sometimes feel disconnected,” he says. “The Hype Chamber allows for viewers to feel more immersed by creating a transition space between the real and virtual worlds, where the sport and content of the broadcast are meant to complement each other.”
Gave points out that the virtual Hype Chamber environment can also be leveraged for a multitude of use cases beyond the broadcast itself. “It enables us to create immersive team spaces, which act as a showroom for team car decals that are available for fans to buy and use within Rocket League,” he says. “The setup also has a life outside of the broadcast, acting as a way to promote team decals on social and tie the whole experience together.”
A fan can watch a stream of their favorite RLCS team playing with a cool car decal, then hop on to the game and play with that same asset. It’s the perfect digital handshake—one that points to the types of cross-media experience that will become commonplace in the metaverse era.
The Hype Chamber brings a number of Unreal Engine features to bear alongside the Blueprint visual scripting system. The ability to import Cinema 4D files natively through Datasmith is a key element in serving up the visuals. Sequencer, Unreal Engine’s built-in nonlinear animation editor, plays a key role in helping to design each layout and animation. In addition, the Remote Control API enables control applications to drive the content live during the broadcasts.
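The Remote Control feature exposes an HTTP API (by default on the editor's local Remote Control server) that lets external applications call functions on objects in a running scene. As a rough sketch of the payload shape used by a call such as `PUT /remote/object/call` — where the object path and function name below are hypothetical, not from the actual sample — a control application might build a request like this:

```python
import json

def build_remote_call(object_path: str, function_name: str, parameters: dict) -> dict:
    """Build the JSON body for a Remote Control object-function call."""
    return {
        "objectPath": object_path,       # path to the target object in the level
        "functionName": function_name,   # Blueprint/UFunction to invoke
        "parameters": parameters,        # arguments passed to that function
    }

# Hypothetical controller object and function for swapping the featured team.
payload = build_remote_call(
    "/Game/HypeChamber/BP_ThemeController.BP_ThemeController_C_0",
    "SetTeamTheme",
    {"TeamId": "TeamA"},
)

# In a live setup this body would be sent to the editor's Remote Control
# HTTP server, e.g. PUT http://localhost:30010/remote/object/call
print(json.dumps(payload, indent=2))
```

Because the call targets a single controller function, the same request shape can drive any of the themed transitions live during a broadcast without touching the engine directly.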
Visual features like real-time ray tracing enable the LED display screens to envelop the entire scene in indirect lighting. The team also took advantage of data tables to create a systematic theme designer tool, enabling designers to quickly iterate on and modify several artistic elements, such as team color schemes and branding, from a central location.
“We successfully managed the look development and imagery of more than 60 teams using this method, while retaining the extensibility to add more as needed,” says Warren Drones, a senior product specialist at Epic Games who assisted on the project.
Using Unreal Engine, the team was able to scale from on-air graphics, to video walls, to complex XR stages, all within one software package. “Broadcasts and live events typically use some form of software to enable playback of live 2D graphics,” says Sidhu. “In our opinion, Unreal Engine is the only software solution that enables a high-fidelity 3D visual experience at real-time playback that is highly programmatic and extendable for adapting to existing broadcast pipelines, yet still artist-friendly in nature.”
For Gave, the ability to use real-time technology to produce broadcasts and live events has been an eye-opener. “We’ve been able to experience firsthand how adaptive and responsive the creative process can be using a real-time approach, which is absolutely a breath of fresh air compared to more traditional pipelines we’ve been a part of,” he says.
Now that the Hype Chamber sample is available for the community to experiment with, anybody can explore how Unreal Engine can be used as a motion graphics or broadcasting tool for a multitude of real-time graphics needs.
“It’s super exciting to make the setup we created for RLCS available for folks to check out and take a look under the hood at how everything works,” says Gave. “As an animation and motion graphics studio, the real-time workflow has been a game changer for our pipeline, and we fully believe it’s the future, so we’re hoping that seeing it in action will help demystify the process for others.”