GDC 2023: Epic Games Launches Unreal Editor for Fortnite, New Animation and World-building Tools

Epic Games is set to showcase its Unreal Editor for Fortnite, along with new animation and world-building tools, at this year's GDC.

Unreal Editor for Fortnite now in Public Beta

Since 2018, players have been able to create their own islands in Fortnite thanks to the Fortnite Creative toolset. Today, there are more than one million of these islands, and over 40% of player time in Fortnite is spent in them.

What if creators and developers had more powerful tools and greater creative flexibility to reach Fortnite’s huge audience of more than 500 million player accounts? That becomes possible with Unreal Editor for Fortnite (UEFN), launched in Public Beta at the State of Unreal during GDC 2023.

UEFN is a version of Unreal Editor that can create and publish experiences directly to Fortnite. With many of Unreal Engine 5’s features at the user’s fingertips, creators and developers have a whole world of new creative options for producing games and experiences that can be enjoyed by millions of Fortnite players.

What’s more, UEFN provides an opportunity to use the new programming language Verse for the first time. Aimed at getting UEFN creators up and running with the ability to script alongside existing Fortnite Creative tools, Verse offers customization capabilities such as manipulating or chaining together devices and the ability to easily create new game logic. Verse is being designed as a programming language for the metaverse, with upcoming features to enable future scalability to vast open worlds built by millions of creators for billions of players. Verse is launching in Fortnite today, and will come to all Unreal Engine users a couple years down the road.
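
As a taste of what this looks like in practice, below is a minimal sketch of a Verse device in the style of Epic's UEFN samples. The class name and callback are invented for the example; the pattern of referencing an existing Creative device with @editable and subscribing to its event is the documented way to chain devices together with new game logic.

```verse
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# A minimal sketch of a Verse device: it references an existing Creative
# button device and runs custom logic each time a player presses it.
button_logger := class(creative_device):

    # Reference to a button device placed on the island, assigned in the UEFN editor.
    @editable
    Button : button_device = button_device{}

    OnBegin<override>()<suspends> : void =
        # Chain onto the button: call OnPressed whenever it is interacted with.
        Button.InteractedWithEvent.Subscribe(OnPressed)

    OnPressed(Agent : agent) : void =
        Print("A player pressed the button.")
```

The same Subscribe pattern extends to other Creative devices, letting creators wire existing devices together with custom game logic.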

At the State of Unreal, Epic also showed a brand-new GDC demo that tests the limits of what can be built with UEFN today, showcasing key features including Niagara, Verse, Sequencer, Control Rig, custom assets, existing Creative devices, and custom audio.

The demo includes three key parts: an opening section highlighting how to enhance existing Fortnite Creative devices using Verse, a deeper dive into the editor, and an exciting boss battle to close out the segment. The demo was shown running live on Fortnite servers and played on PC.

Creator Economy 2.0

UEFN is being launched alongside Creator Economy 2.0, and in particular, engagement payouts—a new way for eligible Fortnite island creators to receive money based on engagement with their published content.

Engagement payouts proportionally distribute 40% of the net revenue from Fortnite’s Item Shop and most real-money Fortnite purchases to the creators of eligible islands and experiences, covering both islands from independent creators and Epic’s own experiences such as Battle Royale.
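
To make the arithmetic concrete, here is a hypothetical sketch in Verse. Epic has not published the engagement formula; the function name, parameters, and mechanics below are invented purely for illustration.

```verse
# Hypothetical illustration only: Epic's actual engagement metric and payout
# mechanics are not public. Names and mechanics here are invented.
ComputePayout(NetRevenue : float, EngagementShare : float) : float =
    Pool : float = NetRevenue * 0.4    # 40% of net revenue funds the payout pool
    Pool * EngagementShare             # paid out in proportion to engagement
```

Under this reading, an island responsible for 1% of eligible engagement would receive 1% of the pool.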

A unified 3D marketplace

In the old days, every game developer built all of the content in their product from the ground up. Increasingly, content marketplaces such as Unreal Engine Marketplace and Unity’s Asset Store have provided huge libraries of content which game developers can license from independent content creators and use in their games.

This enables small teams to produce games more quickly and with a lower budget, while funding the growth of independent content creators distributing their work through marketplaces.

This trend may grow significantly as creators of experiences across Fortnite, Roblox, Minecraft, and other 3D worlds look to these marketplaces as sources of metaverse content, and as players increasingly build out and customize their own 3D spaces online.

Later this year, Epic will bring together the Unreal Engine Marketplace, Sketchfab, Quixel Bridge, and the ArtStation Marketplace to launch Fab: a unified marketplace where creators can find, publish, and share digital assets for use in creating digital experiences.

Fab will bring together a massive community in which creators earn an 88% revenue share. The marketplace will host all types of digital content, including 3D models, materials, sound, VFX, and digital humans, and will support all engines, metaverse-inspired games that accept imported content, and the most popular digital content creation packages.

A peek at what’s coming in Unreal Engine 5.2

Unreal Engine is the backbone of the Epic ecosystem. The launch of UE5 last spring put even more creative power in the hands of developers. Since that release, 77% of users have moved over to Unreal Engine 5. Unreal Engine 5.2 offers further refinement and optimizations alongside several new features.

Electric Dreams is a real-time demonstration that showcases new features and updates coming as part of the 5.2 Preview release, available today via the Epic Games Launcher and GitHub.

In the live demo at the State of Unreal, a photorealistic Rivian R1T all-electric truck crawls off-road through a lifelike environment densely populated with trees and lush vegetation built with Quixel Megascans, and towering, craggy rock structures built from Quixel MegaAssemblies.

The R1T’s distinct exterior comes to life in the demo thanks to the new Substrate shading system, which gives artists the freedom to compose and layer different shading models to achieve levels of fidelity and quality not possible before in real time. Substrate is shipping in 5.2 as Experimental.

The state-of-the-art R1T showcases the latest technology and vehicle physics, with the truck’s digi-double exhibiting precise tire deformation as it bounds over rocks and roots, and true-to-life independent air suspension that softens as it splashes through mud and puddles with realistic fluid simulation and water rendering.

In addition, the large open-world environment is built using artist-created procedural tools that expand on a small, hand-placed environment where the adventure begins. Shipping as Experimental in 5.2, new in-editor and runtime Procedural Content Generation (PCG) tools enable artists to define rules and parameters to quickly populate expansive, highly detailed spaces that remain art-directable and work seamlessly with hand-placed content.
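
The PCG framework itself is graph-based and lives in the editor rather than in script, but the rules-and-parameters idea can be sketched conceptually. The snippet below is an illustration in Verse, not the PCG API; the function and its parameters are invented for the example.

```verse
using { /Verse.org/Random }
using { /UnrealEngine.com/Temporary/SpatialMath }

# Conceptual sketch only, not the UE5.2 PCG API: scatter candidate points
# across a square region and keep each with probability Density, the way a
# simple placement rule might. Count, Extent, and Density are invented here.
MakeScatterPoints(Count : int, Extent : float, Density : float) : []vector3 =
    for (I := 0..Count - 1, GetRandomFloat(0.0, 1.0) <= Density):
        vector3{X := GetRandomFloat(-Extent, Extent),
                Y := GetRandomFloat(-Extent, Extent),
                Z := 0.0}
```

In the real PCG tools, rules like this are authored as graph nodes whose parameters artists can tune, which is what keeps the results art-directable.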

MetaHuman Animator: high-fidelity performance capture

Photorealistic digital humans need high-quality animation to give truly believable performances. However, the expertise and time required to create such animation has previously been a challenge for even the most skilled creators.

That’s all going to change this summer with the launch of MetaHuman Animator—a new feature set for the MetaHuman framework. MetaHuman Animator will enable the user to reproduce any facial performance as high-fidelity animation on MetaHuman characters.

Simply use an iPhone or a stereo helmet-mounted camera to capture the individuality, realism, and fidelity of a performance, transferring its detail and nuance onto any MetaHuman.

Offering a faster, simpler workflow that anyone can pick up and use—regardless of animation experience—MetaHuman Animator can produce the quality of facial animation required by AAA game developers and Hollywood filmmakers, while at the same time being accessible to indie studios and even hobbyists.
