Unreal Engine 4.21 Released

What’s New

Unreal Engine 4.21 continues our relentless pursuit of greater efficiency, performance, and stability for every project on any platform. We made it easier to work smarter and create faster because we want your imagination to be the only limit when using our tools. And we battle-tested the engine on every platform until it met our developers’ high standards so your project will shine once it is ready for the masses.

We are always looking for ways to streamline everyday tasks so developers can focus on creating meaningful, exciting, and engaging experiences. Our industry-leading Niagara effects toolset is now even more powerful and easier to use, enabling you to dream up the next generation of real-time visual effects. You can build multiplayer experiences on a scale not previously possible using the now production-ready Replication Graph functionality. Iterate faster thanks to optimizations with up to a 60% speed increase when cooking content, run automated tests to find issues using the new Gauntlet automation framework, and speed up your day-to-day workflows with usability improvements to the Animation system, Blueprint Visual Scripting, Sequencer, and more.

We strive to make it possible for your creations to be enjoyed as you intended by everyone, everywhere regardless of the form factor they choose. Building on the previous release, we have added even more optimizations developed for Fortnite on Android and iOS to further improve the process for developing for mobile devices. Available in Early Access, Pixel Streaming opens a whole new avenue to deploy apps in a web browser with no barrier to entry and no compromise on rendering quality. We have also improved support for Linux as well as augmented, virtual, and mixed reality devices.

In addition to all of the updates from Epic, this release includes 121 improvements submitted by the incredible community of Unreal Engine developers on GitHub! Thanks to each of these contributors to Unreal Engine 4.21:

Aaron Franke (aaronfranke), Adam Rehn (adamrehn), Adrian Courrèges (acourreges), aladenberger, Alan Liu (PicaroonX), Cengiz Terzibas (yaakuro), Cerwi, Chris Conway (Koderz), cmp-, Damianno19, Deep Silver Dambuster Studios (DSDambuster), Dorgon Chang (dorgonman), DSCriErr, Dzuelu, Eliot (BillEliot), Erik Dubbelboer (erikdubbelboer), fieldsJacksonG, Franco Pulido (francoap), Frank Stähr (phisigma), George Erfesoglou (nonlin), Hao Wang (haowang1013), Henri Hyyryläinen (hhyyrylainen), Homer D Xing (homerhsing), IlinAleksey, Jacob Nelson (JacobNelsonGames), Jerry Richards (r2d2Proton), Jesse Fish (erebuswolf), Josef Gluyas (Josef-CL), Joshua Harrington (thejhnz), Kalle Hämäläinen (kallehamalainen), KelbyG, Layla (aylaylay), LizardThief, Lucas Wall (lucaswall), Mai Lavelle (maiself), malavon, Marat Radchenko (slonopotamus), Marat Yakupov (moadib), Marco Antonio Alvarez (surakin), Markus Breyer (pluranium), marshal-it, Martin Gerhardy (mgerhardy), Mathias Hübscher (user37337), Michael Kösel (TheCodez), Morva Kristóf (KristofMorva), Muhammad A. Moniem (mamoniem), Nick Edwards (nedwardsnae), nrd2001, Oliver (oliverzx), phoenxin, projectgheist, Rene Rivera (grafikrobot), Rick Yorgason (Skrapion), Riley Labrecque (rlabrecque), Sahil Dhanju (Vatyx), Sam Bonifacio (Acren), scahp, Sébastien Rombauts (SRombauts), Tom Kneiphof (tomix1024), Troy Tavis (ttavis), Truthkey, UristMcRainmaker, Wiktor Lawski (wlawski), yhase7, Zeno Ahn (zenoengine)

Major Features

Niagara Platform Support and Usability Improvements

In our continuing effort to provide industry-leading effects tools, Niagara has received an expanded feature set, substantial quality of life improvements, and Niagara effects are now supported on Nintendo Switch!

GPU-Only Texture Sampling in Niagara

You can now sample a 2D texture or a pseudo-volume 2D texture in your particle scripts! Create amazing effects such as rendering the scene’s depth, color and normal information using a Scene Capture Actor and use that to reconstruct the environment within a Niagara particle system with the particles’ potential and kinetic energy visualized as emissive light.

Check out the Niagara level in the Content Examples project to see how this feature works!

Niagara Skeletal Mesh Data Interface Improvements

There are new functions in the Skeletal Mesh Data Interface that enable direct sampling of a Skeletal Mesh's vertex data, as well as access to specific Bones or Sockets on the Skeletal Mesh.

Ribbon Particle Performance Improvements

Ribbons now generate the ribbon geometry on the GPU instead of the CPU, improving overall performance.

GPU Simulation Support in Niagara

GPU simulation of Niagara effects is now supported on all non-mobile platforms.

Simplified System and Emitter Creation

Niagara now includes friendly dialogs that make creating systems and emitters easier than ever! You can create new emitters and systems from a curated set of templates to speed up development and ensure best practices.

Pendulum Constraint

This constraint solves with physics forces and optional spring drivers, and includes a potential energy calculation. You can now create exciting, dynamic effects, such as spawning particles with inherited velocity if the energy exceeds a specified threshold:

Module Additions and Improvements

  • Generate and receive death events
  • Now factoring mass into multiple modules
  • New SampleSkeletalMeshSkeleton, SampleSkeletalMeshSurface, SkeletalMeshSkeletonLocation and SkeletalMeshSurfaceLocation modules to complement enhancements to the Skeletal Mesh Data Interface
  • New AddVelocityInCone module
  • New Force modules: FindKineticAndPotentialEnergy, GravityForce, SpringForce and multiple usability tweaks to other forces
  • New KillParticlesInVolume module
  • New SpriteRotationRate module
  • New RecreateCameraProjection module for using render targets and camera transforms to turn scene captures into deformable particle systems
  • New modules for sampling textures: SamplePseudoVolumeTexture, SampleTexture, SubUV_TextureSample, and WorldAlignedTextureSample
  • New utility modules for temporal interpolation and frame counters
  • Many new dynamic inputs and functions

New: Replication Graph

The Replication Graph Plugin makes it possible to customize network replication in order to build large-scale multiplayer games that would not be viable with traditional replication strategies. For example, Epic’s own Fortnite Battle Royale starts each match with 100 players and roughly 50,000 replicated actors. If each replicated actor were to determine whether or not it should update across each client connection, the impact on the server’s CPU performance would be prohibitive.

The Replication Graph Plugin solves this problem by offering an alternate strategy geared specifically for high-volume multiplayer games. This works by assigning actors to Replication Nodes, which store precalculated information that clients can use to retrieve lists of actors that need to be updated, saving the CPU of recalculating the same data for many clients on every frame. In addition to the standard nodes that ship with the Engine, developers can write their own nodes to fit the specific needs of actors within their games.
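The node-based lookup described above can be sketched in plain C++. This is a conceptual illustration of precomputed spatial buckets only, not the actual UReplicationGraph API; all names below are invented:

```cpp
#include <cassert>
#include <unordered_map>
#include <vector>

struct Actor { int Id; float X, Y; };

// A node that buckets actors into grid cells once per rebuild, so each
// connection's gather step is a cheap precomputed-list lookup instead of a
// per-actor relevancy test per connection.
class GridReplicationNode {
public:
    explicit GridReplicationNode(float InCellSize) : CellSize(InCellSize) {}

    // Called when actors move or spawn -- not per connection.
    void Rebuild(const std::vector<Actor>& Actors) {
        Cells.clear();
        for (const Actor& A : Actors) {
            Cells[KeyFor(A.X, A.Y)].push_back(A.Id);
        }
    }

    // Called per connection: returns the precomputed list for the viewer's cell.
    const std::vector<int>& GatherForConnection(float ViewX, float ViewY) const {
        static const std::vector<int> Empty;
        auto It = Cells.find(KeyFor(ViewX, ViewY));
        return It != Cells.end() ? It->second : Empty;
    }

private:
    long long KeyFor(float X, float Y) const {
        const long long CX = static_cast<long long>(X / CellSize);
        const long long CY = static_cast<long long>(Y / CellSize);
        return (CX << 32) ^ (CY & 0xffffffffLL);
    }

    float CellSize;
    std::unordered_map<long long, std::vector<int>> Cells;
};
```

The point of the structure is that `Rebuild` runs once per frame regardless of connection count, while each of the (potentially hundreds of) connections pays only for a hash lookup.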

New: Optimizations for Shipping on Mobile Platforms

The mobile development process gets even better thanks to all of the mobile optimizations that were developed for Fortnite’s initial release on Android, in addition to all of the iOS improvements from our ongoing updates!

Improved Vulkan Support on Android

With the help of Samsung, Unreal Engine 4.21 includes all of the Vulkan engineering and optimization work that was done to help ship Fortnite on the Samsung Galaxy Note 9 and is 100% feature compatible with OpenGL ES 3.1. Projects that utilize Vulkan can run up to 20% faster than the same project that uses OpenGL ES.

Config Rules System for Android

The Android Config Rules system can now be used to help catch issues very early in a project's startup process. This tool allows you to quickly check for device support and present either a warning or an error dialog to the user if issues are discovered, such as an out-of-date driver or an unsupported GPU. Any variables set may be queried later in C++ with FAndroidMisc::GetConfigRulesVariable(TEXT("variablename")).

To use this system, a configrules.txt rules file is optionally placed in your project’s Build/Android directory and UPL is used to add a Gradle task to use the ConfigRulesTool to compress and optionally encrypt it during packaging the APK. More details can be found in the engine documentation.

Program Binary Cache for Android

The Program Binary cache can improve Shader loading performance and reduce hitching caused by Shader loading on Android devices. It works by generating optimized binary representations of Shader programs on the device, which are then used when loading Shaders during subsequent runs and can dramatically decrease load times. The Program Binary cache must be used in conjunction with the Shader Pipeline cache tool, which populates the Program Binary cache during the initial run of your application. To enable the Program Binary cache in your project, add the following setting to your AndroidEngine.ini or Device Profile.

  • r.ProgramBinaryCache.Enable=1
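If you prefer the device profile route, the same cvar can be applied there. A minimal sketch, following the standard `+CVars=` layout of DeviceProfiles.ini (the profile section name below is an example):

```ini
; Sketch: enabling the Program Binary cache via a device profile
[Android DeviceProfile]
+CVars=r.ProgramBinaryCache.Enable=1
```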

Note: Some devices do not support the required program binary extension; those devices will fall back to the previous behavior.

Emulated Uniform Buffers on Android

You can now use Emulated Uniform Buffers for projects that target the OpenGL ES 3.1 feature level, significantly reducing memory usage and improving rendering performance, depending on your application's complexity. Emulated Uniform Buffers have also been optimized to reduce the size of the data that needs to be transferred to the GPU when your project is packaged. To enable Emulated Uniform Buffers when using the OpenGL ES 3.1 feature level, add the following line to your project's DefaultEngine.ini under [/Script/Engine.RendererSettings]:
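The exact entry is omitted above. As a sketch, and assuming the cvar is named OpenGL.UseEmulatedUBs (treat the name as an assumption and verify it against your engine version's renderer settings):

```ini
; Assumed cvar name -- verify against your engine version before relying on it
[/Script/Engine.RendererSettings]
OpenGL.UseEmulatedUBs=1
```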


CPU Thread Affinity Control on Android

The ConfigRules system can register whether or not to use a supplied little-core affinity mask. If enabled, the following threads will run on little cores: render, pool, taskgraph, stats, taskgraph background, and async loading. This improves battery life and evens out performance, since those threads won't switch between big and little cores, which can cause hitches. For details on how to set this up, see the Config Rules documentation.

Improved GPU Particle Simulation Performance on Mobile

Mobile particle effects that utilize the GPU for particle simulation have been significantly improved. You now have the option of reducing the memory used for GPU particle simulation by limiting the maximum number of simulated particles. By default, the maximum number of GPU particles that can be simulated simultaneously is around one million, which uses around 32 MB of memory. You can adjust this maximum by adding the following setting to your project's DefaultEngine.ini under [/Script/Engine.RendererSettings]:

  • Setting the value from 512 to 256 will reduce the memory footprint to around 8 MB.
  • The SimulationTextureSize size has to be a power of 2.
  • These improvements are especially apparent on devices that use the ARM Mali GPU.
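The cvar names are not listed above. A sketch of what the entry could look like, assuming the simulation texture dimensions are controlled by fx.GPUSimulationTextureSizeX/Y (treat these names as assumptions to verify for your engine version; the value must be a power of 2):

```ini
; Assumed cvar names -- verify against your engine version
[/Script/Engine.RendererSettings]
fx.GPUSimulationTextureSizeX=256
fx.GPUSimulationTextureSizeY=256
```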

Dithered LOD Transitions

Dithered LOD transitions are now supported on mobile platforms. When enabled, objects whose Materials have the Dithered LOD Transitions option enabled will fade from one Level Of Detail (LOD) to another almost seamlessly. Support for Dithered LOD transitions is disabled by default on mobile platforms. To enable it, go to Project Settings > Rendering > Mobile and check the Allow Dithered LOD Transitions option.

Note: Materials that have Dithered LOD transitions enabled will be rendered as Masked Materials, which could negatively impact performance on mobile platforms. We recommend enabling this effect only on Masked Materials.

New: Cooker Performance

The cooking process has been optimized resulting in up to 60% reductions in cook times! Low-level code now avoids performing unnecessary file system operations, and cooker timers have been streamlined. Handling of unsolicited Assets (with regard to Asset dependencies) has also been refactored to scale better. These changes are most pronounced on larger projects (projects with more than 100,000 Assets).

New: Pixel Streaming (Early Access)

Run a packaged Unreal Engine application on a desktop PC in the cloud, and stream the viewport directly to any modern web browser on any platform! Get the highest-quality rendering in lightweight apps, even on mobile devices, with zero download, zero install.

A viewport rendered by Unreal Engine, embedded within a web UI. Images and models courtesy of McLaren.

You can broadcast a single game session to multiple viewers by simply sharing a link, or send each connecting user to their own separate game session.

The web page that hosts the Engine viewport automatically sends keyboard, mouse and touch events back to the Engine. You can also customize the page with your own HTML5 UIs, using custom JavaScript events to trigger and respond to gameplay events over the wire.

For details, see Pixel Streaming.

New: Animation System Optimizations and Improvements

The Animation System continues to build on its best-in-class features thanks to new workflow improvements, better surfacing of information, new tools, and more!

Animation Compression Updates

Animation Compression times are significantly reduced by using a whitelist of optimal codecs, which avoids trying permutations that are unlikely to be selected and greatly reduces the number of codecs we attempt to compress with. On multicore systems, most codecs now evaluate in parallel during automatic compression, further reducing the time it takes to compress an animation sequence.

The following updates were made to the Animation Compression Stat Dialog window:

  • Fixed bugs that caused the dialog to show incorrect results
  • Added a compression time stat
  • Added the number of compressed animations
  • Added tracking for the animation with the largest average error
  • Added tracking of the worst 10 items instead of just the worst one
  • Improved labeling on the dialog
  • Pass the FBoneData array through more often instead of recalculating it

Please see Compression for more information.

Animation Notify Improvements

New Animation Notifies have been added that enable you to manage the state of dynamics and cloth simulations. We have also updated Notify add/replace menus to use class pickers for better searching of BP and native notifies. To add a Notify, right-click on a Notifies track, then under Add Notify, select the type of Notify you wish to add.

Please see Animation Notifications (Notifies) for more information.

Maintain Original Scale of Root Motion

Added the Use Normalized Root Motion Scale option to maintain the original scale of Root Motion. This option is enabled by default, preserving the behavior from previous Engine releases. Disabling it uses the scale of the final blended animation instead.

Please see Enabling Root Motion for more information.

Added Caching and Autocomplete for “Sync Marker” Names

When creating Sync Markers, you can now access any Sync Markers assigned to the Skeleton from the Existing Sync Markers menu option. Entering text into the search box will also filter out Sync Markers based on your text entry.

Animation Sequence Framerate

The framerate of Animation Sequences is now displayed in the Animation Tools viewport and Content Browser tooltip.

Enable Auto Blend Out on Anim Montages

Anim Montages now have the option to enable or disable Auto Blend Out. This option is enabled by default; if you disable it, the Montage will not automatically blend out and will instead hold its last pose.

Please see Montage Properties for more information.

CCDIK Skeletal Control Node

Use the new CCDIK (Cyclic Coordinate Descent Inverse Kinematics) Skeletal Control Node for a lightweight, IK algorithm suited for real-time calculation of relatively short IK chains, such as shoulder to fingertip.
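The core of the CCD algorithm is simple enough to sketch in a few lines. The planar solver below is a conceptual illustration of Cyclic Coordinate Descent only, not the engine's Skeletal Control Node:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec2 { float X, Y; };

// Position of joint `Joint` (0 = root) for a chain of equal-length bones,
// where Angles[i] is the rotation of bone i relative to its parent.
static Vec2 JointPos(const std::vector<float>& Angles, float BoneLen, size_t Joint) {
    Vec2 P{0.f, 0.f};
    float Acc = 0.f;
    for (size_t i = 0; i < Joint; ++i) {
        Acc += Angles[i];
        P.X += BoneLen * std::cos(Acc);
        P.Y += BoneLen * std::sin(Acc);
    }
    return P;
}

static Vec2 EndEffector(const std::vector<float>& Angles, float BoneLen) {
    return JointPos(Angles, BoneLen, Angles.size());
}

// Cyclic Coordinate Descent: sweep from the tip joint back to the root,
// rotating each joint so the end effector swings toward the target.
void SolveCCD(std::vector<float>& Angles, float BoneLen, Vec2 Target, int Iterations) {
    for (int It = 0; It < Iterations; ++It) {
        for (size_t J = Angles.size(); J-- > 0;) {
            const Vec2 Pivot = JointPos(Angles, BoneLen, J);
            const Vec2 E = EndEffector(Angles, BoneLen);
            const float ToEffector = std::atan2(E.Y - Pivot.Y, E.X - Pivot.X);
            const float ToTarget = std::atan2(Target.Y - Pivot.Y, Target.X - Pivot.X);
            Angles[J] += ToTarget - ToEffector;
        }
    }
}
```

Because each step is just one rotation per joint, the cost per iteration is linear in chain length, which is why CCD suits short real-time chains like shoulder-to-fingertip.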

Please see CCDIK Skeletal Control Node for more information.

Set Master Pose Component Force Update

The Set Master Pose Component function has a second input pin, Force Update, which forces the runtime info to update even if it matches the Master Component; when disabled, that redundant update is skipped. This only applies during the registration process, which can be serialized, at which point all runtime data needs to be refreshed.

Please see Master Pose Component for more information.

Miscellaneous Improvements and Updates

  • Live Animation Blueprint Recompilation is now non-experimental
  • Local Space is now the default Coordinate Space for Animation Editors
  • A notification is now displayed in the Animation Tools viewport when a min LOD is being applied

New: Gauntlet Automation Framework (Early Access)

The new early access Gauntlet automation framework enables you to automate the process of deploying builds to devices, running one or more clients and/or servers, and processing the results.

You can create Gauntlet scripts that automatically profile points of interest, validate gameplay logic, check return values from backend APIs, and more! Gauntlet has been battle tested for months in the process of optimizing Fortnite and is a key part of ensuring it runs smoothly on all platforms.

Gauntlet provides new profiling helpers that can record critical performance values between two points in time in order to track missed-Vsyncs, hitches, CPU Time, GPU Time, RHI Time, Draw Calls and more. Gauntlet also provides helper functions to gather these from logs so you can generate warnings, store them in databases, or create trend lines. All of the info captured during the test is available to be output into reports any way you want.

An example of a simple report is shown below:

Each Gauntlet test is a C# script that expresses a simple configuration for your test – how many clients, how many servers, and what parameters to pass. Gauntlet takes care of allocating machines from a pool, deploying and running builds, checking for common errors such as crashes, asserts, or timeouts, and collecting log files and other artifacts.

New: Submix Envelope Follower

Users of the new Unreal Audio Engine can now set an Envelope Follower Delegate on their Submixes allowing amplitude analysis of individual channels for that submix. This will help users power visualizations and Blueprint Events based on the amplitude characteristics of their Submixed audio.
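For intuition, an envelope follower is typically a rectifier feeding a pair of one-pole smoothers: a fast coefficient while the signal rises (attack) and a slow one while it falls (release). A minimal sketch, not the engine's API (parameter names are invented):

```cpp
#include <cassert>
#include <cmath>

// Rectify the input, then smooth it with the attack coefficient while the
// signal is rising and the release coefficient while it is falling.
class EnvelopeFollower {
public:
    EnvelopeFollower(float SampleRate, float AttackMs, float ReleaseMs)
        : AttackCoef(std::exp(-1.f / (SampleRate * AttackMs * 0.001f)))
        , ReleaseCoef(std::exp(-1.f / (SampleRate * ReleaseMs * 0.001f)))
    {}

    float ProcessSample(float In) {
        const float Rectified = std::fabs(In);
        const float Coef = (Rectified > Envelope) ? AttackCoef : ReleaseCoef;
        Envelope = Coef * Envelope + (1.f - Coef) * Rectified;
        return Envelope;
    }

private:
    float AttackCoef;
    float ReleaseCoef;
    float Envelope = 0.f;
};
```

Running one instance per channel of a submix yields the per-channel amplitude data a delegate like this would report.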

New: Filter Sound Submix Effect

Users of the new Unreal Audio Engine now have the option of adding a multimode filter to their Submixes allowing dynamic filter effects on a single Submix.


New: Sound Submix Effect Reverb Dry Level

The Submix Effect Reverb in the new Unreal Audio Engine now supports Parallel Wet and Dry Levels allowing users to dial in specific Wet/Dry ratios making the Effect usable as an Insert-Style Effect as well as a Send-Style Effect.

New: Optimizations to Source Effect API

The Source Effect API in the new Unreal Audio Engine has been optimized to process a full buffer of audio rather than frame-by-frame. This will allow Source Effects to process more efficiently than before.

New: Linux Defaults to Vulkan Renderer

Linux now uses Vulkan as the default renderer when available. In the event the API cannot be initialized, the Engine will fall back to OpenGL without notification.

From the Project Settings, you can use the Target RHIs list to add or disable a particular RHI, or use the -vulkan and -opengl4 command line switches to force a specific RHI and bypass the fallback.

New: Linux Media Player

You can now use the bundled WebMMedia plugin to play back .webm VP8/VP9 videos on Linux platforms.

New: Linux Crash Report Client GUI

We’ve added support for the Crash Reporter GUI on Linux so you can help us continue to improve support for Linux platforms. Please submit reports when they occur, even repeated ones! It helps our engineers assess the frequency and learn what circumstances cause the crash to happen.

New: Professional Video I/O Improvements (Early Access)

We continue to make it easier to get video feeds into and out of the Unreal Editor over professional quality SDI video cards. You can now work with the same Unreal Engine Project across multiple computers with different hardware setups, without changing any configuration settings in the Project.

Create a MediaProfile on each machine, and set it up to handle the video card and formats that you need to use on that computer. You can also override the Project’s timecode and genlock sources from the same panel:

When you combine the Media Profile with the new Proxy Media Source and Proxy Media Output Asset types, you can automatically redirect input and output channels between the Project’s media content and the settings in your Media Profile. When you switch to a different Media Profile — for example, on a different computer with a different media card or different wiring setup — the input and output channels from that machine’s hardware are automatically routed through the proxies so that you don’t have to change any content in your Project.

For details, see Using Media Profiles and Proxies.

In addition, this release adds:

  • A dockable Timecode Provider panel (Window > Developer Tools > Timecode Provider) that shows the Unreal Engine’s current timecode and the source that timecode is coming from:
  • Support for 10-bit input, audio I/O and interlaced/PsF inputs.
  • A new Blackmagic Media Player Plugin that supports SDI cards from Blackmagic Design. See the Blackmagic Video I/O Quick Start.

Note: The AJA Media Player and Blackmagic Media Player Plugins are now available through the Marketplace tab in the Epic Games Launcher, instead of being installed automatically with the Unreal Engine. Their source is freely available on GitHub, to give other developers a model of how to develop video I/O plugins on top of the Engine's Media Core APIs.

New: Geographically Accurate Sun Positioning (Early Access)

In the real world, the sun’s position in the sky depends on the latitude and longitude of the viewer, the date, and the time of day. You can now use the same mathematical equations to govern the sun’s position in your Unreal Engine Level.

This is particularly effective any time you need to simulate the real-world lighting conditions for a specific place on the Earth, such as for a major architectural or construction project. However, this can also be useful for any Level that you want to exhibit realistic sun movements and positioning based on global position and time of day.

For details, see Geographically Accurate Sun Positioning.
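For intuition, the underlying math reduces to a declination term (date) plus an hour-angle term (time of day). The sketch below is a simplified version of that calculation, ignoring refinements such as the equation of time, longitude correction, and atmospheric refraction that a production implementation would include:

```cpp
#include <cassert>
#include <cmath>

// Returns the sun's elevation above the horizon, in degrees.
// DayOfYear is 1-365; SolarHour is local solar time (12.0 = solar noon).
double SunElevationDegrees(double LatitudeDeg, int DayOfYear, double SolarHour) {
    const double Pi = 3.14159265358979323846;
    const double Deg2Rad = Pi / 180.0;
    // Approximate solar declination for this day of the year.
    const double DeclDeg = -23.44 * std::cos(2.0 * Pi * (DayOfYear + 10) / 365.0);
    // Hour angle: the sun moves 15 degrees per hour away from solar noon.
    const double HourAngleDeg = 15.0 * (SolarHour - 12.0);
    const double SinElev =
        std::sin(LatitudeDeg * Deg2Rad) * std::sin(DeclDeg * Deg2Rad) +
        std::cos(LatitudeDeg * Deg2Rad) * std::cos(DeclDeg * Deg2Rad) *
        std::cos(HourAngleDeg * Deg2Rad);
    return std::asin(SinElev) / Deg2Rad;
}
```

Driving a directional light's pitch from this elevation (and its yaw from the corresponding azimuth) is the essence of the feature: at the equator on an equinox the noon sun is nearly overhead, while at high northern latitudes in December it barely clears the horizon.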

New: Static Mesh Processing

We have added several new Static Mesh processing options inside the Unreal Editor. You can now save memory by removing unnecessary UV mappings from your Static Meshes.

In addition, using Python and Blueprint scripts that you run inside the Unreal Editor, you can now:

New: Blueprint Usability Improvements

The Blueprint Graph editor now features "Quick Jump" navigation, enhancing the existing bookmark feature: users can save their current location and zoom level in the Blueprint Editor with CTRL + [0-9], then quickly return to that graph, location, and zoom level by pressing SHIFT + [0-9] whenever the Blueprint editor window is open, even when working in a different Blueprint Asset. "Quick Jump" bookmarks persist across Editor sessions and are local to the user/machine.

Users now have the ability to insert pins before or after a target pin for Sequence nodes via the context menu, rather than only being able to add them onto the end.

Monolithic engine header files can now be excluded from nativized Blueprint class C++ code via a Project Setting. This can help reduce the overall code size of the monolithic game EXE file, if code size is an issue. The option can be found at Project Settings > Packaging in the Advanced section under the Blueprint Nativization Method option. It is disabled by default to maintain compatibility with existing projects.

New: Improvements to HTML5 Templates

HTML5 projects now use separate HTML, JavaScript, and CSS templates, replacing the previous monolithic template file! Custom template files are also supported on a per-project basis; the build process will automatically pick the project's versions, or otherwise fall back to the Engine's.

This is based on GitHub PR#4780.

New: HTML5 README Files Updated

The HTML5 README file has been split up into multiple README files based on category:

  • Building UE4 HTML5
    • Get Source Files
    • Compiling Support Programs
    • Compiling UE4 Editor
    • Run UE4 Editor
    • Package The Project For HTML5
    • Test The HTML5 Packaged Project
  • Debugging UE4 HTML5
    • How To Dump The Stack And Print From Cpp
    • BugHunting GLSL
  • Emscripten and UE4
    • EMSDK
    • Emscripten toolchain and Thirdparty libraries
    • UE4 C# scripts
    • Test Build, Checking In, and CIS

New: Improved IPv6 Support

Support for IPv4 and IPv6 has been merged into a single socket subsystem, where previously support for each protocol was isolated to a specific subsystem. This allows platforms that used one of the BSD subsystems to support both IPv4 and IPv6 at the same time, and do it transparently to the calling code.

New: DDoS Detection and Mitigation

DDoS (distributed denial of service) attacks typically hinder game servers by flooding them with so many packets that they cannot process them all without locking up and/or drowning out other players' packets, causing players to time out or suffer severe packet loss that hinders gameplay.

Typically these attacks use spoofed UDP packets, where the source IP is unverifiable. This optional DDoS detection focuses specifically on this situation, detecting/mitigating DDoS attacks based on configurable thresholds of spoofed UDP packets, which do not originate from an existing, known client connection. This is not a guarantee that servers will be safe from all attacks, since it’s still possible that a heavy attack can overwhelm the hardware or OS running the server.
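The thresholding idea can be sketched as a per-second counter over packets that do not match a known connection. This is a conceptual illustration only; the engine's actual mitigation uses its own configurable thresholds, and the class and names below are invented:

```cpp
#include <cassert>
#include <cstdint>
#include <unordered_map>
#include <unordered_set>

// Packets from unknown sources are counted per source IP and per second;
// once a source exceeds the threshold, its further packets are dropped
// until the next second rolls over.
class SpoofedPacketGuard {
public:
    explicit SpoofedPacketGuard(int InMaxUnknownPerSecond)
        : MaxUnknownPerSecond(InMaxUnknownPerSecond) {}

    void RegisterConnection(std::uint32_t Ip) { KnownClients.insert(Ip); }

    // Returns true if the packet should be processed, false if mitigated.
    bool OnPacket(std::uint32_t SourceIp, int SecondStamp) {
        if (KnownClients.count(SourceIp) != 0) {
            return true; // packets from established connections pass through
        }
        if (SecondStamp != CurrentSecond) {
            CurrentSecond = SecondStamp;
            UnknownCounts.clear(); // counters reset each second
        }
        return ++UnknownCounts[SourceIp] <= MaxUnknownPerSecond;
    }

private:
    int MaxUnknownPerSecond;
    int CurrentSecond = -1;
    std::unordered_set<std::uint32_t> KnownClients;
    std::unordered_map<std::uint32_t, int> UnknownCounts;
};
```

As the text notes, this kind of filtering only sheds spoofed traffic early; it cannot protect against an attack large enough to saturate the hardware or OS beneath the server.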

New: Physics Interface Updates

The Physics Interface has been refactored so that high-level engine code takes increased ownership of physics objects. As a consequence of these changes, we have deprecated the Async Scene, which was only recommended for use with APEX Destruction. You can still achieve the same visual results using the Sync Scene.

As a result, much of the physics-related C++ API has changed, though it is functionally the same and can be used much as before. We made these changes with two goals: (a) reorganizing dependencies into one controlled place, and (b) creating a common model for physics interactions when interacting with Unreal.

Please see 4.21 Physics Technical Notes for more information.

New: Pipeline State Object (PSO) Caching

We now support Pipeline State Object (PSO) caching on Metal (iOS/Mac), DX12, and Vulkan platforms. PSO caching helps reduce hitches that occur when a Material requires a new Shader to be compiled. It works by creating a list of all the Shaders required by your Materials, which is then used to speed up compilation when those Shaders are first encountered by your project. PSO caching can be enabled in the Project Settings > Packaging section.

To find out more about how to set up and use PSO caching in your UE4 project, make sure to check out the PSO Caching documentation.

New: Physical Lighting Units Updates

We have improved the workflow and usability for Physical Lighting Units based on feedback provided by the community. As part of these updates, the following changes have been made:

  • All light types now display their units type next to the Intensity value.
  • Directional Lights are now displayed in LUX with increased intensity range.
  • Sky Light intensity is now displayed in cd/m² with increased intensity range.
  • Post Process Auto-Exposure settings can be expressed in EV-100 for an extended range of scene luminance. This can be enabled via Project Settings.
  • The Pixel Inspector can now display pre-exposure for Scene Color. This can be enabled via Project Settings.
  • HDR (Eye Adaptation) Visualization has been refactored in the following ways:
    • HDR Analysis picture-in-picture display over the current scene view allowing adjustments with instant feedback.
    • Visualization is now expressed in EV100.
    • Pixel Inspector-like feedback has been removed.

For additional information, see Physical Lighting Units.

New: Sequencer Event Track

The Sequencer Event Track has been completely refactored so that Events are now more tightly coupled to Blueprint graphs, making for a more familiar and robust user experience. By utilizing Blueprints and Interfaces, the new implementation offers better control and stability than the previous one, which used struct payloads and anonymous named events.

Please see Event Track Overview and Calling Events through Sequencer for more information.

New: Geometry Cache Track (Experimental)

The new (and experimental) Geometry Cache Track allows you to scrub through a Geometry Cache and render it out with frame accuracy.

Please see Using the Geometry Cache Track for more information.

New: Sequencer Audio Bakedown (Early Access)

You can now bake down the audio into a Master Audio Submix from the Render Movie Settings window. The process of baking audio occurs in a separate render pass and exports the audio in the sequence to a single file when you render a movie.

Please see Render Movie Settings for more information.

New: Sequencer Guide Marks

You can now lay down vertical guide marks on the timeline to use for snapping or identifying key points in your timeline.

Please see Using Frame Markers in Sequencer for more information.

New: Windows Mixed Reality Support

Unreal Engine 4 now natively supports the Windows Mixed Reality (WMR) platform and headsets, such as the HP Mixed Reality headset and the Samsung HMD Odyssey headset. To use our native WMR support, you must be on the April 2018 Windows 10 update or later, and have a supported headset. For more information on how to get up and running, see Windows Mixed Reality Development.



Image courtesy of HP

New: Magic Leap Qualified Developer Release Support

Unreal Engine 4 now supports all the features needed to develop complete applications on Magic Leap's Lumin-based devices. We support rendering, controller support, gesture recognition, audio input/output, media, and more. For more information on how to become a developer, please check out https://www.magicleap.com/.

New: Oculus Avatars

The Oculus Avatar SDK includes an Unreal package to assist developers in implementing first-person hand presence for the Rift and Touch controllers. The package includes avatar hand and body assets that are viewable by other users in social applications. The first-person hand models and third-person hand and body models supported by the Avatar SDK automatically pull the avatar configuration choices the user has made in Oculus Home to provide a consistent sense of identity across applications. For more information, see the Avatar SDK Developer Guide.

New: Round Robin Occlusions

Unreal Engine 4 now supports Round Robin Occlusions. With the newly added vr.RoundRobinOcclusion flag enabled, stereoscopic frames will kick off occlusion queries for one eye per frame using an alternating scheme (i.e. odd frames only kick off queries for the left eye, and even frames only kick off queries for the right). This approach cuts the number of occlusion draw calls per frame by half. In some situations, this improves performance significantly.
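The alternating scheme can be sketched as follows (a minimal illustration in Python; the function names are hypothetical and this is not engine code):

```python
def eye_for_frame(frame_number: int) -> str:
    """Round-robin eye selection: odd frames issue occlusion queries
    only for the left eye, even frames only for the right."""
    return "left" if frame_number % 2 == 1 else "right"

def occlusion_queries_per_run(frame_count: int, occludees: int) -> int:
    """One eye per frame means frame_count * occludees queries in total,
    half of the frame_count * occludees * 2 needed to query both eyes."""
    return frame_count * occludees
```

Each eye still gets fresh occlusion results every other frame, which is usually imperceptible while halving the per-frame query cost.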

New: Platform SDK Upgrades

In every release, we update the Engine to support the latest SDK releases from platform partners.



  • IDE versions the build farm compiles against
    • Visual Studio: Visual Studio 2017 v15.6.3 toolchain (14.13.26128) and Windows 10 SDK (10.0.12699.0)
      • Minimum supported versions
        • Visual Studio 2017 v15.6
        • Visual Studio 2015 Update 3
    • Xcode: Xcode 9.4
  • Android:
    • Android NDK r14b (New CodeWorks for Android 1r7u1 installer will replace previous CodeWorks on Windows and Mac; Linux will use 1r6u1 plus modifications)
  • HTML5: Emscripten 1.37.19
  • Linux “SDK” (cross-toolchain):
    • v12_clang-6.0.1-centos7
  • Lumin: 0.16.0
  • Steam: 1.39
  • SteamVR: 1.39
  • Oculus Runtime: 1.28
  • Switch:
    • SDK 5.3.0 + optional NEX 4.4.2 (Firmware 5.0.0-4.0)
    • SDK 6.4.0 + optional NEX 4.6.2 (Firmware 6.0.0-5.0)
    • Supported IDE: Visual Studio 2017, Visual Studio 2015
  • PS4:
    • 6.008.001
    • Firmware Version 6.008.021
    • Supported IDE: Visual Studio 2017, Visual Studio 2015
  • Xbox One:
    • XDK: June 2018 QFE-4
    • Firmware Version: June 2018 (version 10.0.17134.4056)
    • Supported IDE: Visual Studio 2017
  • macOS: SDK 10.14
  • iOS: SDK 12
  • tvOS: SDK 12
To view the full list of release notes, visit our forum or documentation pages.

Unreal Engine 4.20 Released!

What’s New

Unreal Engine 4.20 delivers on our promises to give developers the scalable tools they need to succeed. Create a future-focused mobile game, explore the impact of Niagara, breathe life into compelling, believable digital humans, and take advantage of workflow optimizations on all platforms.

You can now build life-like digital characters and believable worlds with unparalleled realism. Take your visual effects to the next level with Unreal Engine’s new Niagara particle editor to add amazing detail to all aspects of your project. Use the new Digital Humans technology powering the “Meet Mike” and “Siren” demos to raise the bar on realism. With the new Cinematic Depth of Field, you can achieve cinema quality camera effects in real-time.

Unreal Engine empowers you to make things your way by giving you the tools to customize the creation process to your preferred style and workflow. With the new Editor Scripting and Automation Libraries, you can create completely customized tools and workflows. Make the lives of designers and artists easier by adding new actions to apply to Actors or assets thanks to scripted extensions for Actor and Content Browser context menus.

Battle-tested mobile and console support means you can create once and play on any device to deliver experiences anywhere users want to enjoy them. Epic has rallied around the mobile release of Fortnite to optimize Unreal Engine for mobile game development. We have made tons of performance improvements including implementing both hardware and software occlusion queries to limit the amount of work the hardware needs to do. Proxy LOD is now production-ready and can further reduce the complexity of the geometry that needs to be rendered at any time. 

In addition to all of the updates from Epic, this release includes 165 improvements submitted by the incredible community of Unreal Engine developers on GitHub! Thanks to each of these contributors to Unreal Engine 4.20: 

Adam Moss (adamnv), Akihiro Kayama (kayama-shift), Alan Edwardes (alanedwardes), Alan Liu (PicaroonX), Andrew (XenonicDev), Andrew Haselgrove (Dimpl), Anton Rassadin (Antonrr), arkiruthis, Begounet, Brandon Wilson (Brandon-Wilson), c4tnt, Changmin (cmheo), Christian Loock (Brainshack), Clinton Freeman (freemancw), Daniel Assuncao (dani9bma), David Payne (dwrpayne), Deep Silver Dambuster Studios (DSDambuster), Derek van Vliet (derekvanvliet), Eduard Gelbling (NachtMahr87), frankie-dipietro-epic, Gautier Boëda (Goutye), George Erfesoglou (nonlin), Giovanny Gutiérrez (bakjos), Gregor Gullwi (ggsharkmob), Hannah Gamiel (hgamiel), Hyuk Kim (Hybrid0), Ibraheem Alhashim (ialhashim), Ilya (ill), Jacob Nelson (JacobNelsonGames), Jaden Evanger (cyberblaststudios), Jared Taylor (Vaei), Jesse Yeh (jesseyeh), Jia Li (shrimpy56), Jørgen P. Tjernø (jorgenpt), June Rhodes (hach-que), Junichi Kimura (junkimu), Kalle Hämäläinen (kallehamalainen), kinolaev, Kory Postma (korypostma), krill-o-tron, Kryofenix, Lallapallooza, Layla (aylaylay), Lee Berger (IntegralLee), Leon Rosengarten (lion03), Lirrec, malavon, Marat Radchenko (slonopotamus), Marat Yakupov (moadib), Mathias L. 
Baumann (Marenz), Matt Hoffman (LordNed), Matthew Davey (reapazor), Maxime Turmel (maxtunel), Michael Allar (Allar), Michael Kösel (TheCodez), Michael Puskas (Mmpuskas), Mikayla Hutchinson (mhutch), mimattr, Mitsuhiro Koga (shiena), Muhammad A.Moniem (mamoniem), nakapon, Nicolas Lebedenco (nlebedenco), Paul Eremeeff (PaulEremeeff), Phillip Baxter (PhilBax), projectgheist, Rama (EverNewJoy), redfeatherplusplus, Rei-halycon, Robert Khalikov (nbjk667), Roman Chehowski (RChehowski), S-Marais, Sam Bonifacio (Acren), Satheesh  (ryanjon2040), Scott Freeman (gsfreema), SculptrVR, Sebastian Aaltonen, Sébastien Rombauts (SRombauts), Seokmin Hong (SeokminHong), Sertaç Ogan (SertacOgan), stephenwhittle, Temaran, Thomas Miller (tmiv), Trond Abusdal (trond), TWIDan, Tyler (tstaples), Usagi Ito (usagi), yama2akira, Yang Xiangyun (pdlogingithub), yehaike, Zachary Burke (error454)

Major Features

New: Optimizations and Improvements for Shipping on Mobile Platforms

Unreal Engine 4.20 brings well over 100 mobile optimizations developed for Fortnite on iOS and Android, marking a major shift for developers in terms of ability to more easily ship games and seamlessly optimize gameplay across platforms. Major enhancements include improved Android debugging, mobile landscape improvements, and occlusion queries on mobile.



Hardware and Software Occlusion Queries on Mobile

Hardware Occlusion Queries are now supported on the GPU for high-end mobile devices on iOS and Android that support ES 3.1 or Vulkan. They are enabled by default for any device that supports them.

Software Occlusion Queries is an experimental feature that uses the CPU to cull primitive components from the scene. Because it uses a conservative approach, it can be used on any mobile device. 


Left – r.Mobile.AllowSoftwareOcclusion 1, r.SO.VisualizeBuffer 1; Right – Render frozen showing occluded parts

To enable Software Occlusion Queries, follow these steps:

  1. Enable r.Mobile.AllowSoftwareOcclusion 1.
  2. Disable hardware occlusion queries with r.AllowOcclusionQueries 0.
  3. Enable any primitive to be an occluder by setting LOD for Occluder Mesh true in the Static Mesh Editor.

You can visualize the results in the Mobile Previewer when using High-End Mobile by enabling r.SO.VisualizeBuffer 1.
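Conceptually, the conservative CPU-side test works like this sketch, simplified to axis-aligned screen rectangles given as (x0, y0, x1, y1, depth); the names are hypothetical and the real system operates on rasterized occluder meshes:

```python
def covers(occluder, occludee):
    """True when the occluder rectangle fully contains the occludee's
    screen bounds AND is strictly closer to the camera."""
    ox0, oy0, ox1, oy1, o_depth = occluder
    bx0, by0, bx1, by1, b_depth = occludee
    return (ox0 <= bx0 and oy0 <= by0 and
            ox1 >= bx1 and oy1 >= by1 and
            o_depth < b_depth)

def visible(occluders, occludee):
    """Conservative test: cull only when some occluder provably hides
    the occludee; any ambiguous case is kept visible."""
    return not any(covers(o, occludee) for o in occluders)
```

Because the test only ever errs on the side of keeping objects visible, it is safe to run on any mobile device regardless of GPU query support.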

Platform Material Stats

Quickly profile and optimize your Materials using the new Platform Stats window inside of the Material Editor! You can now see stats for multiple shader platforms and quality levels. For mobile platforms, we use an offline shader compiler to give more accurate instruction and Texture usage information.

Improved Android Debugging

Iterate and debug on Android without having to repackage the UE4 project! When compiling for Android, we now generate a Gradle project file which can be opened in Android Studio. You can place breakpoints in C++ and Java code and use Android Studio to launch a debug session. You can also make changes to C++ source code and recompile. If you start a new debug session, Android Studio will notice the change and quickly upload the new shared library to your device.

Mobile Landscape Improvements

Make your terrains on mobile more interesting now that you can have unlimited Landscape Material layers on mobile devices! While three layers remains the most optimized case, any number of Landscape layers is supported, provided there are sufficient Texture Samplers available.

You can now use the Feature Level Switch Material nodes in Landscape Materials enabling you to create a single Landscape Material for all platforms.

1 – Mobile Landscape; 2 – PC Landscape

Miscellaneous Mobile Improvements 

The following improvements were made to ship Fortnite on mobile and brought into Unreal Engine 4.20 to benefit all developers:

  • Minimum Static Mesh LOD per platform
  • Minimum Skeletal Mesh LOD per platform
  • Hardware occlusion improvements
  • HLOD tools and workflow optimizations
  • Audio quality node
  • Audio variation culling
  • Audio downsampling per platform
  • Audio compression quality per platform
  • Shading model tweaks to better match PC
  • Reflection capture brightness fix
  • Landscape support for four layers
  • Landscape tessellation improvements
  • No memory cost for unused LODs, including:
    • Static Meshes
    • Skeletal Meshes
    • Material quality levels
    • Grass and foliage
    • High detail components and meshes
    • High detail emitters in Cascade
  • Settings based on device memory
  • Material memory reduction
  • Editor scriptability for bulk asset changes
  • Particle component pooling
  • Material parameter collection update cost

New: Optimizations and Improvements for Shipping on Nintendo Switch

We have significantly improved Nintendo Switch development by releasing tons of performance and memory improvements built for Fortnite on Nintendo Switch to all Unreal Engine developers!

This includes the following:

  • Support for Dynamic Resolution and Temporal Upsampling
  • Low Latency Frame Syncing for Controller Input
  • Significant CPU Rendering Optimizations
  • Improvements to Threading
  • Better Texture Compression
  • Support for Memory Profiling
  • Backbuffer support for 1080p while in docked mode
  • And many other fixes!

New: Proxy LOD Improvements

The new Proxy LOD tool has graduated from “Experimental” to production-ready! This tool provides performance advantages by reducing rendering cost due to poly count, draw calls, and material complexity, which results in significant gains when developing for mobile and console platforms. This tool provides an alternative to the third-party package Simplygon and can be used in conjunction with the Level of Detail (LOD) systems in Unreal Engine.

The Proxy LOD tool produces a simpler representation by creating a proxy in the form of a single low-poly parameterized mesh and associated textures that visually approximate a collection of more complex source geometry models. This proxy can then be displayed at runtime when a reduction in model quality is acceptable – for example, when geometry only occupies a small number of pixels on screen.
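The runtime decision of when to show the proxy is the usual screen-coverage LOD rule; a rough sketch of that idea (illustrative Python with hypothetical names and threshold, not engine code):

```python
import math

def screen_height_fraction(bounds_radius, distance, fov_deg=90.0):
    """Approximate fraction of the screen height covered by a model's
    bounding sphere at the given camera distance."""
    half_fov = math.radians(fov_deg) / 2.0
    return bounds_radius / (distance * math.tan(half_fov))

def use_proxy(bounds_radius, distance, threshold=0.05):
    """Switch to the low-poly proxy once the model covers less than
    `threshold` of the screen, i.e. only a handful of pixels."""
    return screen_height_fraction(bounds_radius, distance) < threshold
```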

Note: The Proxy LOD tool is currently only available in Unreal Editor on Windows.

The above image shows the buildings and parking lots in Fortnite Battle Royale constructed using the Proxy LOD tool where both Gap-Filling and Hard-Edge Splitting were in use.

The production-ready version of the Proxy LOD tool has several enhancements over the Experimental version found in 4.19. In particular, it offers improved user control over the Normals on the Proxy Geometry and the ability to generate much simpler proxies by using gap-filling to automatically close doors and windows.

Improved Normal Control: Hard Edge Split Normal

The extreme constraints on Fortnite memory usage call for highly efficient uses of LODs. For most proxies, very small base color textures are generated and no Normal map is used; this approach requires the highest-quality Normals possible on the proxy mesh itself.

1 – Hard Edge Angle = 80; 2 – Hard Edge Angle = 0

The above gif shows the effect of hard-edge splitting for vertex normals. Image 2 shows smooth vertex normals as calculated in the 4.19 Experimental version of the plugin; the dark regions near the bottom of the house are indicative of the shortcomings. Compare this with Image 1, which shows hard-edge vertex normal splitting with a user-supplied hard-edge cutoff angle.

In addition to the hard-edge cutoff angle, the user may now specify the method used in computing the vertex normal, by selecting between Angle Weighted, Area Weighted, and Equal Weighted.
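The three weighting schemes can be illustrated with generic mesh code; this is a minimal sketch in Python, not the engine implementation:

```python
import math

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def cross(a, b): return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a):
    l = math.sqrt(dot(a, a))
    return (a[0] / l, a[1] / l, a[2] / l)

def face_weight(verts, corner, mode):
    """Weight of one triangle's normal at the given corner vertex."""
    a, b, c = verts
    n = cross(sub(b, a), sub(c, a))
    if mode == "equal":
        return 1.0                           # every face counts the same
    if mode == "area":
        return 0.5 * math.sqrt(dot(n, n))    # triangle area
    # "angle": interior angle of the triangle at this corner
    p, q, r = verts[corner], verts[(corner + 1) % 3], verts[(corner + 2) % 3]
    u, v = norm(sub(q, p)), norm(sub(r, p))
    return math.acos(max(-1.0, min(1.0, dot(u, v))))

def vertex_normal(vertex, triangles, mode="angle"):
    """Accumulate weighted face normals over all triangles sharing `vertex`."""
    total = (0.0, 0.0, 0.0)
    for tri in triangles:
        if vertex in tri:
            w = face_weight(tri, tri.index(vertex), mode)
            fn = norm(cross(sub(tri[1], tri[0]), sub(tri[2], tri[0])))
            total = tuple(t + w * f for t, f in zip(total, fn))
    return norm(total)
```

Angle weighting tends to give the most stable results on meshes with very uneven triangle sizes, which is why it is a common default.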

Gap Filling

For watertight geometry, the Proxy system automatically discards any inaccessible structures (for example, interior walls or furniture within a closed house). For ideal results, source geometry should be constructed or altered with this in mind, but due to game production constraints that isn’t always feasible. To facilitate the generation of efficient Proxy LODs from source geometry that is nearly watertight, the Proxy LOD tool can optionally use the level-set-based techniques of dilation and erosion to close gaps. The intended use case is primarily doors and windows in distant buildings.

1 – Original Mesh; 2 – No Gap Filling; 3 – Gap Filling

The above gif shows the effect of using Gap Filling. All images were constrained to use a fixed small amount of texture space. Image 2 is the result of Proxy LOD on a building without using Gap Filling, in which case the LOD includes the interior of the building (at the cost of unseen triangles and texels). Image 3 is the same building with Gap Filling used to automatically close the doors and windows of the buildings, resulting in fewer total triangles and a better use of the limited texture resource.
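A minimal 1D analogue of the dilation-then-erosion ("closing") operation shows why small gaps seal while large openings survive; this is illustrative Python only, while the actual tool applies analogous operations on a 3D level set:

```python
def dilate(cells):
    """Grow solid cells by one: a cell becomes solid if any neighbour is."""
    n = len(cells)
    return [int(any(cells[j] for j in (i - 1, i, i + 1) if 0 <= j < n))
            for i in range(n)]

def erode(cells):
    """Shrink by one: a cell stays solid only if all neighbours are.
    Out-of-bounds counts as solid so the domain boundary is preserved."""
    n = len(cells)
    return [int(all(cells[j] if 0 <= j < n else 1 for j in (i - 1, i, i + 1)))
            for i in range(n)]

def close_gaps(cells, radius=1):
    """Morphological closing: dilation then erosion seals gaps up to
    roughly 2 * radius cells wide while leaving larger openings alone."""
    for _ in range(radius):
        cells = dilate(cells)
    for _ in range(radius):
        cells = erode(cells)
    return cells
```

The dilation step grows surfaces until narrow gaps (doors, windows) merge shut; the erosion step then shrinks everything back so the overall silhouette is preserved.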

New: Cinematic Depth of Field

The new Cinematic Depth of Field (DoF) enables you to achieve your vision of rendering cinema quality scenes in a real-time environment! This new method is designed as a higher-quality replacement for the Circle DoF method and is faster than most other DoF methods, such as Bokeh. With Cinematic DoF, the depth of field effect is cleaner, providing a cinematic appearance with the use of a procedural Bokeh simulation. This new DoF implementation also supports alpha channel, dynamic resolution stability, and includes settings to scale it down for console projects.


1 – Cinematic Depth of Field enabled; 2 – Depth of Field disabled

Cinematic Depth of Field is enabled by default and replaces the current selection for the Circle DoF method in the Camera and Post Process settings.

  • Cinematic DoF supports the following Platforms:
    • D3D11 SM5, D3D12 SM5, Vulkan SM5, PlayStation 4, Xbox One, and Mac.
  • The procedural Bokeh simulation supports the following features:
    • Configuring the number of blades for the Diaphragm.
    • Configuring the curvature of the blades directly with the Lens’ largest aperture (Minimal F-stop).
    • Configurable controls available in the Camera settings of the Post Process Volume, Camera Actor, and Cine Camera Actor.
  • Many scalability settings are available using r.DOF.* console variables to scale the effect according to your project’s needs on hardware with finite resources.
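The blade-count and curvature controls map naturally onto a polygonal aperture outline that blends toward a circle as the lens opens up. Here is a hedged sketch of that idea in Python, where `roundness` is a hypothetical blend parameter, not the engine's actual math:

```python
import math

def bokeh_outline(blade_count, roundness, samples=64):
    """Points on an aperture outline: a regular polygon with one side
    per diaphragm blade, blended toward a unit circle by `roundness`
    in [0, 1] (0 = sharp polygon, 1 = fully circular)."""
    sector = math.pi / blade_count
    pts = []
    for i in range(samples):
        theta = 2.0 * math.pi * i / samples
        # radius of a regular polygon (circumradius 1) at angle theta
        poly_r = math.cos(sector) / math.cos((theta % (2.0 * sector)) - sector)
        r = (1.0 - roundness) * poly_r + roundness * 1.0
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts
```

With few blades and low roundness the out-of-focus highlights read as crisp polygons; opening toward the widest aperture rounds them off, which is the visual behavior the Minimal F-stop control describes.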

For additional information, please see the Depth of Field documentation.

New: Niagara Visual Effects Editor (Early Access)

The Niagara visual effects (VFX) Editor is now available as an Early Access plugin! Try out an early access version of the all-new visual effects tool that will eventually replace Unreal Cascade. Watch this GDC talk for a deeper dive on the vision for Niagara.

Note: The early access nature of this feature means that we are far enough along in development that we want to share it with our customers and get as much feedback as possible before it becomes a standard UE4 Feature. Early Access does not mean that Niagara is production ready as we still have quite a bit of performance optimization and bug fixing that needs to be done before you can consider using this tool for production. However, we hope that effects developers begin investing in learning Niagara and work with us to make it the best VFX editor that it can be.

For an overview of Niagara, please watch the GDC 2018 presentation Programmable VFX with Unreal Engine’s Niagara and read the Niagara documentation.

Improvements to Effect Design and Creation

Left – Particle system utilizing dynamic input module; Right – Dynamic input module

  • Skeletal Meshes can now emit from their surface, driven either by Material name or by a named bone influence region.
  • Specifying default values in Modules has been improved, allowing a wide variety of behaviors from calling functions to using default dynamic inputs.
  • Mesh particles now support Angular Velocity.
  • Beams support has been added to the Ribbon renderer with new corresponding Modules.
  • Dependencies between Modules can now be defined, enabling the user to be informed when they put the stack in a bad configuration. Users are also given options to auto-fix.
  • Multiple improvements have been made to merging System Emitters and Base Emitters, enhancing overall stability.
  • Modules can now be moved up and down the stack via drag-and-drop. Inherited Modules cannot be moved because doing so complicates merging.
  • Modules can now be enabled/disabled within the stack. This will also work for inheritance.
  • Sequencer and Blueprint support for setting Niagara User Namespace variables has been added.
  • You can drive parameters by custom HLSL Expressions, Dynamic Inputs (graph snippets), links to other variables, or by value.
  • Optionally, particles can now have a Persistent ID, which is guaranteed to be unique for that emitter.
  • Multiple renderers of each type can be applied to an emitter. Each instance can adjust where it gets the values for a given parameter. For example, an emitter could have two sprite renderers, one pulling its position from a particle’s position and the other pulling its position from a particle’s offset position.
  • The Niagara Extras Plugin also contains a debug Material that routes various per-particle parameters to a dialog-like display.
  • Houdini has provided a simple CSV importer to Niagara, enabling demo content for GDC 2018.
  • A wide variety of functionality for Niagara has been added under the Automated Testing system.

Updated User Interface

The Niagara interface has been designed to make complex effects intuitive to create. It uses a stack metaphor as its primary method of combining pieces of script logic together. Inside the stack, you will find a Timeline to control aspects of the effect over time, a Parameters Panel for easy access to variables available in the effect, and an Attribute Spreadsheet to quickly find and react to information as the effect is running.

New Modules

All of Niagara’s Modules have been updated or rewritten to support commonly used behaviors in building effects for games and adhere to a consistent set of coding standards. New UI features have also been added for the Niagara stack that mimic the options developers have with UProperties in C++, enabling inline enable/disable or variable display based on the state of another variable. 

GPU Simulation

Niagara now has support for GPU Simulation when used on DX11, PS4, Xbox One, OpenGL (ES3.1), and Metal platforms. Vulkan and Switch are planned to support GPU Simulation in a future release. Current limitations and known issues with GPU simulation are described below:

  • Full support for Niagara requires the ability to read-back data from the GPU. Currently only our DX11 and PS4 rendering interfaces support this functionality, and OpenGL and Metal are in progress.
  • Collision, Curves, and Curl Noise Fields are supported on the GPU. Meshes, Skinned Meshes, Spline Components, and more specialized data interfaces are not yet supported. The API for GPU shaders to interact with UNiagaraDataInterfaces has been redesigned as well.
  • Sprite and Instanced Static Mesh rendering from particles is supported on GPU simulations. At this time, Light Generation from Particles and Ribbons from Particles do not work on the GPU.
  • Events only work on the CPU and will be undergoing significant changes after Unreal Engine 4.20.

CPU Simulation & Compilation

Niagara CPU Simulation now works on PC, PS4, Xbox One, OpenGL (ES3.1) and Metal. At this time, Vulkan and Switch are not supported.

  • The CPU virtual machine (VM) now compiles its contents to the DDC on a background thread, significantly improving overall compilation speed and team efficiency. Further work is required to make the final and expensive VM optimization step occur in ShaderCompileWorker because it depends on non-thread safe libraries. Compilation dependencies are properly tracked across Modules, clearly identifying when we need to recompile certain parts of the stack.
  • Physics simulation on the CPU should properly model the Physics Material values for friction and restitution (bounciness).
  • Emitters will now simulate in parallel on worker threads.

New: Digital Human Improvements

As part of Epic’s character explorations to develop Digital Humans that started with the Photorealistic Character bust, many rendering improvements have been made to develop realistic, believable characters that come to life.

While developing these characters, the following rendering improvements have been made for Skin, Eyes, Lighting, and Subsurface Scattering.

  • Added a new Specular model with the Double Beckman Dual Lobe method.
  • Light Transmission using Backscatter for Subsurface Profiles.
  • Better contact shadowing for Subsurface Scattering with Boundary Bleed Color.
  • Short Distance Dynamic Global Illumination through Post Process Materials.
  • Added detail for eyes using a separate normal map for the Iris.

For additional information, see Digital Humans.

New: Rectangular Area Lights

Rectangular Area Lights enable you to make more realistic lighting for environments containing large light sources, such as fluorescent overhead lights, televisions, lit signs, and more! Rectangular Area Lights are accessible from the Modes panel along with the other light types.

  • Currently only supports the Deferred Renderer.
  • Acts mostly like a Point Light, except it has Source Width and Height to control the area emitting light.
  • Static and Stationary mobility shadowing works like an area light source; Movable dynamic shadowing currently works more like a point light with no area.

Performance Considerations: 

  • More expensive overall than Point or Spot Lights, with the dominant cost incurred when the light is movable and casting shadows; shadowing generally has the same cost.
  • Stationary or non-shadow-casting lights can be up to double the cost, with scaling depending on the platform. Fully Static lights incur no runtime cost.

New: Mixed Reality Capture (Early Access)

Create compelling spectating experiences for mixed reality applications using the new Mixed Reality Capture functionality, which makes it easy to composite real players into a virtual play space!

The early access Mixed Reality Capture support has three components: video input, calibration, and in-game compositing.  We have a list of supported webcams and HDMI capture devices that enable you to pull real world green screened video into the Unreal Engine from a variety of sources.  If you have a Vive Tracker or similar tracking device, Mixed Reality Capture can match your camera location to the in-game camera to make shots more dynamic and interesting. Setup and calibration is done through a standalone calibration tool that can be reused across Unreal Engine 4 titles. Once you set up your filming location, you can use it across all applications.

While feature support is in early access, we’re looking forward to getting feedback as we continue to improve the system. More information about Mixed Reality Capture setup can be found in the Mixed Reality Development documentation.

New: nDisplay Flexible, Multi-Display Rendering

Effortlessly create video walls for large visualization installations using the new nDisplay system! Automatically launch any number of Unreal Engine instances – locked firmly together, with deterministic content and frame-accurate time synchronization – across any number of host computers, each instance driving its own projector or monitor display. Use active or passive stereoscopic rendering to enhance the viewer’s sense of immersion in the 3D scene and built-in VRPN support to drive the system from mobile VR controllers.

For more information, please see the documentation.

New: Submix Audio Recording

In the new audio engine, we’ve added the ability to record Engine output – or any individual Submix’s output – to a *.wav file or SoundWave Asset.

Exporting Submix output to a SoundWave Asset.

Exporting Submix output to a *.wav file.

New: Shared Skeletal Mesh LOD Setting

Set LOD settings once and reuse them across multiple Skeletal Mesh assets using the new LOD Settings asset! Inside the Asset Details panel for a Skeletal Mesh, under LOD Settings, you can now select an LOD Settings asset to use, or you can generate a new asset based on the current settings.

Please see the Sharing LOD Settings section of the Skeletal Mesh Asset Details page for more information.

You can also assign the LOD setting and regenerate LODs from Blueprint using a Blutility. 

New: Streaming GeomCache and Improved Alembic importer (Experimental)

We continue to make stability and performance improvements to the geometry cache system, as noted in the following:

  • Individual vertex animation frames are now compressed using an intra-frame codec based on Huffman encoding. Compressed data is streamed from disk, enabling playback of longer sequences with low memory overhead. The new implementation is still very experimental and is not ready for use in production.
  • The Alembic importer has been changed to iteratively import frames rather than importing all frames in bulk. This should improve the PCA pipeline and overall stability and speed.
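As a rough illustration of intra-frame coding, each frame's quantized data can be Huffman-coded independently, so any frame decodes on its own as it streams in. This is a toy Python sketch, not the engine's codec:

```python
import heapq
from collections import Counter

def build_codes(symbols):
    """Build a Huffman code table for one frame's symbol stream."""
    freq = Counter(symbols)
    if len(freq) == 1:  # degenerate frame: a single repeated symbol
        return {next(iter(freq)): "0"}
    # heap entries: (frequency, tiebreaker, {symbol: partial code})
    heap = [(count, i, {sym: ""}) for i, (sym, count) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        c1, _, low = heapq.heappop(heap)
        c2, _, high = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in low.items()}
        merged.update({s: "1" + code for s, code in high.items()})
        heapq.heappush(heap, (c1 + c2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def encode_frame(symbols):
    codes = build_codes(symbols)
    return codes, "".join(codes[s] for s in symbols)

def decode_frame(codes, bits):
    """Huffman codes are prefix-free, so a greedy scan decodes the stream."""
    rev = {code: s for s, code in codes.items()}
    out, current = [], ""
    for bit in bits:
        current += bit
        if current in rev:
            out.append(rev[current])
            current = ""
    return out
```

Because every frame carries (or references) its own code table, the player can seek to and decode an arbitrary frame without touching its neighbours, which is what makes streaming playback cheap.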

New: Scripted Extensions for Actor and Content Browser Context Menu

Easily create in-context tools and workflow enhancements without writing a line of code by extending the context menus for Actors and for Assets in the Content Browser using Blueprint Utilities, or Blutilities.

  • Create a new Blutility using one of the new parent classes – AssetActionUtility (for Content Browser extensions) or ActorActionUtility (for Actor extensions).
  • You can specify what types of Actors or Assets the actions apply to with the GetSupportedClass function.
  • Add logic in events (or functions) with no return value, marking them as “Call In Editor” so they show up in the context menu. When the event is triggered, a pop-up dialog will display so you can fill in values for any parameters you define on your events.

New: Animation Retarget Manager Improvements

Animation Retarget Manager now supports saving and loading of the mapping data, so you can save and reuse mapping data on multiple meshes. You can also quickly save multiple rig data for different animations and reuse them with this feature.

Please see the Retarget Manager page for more information.

New: RigidBody Anim Node Improvements

When using ‘Local Space’ simulation, which offers greater stability, simulated bodies can now respond to the Skeletal Mesh Component moving around in the world. We have added options to look at the linear velocity and acceleration of the component in world space and apply them (scaled and clamped) to the local-space simulation.

We also added the option for any joint to be the base of simulation, and added support for dynamics to easily be reset.

New: Clothing Improvements

Physics Assets now support tapered capsules for collision in clothing simulation.

Note: These are not supported for collisions in rigid body simulations.

You can also now copy Skeletal Mesh vertex colors to any selected Clothing Parameter Mask. 

New: Garbage Collection Improvements

Garbage collection performance has been optimized, reducing some operations by as much as 13x! Specifically, we made the following improvements:

  • The “Mark” phase has been optimized and is now multithreaded. On machines with multiple cores, the cost of marking Objects as unreachable has been reduced from 8 ms to 0.6 ms for approximately 500,000 Objects.
  • The “BeginDestroy” phase (unhashing Objects) now runs across multiple frames, using no more than 2 ms per frame. The cost of unhashing Objects will no longer be included in the same frame as the “Mark” phase and reachability analysis.
  • Garbage Collection assumption verification, which runs in development builds, now uses the same multithreaded code as reference-gathering. As a result, development builds will see an improvement in Garbage Collection times. In Epic’s tests, sample timings for about 500,000 Objects reduced from over 320 ms to under 80 ms.
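The amortized "BeginDestroy" phase boils down to draining a work queue under a fixed per-frame time budget; a minimal sketch with hypothetical names, not engine code:

```python
import time

def run_time_sliced(pending, process, budget_ms=2.0):
    """Drain `pending` this frame until the millisecond budget runs out.

    Whatever is left stays queued for the next frame, so no single
    frame absorbs the full cost of destroying every object at once."""
    deadline = time.perf_counter() + budget_ms / 1000.0
    processed = 0
    while pending and time.perf_counter() < deadline:
        process(pending.pop(0))
        processed += 1
    return processed
```

Calling something like this once per frame with a 2 ms budget mirrors the behavior described above: total work is unchanged, but the worst-case frame time spike disappears.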

New: Visual Studio 2017 

UE4 now uses the Visual Studio 2017 compiler, and the Engine will generate project files for Visual Studio 2017 by default. Visual Studio 2015 is still supported but requires some configuration. Additionally, we’ve added support for the Windows 10 SDK.

Note: Visual Studio 2017 supports the installation of multiple compiler versions side-by-side.

See our Hardware & Software Specifications for more information.

New: Development Streams on GitHub

Unreal Engine development streams are now updated live on GitHub. If you want the latest version of development code, you can now pull these streams directly, without waiting for Epic to merge changes from the development teams into our main branch. Note that these streams are live and have not been vetted by our QA team, as is typically done for our binary releases and the main branch.

To learn more, check out our blog post.

New: UMG Safe Zone Improvements 

The Screen Sizes you select in UMG and Play-In-Editor (PIE) settings are now linked with Device Profiles, which also takes into account the Mobile Content Scale Factor, meaning that the final resolution and DPI scale will change based on the device screen size selected. 


The following improvements have been made for UMG Safe Zone workflow:

  • Safe Zone previewing is now automatically enabled for Debug Title Safe Zone when using a value less than 1 to test screen sizes for TVs and Monitors.
  • The r.MobileContentScaleFactor command scales phone and tablet resolutions in UMG previews and PIE modes.
  • Non-Uniform safe zones are now supported for devices like the iPhoneX, where parts of the screen are inaccessible.
  • Safe Zones, Scale Boxes, and Common Border Widgets react correctly to non-uniform safe zones and UMG Designer sizes.
  • UMG now displays the selected device, its screen size, and uniform scaling factor for easy reference in the Designer Graph.

For additional information, see UMG Safe Zones.

New: Curve Atlases in Materials

Materials can now use a Curve Atlas asset to store and access linear color curve data, with additional support provided through Blueprint. The Curve Atlas uses the same linear color curves as before, except you can store as many curves as the size of your specified Atlas allows.

To create a new Curve Atlas, use the Content Browser to select Add New > Miscellaneous and select Curve Atlas.

When you open a Curve Asset Editor, you’ll be able to adjust the Hue, Saturation, Brightness, Vibrance, and Alpha clamps of any individual curve. Additionally, the Preview thumbnails in the Content Browser will display the gradient set by the curve.
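Under the hood the idea is simple: each curve is a set of piecewise-linear keys, and the atlas bakes one curve per texture row so a Material can sample by row index and time. A small sketch under those assumptions (illustrative Python, not engine API):

```python
def eval_curve(keys, t):
    """Piecewise-linear evaluation of sorted (time, value) keys."""
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return v0 + alpha * (v1 - v0)

def bake_atlas_row(keys, width):
    """Sample one curve into `width` texels spanning t in [0, 1],
    i.e. one row of the atlas texture."""
    return [eval_curve(keys, x / (width - 1)) for x in range(width)]
```

In practice each texel would hold an RGBA color rather than a scalar, but the sampling scheme is the same per channel.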

For additional information, see Curve Atlases in Materials.

New: Mesh Description Mesh Format 

UE4 is moving to a new higher-level intermediate format which can represent any type of mesh asset in the Engine. This is a gradual process that will improve workflow and enable us to provide some great new features. 

The goals of moving to a new mesh format are:

  • All meshes (Static, Skeletal, and potentially other mesh-like objects such as terrain and BSP) can share the same internal representation, with some degree of interchangeability.
  • Most UE4 geometry tools will work on any type of mesh based on the geometry format.
  • Any mesh using the new format can be examined and modified using a standard API enabling runtime, native or scripted modification, opening up many possibilities for procedurally generated content.
  • Meshes will be imported directly to the format with the ability to preserve higher-level mesh representations, such as quads or edge hardness. Currently, these are lost when importing a Static or Skeletal Mesh.
  • The new mesh format is structured internally so that modifications can be made in real-time, even to the most complicated meshes. This forms the basis of a work-in-progress mesh editing feature, which is also scriptable, that will be developed for a future release.

In this release, only Static Mesh has been converted to use the new mesh format. Users will not notice any difference to their everyday workflow and the assets themselves will not change. Currently, the new data is automatically created from the old format and cached in the DDC.
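Conceptually, the new format boils down to a mesh container with stable element IDs and a uniform API. The sketch below is a simplified plain-Python illustration of that idea, not the actual Engine types:

```python
# Conceptual sketch (not the actual UE4 API): a single intermediate mesh
# representation that any mesh type could share, with element IDs for
# vertices and polygons so tools can inspect or modify meshes generically.

class MeshDescription:
    def __init__(self):
        self.vertices = {}   # vertex id -> (x, y, z)
        self.polygons = {}   # polygon id -> list of vertex ids
        self._next_vertex = 0
        self._next_polygon = 0

    def create_vertex(self, position):
        vid = self._next_vertex
        self._next_vertex += 1
        self.vertices[vid] = position
        return vid

    def create_polygon(self, vertex_ids):
        pid = self._next_polygon
        self._next_polygon += 1
        # Stores the polygon as-is, so a quad stays a quad instead of
        # being split into triangles on import.
        self.polygons[pid] = list(vertex_ids)
        return pid

# Build a unit quad through the uniform API.
mesh = MeshDescription()
ids = [mesh.create_vertex(p) for p in [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]]
quad = mesh.create_polygon(ids)
```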

New: Label Saved Colors in Color Picker

Colors saved in your Theme Bar or Theme Menu can now have labels for identification purposes! Labels can easily be set by right-clicking the saved color swatch and entering a name for the saved color.

For additional information, see Color Picker.

New: Recently Opened Filter in Content Browser

Quickly find recently viewed Assets in the Content Browser using the new Recently Opened filter! This filter lists the 20 most recently opened assets.


You can find the Recently Opened filter in the Filters list under Other Filters. You can change the number of recently opened assets listed in Editor Preferences > Content Browser with Number of Assets to Keep in the Recently Opened Filter.

For additional information, see Content Browser Filters.

New: Shotgun Integration (Early Access)

Streamline your production pipeline using the new Shotgun integration for Unreal Engine 4! 

Features include: 

  • Adds the Unreal Editor to your Shotgun launcher, so artists can reliably open the right version of Unreal for the Shotgun project.
  • Opens the Shotgun panel inside the Unreal Editor interface, so you can stay up to date with activity in the Shotgun project as you work.
  • Hooks into the Shotgun loader, so you can easily bring assets into your Unreal project and control where they end up in your Content Browser.
  • Adds Shotgun interaction commands to the contextual menus you get when you right-click Actors in a Level or assets in the Content Browser.

Note: We’re working out the last details before we can share our integration on GitHub. Check back soon for updates and documentation!

New: Editor Scripting and Automation Libraries

The Editor Scripting Utilities Plugin is now available to all Unreal Engine users. This Plugin offers simplified interfaces for scripting and automating the Unreal Editor, working with assets in the Content Browser, working with Actors in the current Level, editing the properties of Static Mesh assets, and more.

For details, see Scripting and Automating the Editor.

New: Import Asset Metadata through FBX

When you import an FBX file into Unreal, any FbxProperty data that is saved in that file is now imported as well. You can access this metadata in Blueprint or Python scripts that you run in the Unreal Editor. This can help you customize your own asset management pipelines for Unreal based on information about your assets that comes from your content creation tools. 

For details, see FBX Asset Metadata Pipeline.

New: Improved Script Access to Static Meshes for LODs and Collisions 

Blueprint and Python scripts that you run in the Unreal Editor can now modify more properties of your Static Mesh assets. This allows you to automate some of the tools offered by the user interface of the Static Mesh Editor, such as LOD and collision settings.

New: Blueprint Bookmarks

The Blueprint Bookmarks feature provides the ability to create named Bookmarks in any function graph in the Blueprint Editor. Bookmarks you create are listed in a new UI window, where you can click them to restore the position and zoom level of the Viewport (as well as the active tab you were viewing). In addition to the Bookmarks you create, you can also quickly jump to any Comment node in your Blueprint by selecting the comment from a separate list. Bookmarks are stored locally on your machine, so they won’t affect the Blueprints themselves, and syncing content will not overwrite your Bookmarks with those of another user.

New: Blueprint Watch Window

The Blueprint Watch Window is designed to speed up debugging by giving you access to the variables and nodes that you want to watch, even across multiple Blueprints. Watch data from every Blueprint that you open in the Editor and that is part of the current call stack is consolidated into a single list, enabling you to inspect variables and function outputs, and to jump between Blueprints with ease. Click an entry in the “Node Name” column to go to the named node in any Blueprint; selecting an entry in the “Object Name” column selects the instance of the object associated with that entry. Arrays, Sets, Maps, and other data structures can be expanded, making a drill-down examination of any data they contain quick and convenient.

New: Navigation System Code Moved to a Module

Most Navigation System-related code has been moved out of the Engine code and into a new Navigation System Module. Game-specific code using navigation system functionality might need to be updated.

A Python (3.5) script is available to parse your project’s source code and point out lines that need updating. Optionally, the script can apply the changes for you; use this option with caution and with your project under version control. Script options can be found at the top of the file.
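The sketch below illustrates the idea (the real script ships with the Engine and covers many more cases): scan source lines for references to the old monolithic API, such as UNavigationSystem, which now lives in the NavigationSystem module as UNavigationSystemV1.

```python
# Minimal sketch of this kind of scan (not the shipped script): flag source
# lines that reference the old UNavigationSystem class. The word boundary
# keeps the already-migrated UNavigationSystemV1 from being flagged.
import re

OLD_API = re.compile(r"\bUNavigationSystem\b")

def flag_lines(source_text):
    """Return (line_number, line) pairs that likely need updating."""
    return [(n, line) for n, line in enumerate(source_text.splitlines(), 1)
            if OLD_API.search(line)]

sample = "UNavigationSystem* NavSys = World->GetNavigationSystem();\n// unrelated line"
hits = flag_lines(sample)  # flags line 1 only
```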

Please see the Programming Upgrade Notes section for details on upgrading your project to work with these changes.

New: Improved Mobile Specular Lighting Model

Mobile specular response has been changed to use the GGX Lighting Model by default. This improves mobile specular quality and better matches SM5, but adds a small cost to shader processing time.

1 – 4.20 Default GGX Specular; 2 – 4.19 Spherical Gaussian Specular

The previous Spherical Gaussian Specular model is still accessible via the ‘Use legacy shading mode’ project option and can be found under Rendering > Mobile.  
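For reference, the sketch below shows the textbook GGX (Trowbridge-Reitz) normal distribution term that this lighting model is named for; it is illustrative math, not the Engine's shader code:

```python
# Illustrative sketch of the GGX normal distribution term (textbook form,
# not UE4 shader code): D(n.h) = a^2 / (pi * ((n.h)^2 * (a^2 - 1) + 1)^2),
# with a = roughness^2.
import math

def d_ggx(n_dot_h, roughness):
    a = roughness * roughness
    a2 = a * a
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# GGX has a tighter peak and a longer tail than a spherical Gaussian lobe:
peak = d_ggx(1.0, 0.3)  # highlight center
tail = d_ggx(0.7, 0.3)  # away from the center
```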

New: Mobile Skylight Reflections

The Mobile Renderer now uses a Skylight Cubemap for Specular Reflections when no Reflection Captures are relevant.

1 – Mobile, no reflection captures ; 2 – PC, no reflection captures

New: Replication Driver / Replication Graph

The Replication Graph Plugin provides a replication system optimized for games with large Actor and player counts. The system works by building a series of customized nodes that can centralize data and computation. These nodes persist across multiple frames and can be shared by client connections, cutting down on redundant CPU work and enabling Actors to be grouped together in nodes based on game-specific update rules. We may make changes to the API, so this is considered Experimental in 4.20, but it is in use in Fortnite Battle Royale and will become a fully supported feature.
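The grouping idea can be illustrated with a simple spatial grid node (conceptual sketch only, not the plugin's API): actors are bucketed into cells once, and each connection gathers only nearby cells instead of scanning every actor every frame.

```python
# Conceptual sketch (not the Replication Graph API): a node that groups
# actors into spatial grid cells once, so per-connection gathering becomes
# a cell lookup rather than a scan over every actor.

CELL_SIZE = 100.0  # hypothetical cell size in world units

class GridNode:
    def __init__(self):
        self.cells = {}  # (cx, cy) -> list of actors

    def add_actor(self, actor, x, y):
        cell = (int(x // CELL_SIZE), int(y // CELL_SIZE))
        self.cells.setdefault(cell, []).append(actor)

    def gather(self, x, y):
        """Actors relevant to a connection at (x, y): its cell plus neighbors."""
        cx, cy = int(x // CELL_SIZE), int(y // CELL_SIZE)
        relevant = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                relevant.extend(self.cells.get((cx + dx, cy + dy), []))
        return relevant

node = GridNode()
node.add_actor("near", 120.0, 40.0)
node.add_actor("far", 1250.0, 900.0)
visible = node.gather(90.0, 10.0)  # only "near" is gathered
```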

New: Steam Authentication

Steam Authentication has been added! Games can now add a packet handler component that interfaces with Steam’s authentication APIs, enabling them to advertise their servers properly, handle VAC/publisher bans, and provide better validation of clients. If enabled, clients joining a server now have to be authenticated by Steam before being allowed into gameplay. By default, clients who fail authentication are kicked from the server.
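The packet handler component is enabled through the project's configuration; a minimal sketch, assuming the OnlineSubsystemSteam plugin is already set up for the project:

```ini
; Sketch: enabling the Steam auth packet handler in Config/DefaultEngine.ini
; (assumes Steam online subsystem settings are already configured elsewhere
; in this file).
[PacketHandlerComponents]
+Components=OnlineSubsystemSteam.SteamAuthComponentModuleInterface
```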

New: Frame Accuracy Improvements for Sequencer

Sequencer now stores all internal time data as integers, enabling robust frame accuracy in situations where it is a necessity. Keys, section bounds, and other data are now always locked to the underlying user-controllable sequence resolution, which can be as fine or as coarse as the use case demands. Very high resolutions support greater fidelity of key placement and sub-frames, at the cost of reduced overall sequence range.
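The arithmetic below illustrates why integer ticks make frame boundaries exact (illustrative only, not Engine code; the 24000 ticks/second resolution is an example value):

```python
# Illustrative sketch (not UE4 API): with time stored as integer ticks at a
# user-controllable resolution, frame boundaries are exact rational numbers
# with no floating-point drift.
from fractions import Fraction

def frame_to_ticks(frame, frame_rate, tick_resolution):
    """Convert a frame number to integer ticks; frame_rate may be a Fraction
    for NTSC rates like 24000/1001."""
    ticks = Fraction(frame) * Fraction(tick_resolution) / Fraction(frame_rate)
    assert ticks.denominator == 1, "resolution must evenly represent the frame rate"
    return int(ticks)

# Frame 42 at 24 fps, with a resolution of 24000 ticks/second:
ticks = frame_to_ticks(42, 24, 24000)                       # 42000
# One frame of NTSC 23.976 (= 24000/1001) content is exactly 1001 ticks:
ntsc = frame_to_ticks(1, Fraction(24000, 1001), 24000)      # 1001
```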

Key Updates:

  • The time cursor in Sequencer is now represented as a block that spans the full range of the currently evaluated Tick, showing very clearly which keys are evaluated and which are not for any given frame.
  • “Force Fixed Frame Interval” playback has been rebranded as “Frame Locked”, setting the Engine max FPS to the Sequence’s display rate and locking time to whole frame numbers (no sub-frame interpolation).
  • Sub frame evaluation remains fully supported for situations where frame accuracy is not a consideration (such as UMG animation).
  • Various time sources are now supported for runtime evaluation such as the Engine clock (supporting world-pause), audio clock and platform clock.
  • The UI can now be viewed in Non Drop Frame (NDF) Timecode and Drop Frame (DF) Timecode. NDF Timecode is available for all frame rates and directly converts the frame number to hours, minutes, seconds, and remaining frames. DF Timecode is only supported at NTSC rates (23.976, 29.97, 59.94). The display format can be changed with the Ctrl + T keyboard combination or with the frame rate UI menu.
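The two timecode displays can be sketched as follows (illustrative Python, not Engine code). NDF converts the frame count directly; DF (shown for 29.97) skips frame numbers 0 and 1 at the start of every minute except each tenth minute, so the displayed timecode tracks wall-clock time:

```python
# Illustrative sketch of NDF vs DF timecode conversion (not UE4 code).

def ndf_timecode(frame, fps):
    """Non Drop Frame: direct conversion of the frame number."""
    ff = frame % fps
    ss = (frame // fps) % 60
    mm = (frame // (fps * 60)) % 60
    hh = frame // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def df_timecode_2997(frame):
    """Drop Frame for 29.97 fps: skip frames 00 and 01 of every minute
    except each tenth minute, then format at the nominal 30 fps."""
    ten_min, rem = divmod(frame, 17982)   # 17982 frames per 10 minutes
    if rem < 2:
        frame += 18 * ten_min
    else:
        frame += 18 * ten_min + 2 * ((rem - 2) // 1798)  # 1798 = short minute
    ff = frame % 30
    ss = (frame // 30) % 60
    mm = (frame // 1800) % 60
    hh = frame // 108000
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

a = ndf_timecode(1000, 24)       # '00:00:41:16'
b = df_timecode_2997(1800)       # '00:01:00;02' (frames ;00 and ;01 dropped)
```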

Please see the new Sequencer Time Refactor Notes page for more information.

New: Media Track for Sequencer

Sequencer has a new track for playing media sources. It is like the audio track, but for movies. Simply drag-and-drop a Media Source asset into the track view or create a Media Track from the Add Track menu. This feature currently works best with Image Sequences, especially EXR. Image Sequences in the Media Track will accurately sync frames with rendered output.

Please see the Using Media Tracks page for more information.

New: Sequencer Curve Editor and Evaluation Enhancements

Several enhancements have been made to the Curve Editor and Evaluation in Sequencer including:

Weighted tangents are now supported on float curves.

Using weighted curves in the sequencer curve editor

Added support for continuous Euler angle changes when changing rotations. Euler angles are no longer limited to [-180, 180], which is necessary to avoid flips in animation.

You can now turn on Quaternion Rotation on a 3D Transform Section via the track’s Properties menu to utilize quaternion interpolation to smoothly interpolate between two rotations. This is similar to the feature previously available in Matinee.

New: Animating Variables on Anim Instances in Sequencer 

It is now possible to animate variables on Anim Instances through possessables, enabling direct control of Anim Blueprint variables, functions, and other content. To add an Anim Instance binding to Sequencer, look for its name under the [+Track] button for Skeletal Animation Components. Any variables that are exposed to cinematics will be shown in its track picker.

Please see the Controlling Anim Instances with Sequencer page for more information.

New: Final Cut Pro 7 XML Import/Export in Sequencer

Sequencer movie scene data can now be exported to and imported from the Final Cut Pro 7 XML format. This can be used to round-trip data to Adobe Premiere Pro and other editing software that supports FCP 7 XML. You can trim and offset shots in your editing software and map those changes back to Sequencer automatically during import.

Note: Audio is not supported at this time.
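For a sense of the structure involved, the sketch below builds a heavily simplified subset of an FCP 7 "xmeml" clip item (illustrative only, not the exporter's output): in/out points expressed in frames are what let trims and offsets round-trip.

```python
# Illustrative sketch (simplified subset of FCP 7 XML, not the actual
# exporter output): a clip item whose start/end place it on the timeline
# and whose in/out frames record the trim into the source.
import xml.etree.ElementTree as ET

def make_clipitem(name, start, end, in_frame, out_frame):
    clip = ET.Element("clipitem")
    ET.SubElement(clip, "name").text = name
    ET.SubElement(clip, "start").text = str(start)   # placement on the timeline
    ET.SubElement(clip, "end").text = str(end)
    ET.SubElement(clip, "in").text = str(in_frame)   # trim into the source
    ET.SubElement(clip, "out").text = str(out_frame)
    return clip

clip = make_clipitem("shot0010", 0, 48, 12, 60)
xml_text = ET.tostring(clip, encoding="unicode")
```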

New: Sequence Recorder Improvements 

Sequence Recorder now supports a profile system that is stored in the Persistent Level. Recording profiles enable you to store which actors you wish to record and their settings, as well as the output path to store the recorded data in. Sequence Recorder also now supports recording multiple takes for each of the selected actors.

Please see the Sequence Recorder page for more information.

New: Sequencer Track Usability Improvements 

Several updates have been made to improve the usability of Tracks within Sequencer. Tracks, Actors, and Folders can now be reordered; Event Track names are displayed next to the event keyframe; and you can now resize sections to their source duration, mask individual transform channels, create Pose Assets from the blended pose, and more.

Please see the new Working with Tracks in Sequencer page for more information.

New: Translucency Support for Instanced Stereo Rendering

We’ve taken the Instanced Stereo Rendering (ISR) improvements we made for Robo Recall and extended them to work across more features in the Engine. Unreal Engine 4.20 adds support for performing the translucency rendering pass using Instanced Stereo Rendering, which can significantly reduce CPU cost in translucency-heavy scenes. No content changes are needed; any project with Instanced Stereo enabled in its Project Settings automatically gets the benefits of Instanced Stereo Rendering.

New: Magic Leap One™ Early Access Support

At GDC, we announced Early Access support for Magic Leap One™: Creator Edition, a software toolkit for early development of experiences for Magic Leap’s personal spatial computing platform, as part of a larger partnership between the two companies. As of Unreal Engine 4.20, you can develop for the Magic Leap One™ using the fully supported release of Unreal Engine. 

Unreal Engine 4 support for Magic Leap One uses our built-in frameworks for things like camera control, world meshing, motion controllers, and forward and deferred rendering. We’ve also added more robust support for features like eye tracking and gestures.

Developers can download the Magic Leap software development kit and simulator at developer.magicleap.com.  For those developers with access to hardware, Unreal Engine 4.20 can deploy and run on the device in addition to supporting Zero Iteration workflows through Play In Editor. 

New: Apple ARKit 2.0 Support

We’ve added support for Apple’s ARKit 2.0, which includes better tracking quality, support for vertical plane detection, face tracking, 2D image detection, 3D object detection, persistent AR experiences and shared AR experiences. Support for these new features enables you to place AR objects on more surfaces, track the position and orientation of a face, recognize and bring 2D images to life, detect 3D objects, and facilitate new types of collaborative AR experiences.

New: Google ARCore 1.2 Support 

We’ve added support for Google’s ARCore 1.2, which includes support for vertical plane detection, Augmented Images, and Cloud Anchors. Support for these new features enables you to place AR objects on more surfaces, recognize and bring images to life, and facilitate new types of collaborative AR experiences.

New: Platform SDK Upgrades

In every release, we update the Engine to support the latest SDK releases from platform partners. 


  • IDE Version the Build farm compiles against
    • Visual Studio:  Visual Studio 2017 v15.6.3 toolchain (14.13.26128) and Windows 10 SDK (10.0.12699.0)
      • Minimum supported versions
        • Visual Studio 2017 v15.6
        • Visual Studio 2015 Update 3
    • Xcode:  Xcode 9.4
  • Android:  
    • NDK 12b (New CodeWorks for Android 1r6u1 installer will replace previous CodeWorks for Android 1R5 before release, still on NDK 12b)
  • HTML5: Emscripten 1.37.19
  • Linux: v11_clang-5.0.0-centos7
  • Lumin: 0.12.0
  • Steam: 1.39
  • SteamVR: 1.39
  • Oculus Runtime: 1.25
  • Switch:
    • SDK 4.5.0 + optional NEX 4.2.1 (Firmware 4.1.0-1.0)
    • SDK 5.3.0 + optional NEX 4.4.2 (Firmware 5.0.0-4.0)
    • Supported IDE: VS 2015 / 2017
  • PS4:
    • 5.508.031
    • Firmware Version 5.530.011
    • Supported IDE: Visual Studio 2015, Visual Studio 2017
  • Xbox One (XB1, XB1-S, XB1-X):
    • XDK: April 2018
    • Firmware Version: April 2018 (version 10.0.17133.2020)
    • Supported IDE: Visual Studio 2017
  • macOS: SDK 10.13
  • iOS: SDK 11
  • tvOS: SDK 11

To view the full list of release notes, visit our forum or docs pages.