Unreal Engine 4.19 enables you to step inside the creative process – the tools become almost transparent so you can spend more time creating. Improvements to rendering, physics, Landscape terrain, and many more systems mean you can build worlds that run faster than ever before. The quality of life for the developers using our tools is always top of mind, so we continue to look at areas we can improve to put the power into the developers’ hands.
Whether you are creating games, linear media, architectural visualizations, or product design tools, Unreal Engine 4.19 enables you to know exactly what the finished product will look like every step of the way. The new Live Link plugin seamlessly blends workflows between external content creation tools and Unreal Engine so you can see updates as you make changes to source content. And with the continued improvements to Sequencer, you can be the director with even more control of your scenes in real time.
When it comes to bringing the worlds in your imagination to life, the sky's the limit. Create breathtaking vistas in large, open worlds thanks to Landscape rendering optimizations. With the new Dynamic Resolution feature that adjusts the resolution as needed to achieve desired frame rates, those worlds will run smoother than ever before on PlayStation 4 and Xbox One.
It wouldn’t be a complete release without a mountain of workflow and usability improvements, and Unreal Engine 4.19 does not disappoint in this respect. Working with Material layers and parameters is easier and more intuitive. Features for debugging Blueprints are more powerful with the ability to step over, step into, and step out. You can now save content folders as favorites. Animation tools have been improved with pinnable commands, ability to have multiple viewports, and lots more.
In addition to all of the updates from Epic, this release includes 128 improvements submitted by the incredible community of Unreal Engine developers on GitHub! Thanks to each of these contributors to Unreal Engine 4.19:
Adam Rehn (adamrehn), Alex Chi (alexchicn), Alexander Stevens (MilkyEngineer), alexformosoc, Andrew Armbruster (aarmbruster), Anton Rassadin (Antonrr), Austin Pukasamsombut (AustinPuk-Conffx), Bo Anderson (Bo98), Carlos (carloshellin), Carsten Neumann (cneumann), Celso Dantas (celsodantas), Chris Conway (Koderz), Chris Dietsch (cdietschrun), Chrispykins, Clinton Freeman (freemancw), cmheo, CuoXia (shrimpy56), Daniel Butum (leyyin), Daniel Neel (dneelyep), Daniel Sell (djsell), David Payne (dwrpayne), DavidNSilva, Deep Silver Dambuster Studios (DSDambuster), Derek van Vliet (derekvanvliet), DerelictHelmsman, dfb, Dorgon Chang (dorgonman), druhasu, Eddy Luten (EddyLuten), foollbar, GeorgeR, Giperionn, Hyuk Kim (Hybrid0), Ilya (ill), Jack Knobel (jackknobel), Jacob Nelson (JacobNelsonGames), James Horsley (mmdanggg2), Jared Taylor (Vaei), Javad Shafique (JavadSM), jessicafalk, John W. Ratcliff (jratcliff63367), Jørgen P. Tjernø (jorgenpt), Josef Gluyas (Josef-CL), Julien Le Corre (jlc-innerspace), konflictue, Kryofenix, Kyle Langley (Vawx), Marcin Gorzel (mgorzel), Matt Hoffman (LordNed), Matthew Davey (reapazor), Michael Allar (Allar), Michał Nowak (BibbitM), Mikael Hermansson (mihe), mkirzinger, Mustafa Top (MSTF), nakapon, nullbus, pfoote, phlknght, projectgheist, Richard Schubert (Hemofektik), Ron Radeztsky, rlefebvre, ruffenman, sangpan, Sébastien Rombauts (SRombauts), smt923, Stefan Zimecki (stefanzimecki), Stephen Swires (sswires), Tim Niederhausen (timniederhausen), Timothee Besset (TTimo), Tom Ward (tomwardio), tomari, TomFors, Toru Hisai (torus), Vikram Saran (vikhik), yaakuro, Yohann Martel (ymartel06), YuchenMei, Zeno Ahn (zenoengine)
New: Temporal Upsampling
We have added a new upscaling method called Temporal Upsample that performs both a temporal accumulation of the frame at lower resolution and primary spatial upscale, reducing output blur.
1 – Temporal Upsample Enabled; 2 – Temporal Upsample Disabled
In previous versions, Screen Percentage enabled you to render the 3D scene at a lower resolution and spatially upscale it to the output resolution before the UI was drawn on top. This was a very flexible way to hit a GPU budget on less powerful hardware, but it forced a tradeoff between output sharpness and GPU performance.
To replace that single screen percentage, we now offer two separate screen percentages used for upscaling:
- Primary screen percentage that by default will use the spatial upscale pass as before;
- Secondary screen percentage that is a static, spatial-only upscale at the very end of post processing, before the UI draws.
The temporal upscaler, which runs in the Temporal Anti-Aliasing (TAA) pass, enables consistent geometry sharpness across primary screen percentages ranging from 50% up to 200%. In effect, even though the screen percentage may be lowered, with TAAU enabled, geometry in the background that would usually become muddy or blend together can now maintain its detail and complexity, like the fence and telephone pole in the example above.
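Both screen percentages, as well as the temporal upscaler itself, are driven by console variables, so they can be set from a config file or the in-game console. Below is a sketch of what this might look like; the variable names are as we understand them in 4.19, so verify the exact set against your build's console:

```
; DefaultEngine.ini – a sketch, not a definitive configuration
[SystemSettings]
r.TemporalAA.Upsampling=1                      ; perform the primary upscale in the TAA pass (TAAU)
r.ScreenPercentage=70                          ; primary screen percentage (3D scene resolution)
r.SecondaryScreenPercentage.GameViewport=100   ; spatial-only upscale applied before the UI draws
```

With TAAU enabled, lowering `r.ScreenPercentage` trades GPU cost for temporal accumulation work rather than raw output blur, which is what preserves distant detail like the fence in the comparison above.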
For additional information, see the Screen Percentage with Temporal Upsample page.
New: Dynamic Resolution
We now have support for Dynamic Resolution, which adjusts the resolution as needed to achieve a desired framerate, for games on PlayStation 4 and Xbox One! This works by using a heuristic to set the primary screen percentage based on the previous frames' GPU workload.
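Dynamic Resolution is also controlled through console variables. As a sketch (variable names as we understand them in 4.19; check your build for the exact names and defaults):

```
; A sketch of enabling Dynamic Resolution on a supported console
[SystemSettings]
r.DynamicRes.OperationMode=1        ; 1 = follow game user settings, 2 = force enabled
r.DynamicRes.FrameTimeBudget=33.3   ; target GPU frame time in milliseconds (~30 fps)
r.DynamicRes.MinScreenPercentage=50 ; lowest primary screen percentage the heuristic may choose
r.DynamicRes.MaxScreenPercentage=100; highest primary screen percentage the heuristic may choose
```

Because Dynamic Resolution drives only the primary screen percentage, the secondary screen percentage and UI resolution remain fixed.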
Support for additional platforms will be coming in a future release. For additional information, see the Dynamic Resolution page.
New: Unified Unreal AR Framework
With this release of Unreal Engine, the Unified Unreal Augmented Reality Framework provides a rich framework for building Augmented Reality (AR) apps for both Apple and Google handheld platforms, allowing developers to target both platforms using a single code path. The Unified Unreal AR Framework includes functions supporting Alignment, Light Estimation, Pinning, Session State, Trace Results, and Tracking.
Courtesy of Theia
New: Unified Unreal AR Framework Project Template
Also new is the Blueprint template HandheldAR, which provides a complete example project demonstrating the new functionality.
New: Physical Light Units
New to Unreal Engine 4.19, all light units are now defined using physically based units. Some light units were already well defined, but others were using undefined, engine-specific units. The unit selection of a light is done through a drop-down menu (where applicable). For compatibility reasons, the default light units are kept compatible with previous versions of the engine. The new light unit property can be edited per light, changing how the engine interprets the "Intensity" property when doing lighting-related computations.
Project courtesy of Litrix
New: Live Link Plugin Improvements
The Maya Live Link Plugin is now available and can be used to establish a connection between Maya and UE4 enabling you to preview changes made in Maya in real-time inside the UE4 Editor.
To set it up, copy the binary files associated with your version of Maya to your Maya Plugins folder, then enable the plugin through the Maya Plugin Manager.
You can find pre-built binaries of the Maya Live Link Plugin inside your Engine installation folder in the following location: Engine\Extras\MayaLiveLink\LiveLinkMaya.zip. Inside the zip file are binaries for Maya 2016/2017 and 2018 for Windows. If you require binaries for other versions, the source code for the Plugin can be found in the Engine\Source\Programs\MayaLiveLinkPlugin folder which can be used to build the binaries.
The Maya Live Link UI window can be enabled through the MEL console with the command MayaLiveLinkUI. At the top right is a display that shows whether this instance of Maya is connected to an Unreal client. Below Unreal Engine Live Link is a list of all the subjects currently being streamed (in the image above, only one is being streamed), and in the lower window, controls for adding and removing subjects from streaming are available.
Motionbuilder Live Link Plugin
The Motionbuilder Plugin offers the same functionality as the Maya Plugin and shows up in the Editor as a connection in a similar way. It also has a custom UI for managing streaming:
Objects can be selected from the current scene and added to the streamed list (as shown above). From there, their names can be set in the Subject Name column and their Stream Type (Camera, Skeleton, etc.) can be set. Streaming on the subject can also be enabled and disabled from here.
Stream Active Camera as Editor Active Camera
Controlling the active camera within Maya will now manipulate the Editor’s active camera. We’ve also reworked the Editor update hook so that streaming is more robust and facial rigs/leaf bones now correctly update.
Added Virtual Subjects to Live Link
Virtual subjects are a way to combine multiple subjects coming into Live Link into one subject that can then be used by the Editor. Virtual Subjects are created within the client and contain the bones of multiple real subjects, all tied to a common root.
Below is the virtual subject being applied to a skeletal mesh in the editor.
Here we have created a new virtual subject (using Add > Add Virtual Subject) and set it to read subjects from the Maya Live Link source. The virtual subject is set to consist of the three Item subjects being sent from Maya.
Live Link Plugin Development
The purpose of Live Link is to provide a common interface for streaming and consuming any kind of animation data from sources outside of UE4 (for example, DDC tools and Motion Capture servers). It is designed to be extensible via Unreal Plugins, allowing third parties to develop new features with no need to make and then maintain Engine changes.
There are two paths for integrating in Live Link:
- Building an Unreal Engine Plugin that exposes a new Source to Live Link. This is the recommended approach for anyone that already has their own streaming protocol.
- Integrating a Message Bus end-point in third party software to allow it to act as a data transmitter for the built-in Message Bus Source. This is the approach we have taken for our Maya and MotionBuilder Plugins.
For an overview of general Plugin development, see the Plugins documentation pages.
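For the first integration path above, a plugin implements the Live Link source interface and pushes subject data to the client it is handed. The following is a minimal sketch only; the method names are based on the 4.19 `ILiveLinkSource` interface, but treat the exact signatures as illustrative and verify them against `LiveLinkSourceInterface.h` in your engine version:

```cpp
// A sketch of a custom Live Link source for an existing streaming protocol.
// Engine-dependent: compiles only inside a UE4 plugin module with the
// LiveLinkInterface dependency. Signatures are illustrative.
#include "ILiveLinkSource.h"
#include "ILiveLinkClient.h"

class FMyProtocolSource : public ILiveLinkSource
{
public:
	virtual void ReceiveClient(ILiveLinkClient* InClient, FGuid InSourceGuid) override
	{
		Client = InClient;
		SourceGuid = InSourceGuid;
		// Open your protocol's connection here; as frames arrive, push
		// skeleton definitions and frame data to Client for each subject.
	}

	virtual bool IsSourceStillValid() override { return Client != nullptr; }
	virtual bool RequestSourceShutdown() override { Client = nullptr; return true; }

	virtual FText GetSourceType() const override { return NSLOCTEXT("MySource", "Type", "MyProtocol"); }
	virtual FText GetSourceMachineName() const override { return NSLOCTEXT("MySource", "Machine", "localhost"); }
	virtual FText GetSourceStatus() const override { return NSLOCTEXT("MySource", "Status", "Active"); }

private:
	ILiveLinkClient* Client = nullptr;
	FGuid SourceGuid;
};
```

A source like this is then registered with the Live Link client so it appears alongside the built-in Message Bus source in the Live Link window.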
Live Link Motion Controller Support
Live Link can now be used with Motion Controllers. The motion source property of a Motion Controller can be set to a subject within Live Link. When set in this way, the position of the Motion Controller Component is governed by the first transform of the subject.
The Motion Controller integration can also access custom parameters on the Live Link subject. These are passed via the curve support built into Live Link subjects.
To access the values, it is necessary to derive a new Blueprint from MotionControllerComponent and override the OnMotionControllerUpdated function. During OnMotionControllerUpdated, it is valid to call GetParameterValue on the Motion Controller.
Here is an example of a possible way to drive a light component's Intensity from an Intensity parameter in Live Link:
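The release example above is a Blueprint, but the same idea can be sketched natively. Everything below other than `OnMotionControllerUpdated` and `GetParameterValue` (both named in the text above) is illustrative: the `"Intensity"` parameter name, the light reference, and the exact `GetParameterValue` signature are assumptions to verify against your engine source:

```cpp
// A sketch (not the shipped example): drive a light's intensity from a
// Live Link curve parameter inside a MotionControllerComponent subclass.
#include "MotionControllerComponent.h"
#include "Components/PointLightComponent.h"
#include "MyMotionController.generated.h"

UCLASS()
class UMyMotionController : public UMotionControllerComponent
{
	GENERATED_BODY()

public:
	UPROPERTY(EditAnywhere)
	UPointLightComponent* TargetLight = nullptr; // hypothetical light to drive

	virtual void OnMotionControllerUpdated() override
	{
		Super::OnMotionControllerUpdated();

		// Per the docs, GetParameterValue is only valid during
		// OnMotionControllerUpdated; assumed signature shown here.
		bool bFound = false;
		const float Intensity = GetParameterValue(TEXT("Intensity"), bFound);
		if (bFound && TargetLight)
		{
			TargetLight->SetIntensity(Intensity);
		}
	}
};
```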
- Added ability for Live Link Sources to define their own custom settings – This was requested by partners, such as IKinema, that are building Live Link support.
- Live Link Sources pushing skeletons to Live Link Client can now pass source GUID as well – Message bus source now pushes GUID when sending skeleton.
- Added virtual Initialization function and Update DeltaTime parameter to Live Link Retargeter API
New: Sequencer Improvements
We continue to make improvements to the functionality and workflow of Sequencer to make it more powerful and increase efficiency.
Copy/Paste/Duplicate Object Tracks
You can now copy/paste/duplicate object tracks and their child tracks from the right-click context menu. You can copy a spawnable to the same Level Sequence or to another Sequence, or copy a possessable from one Level Sequence to another, and the object will be bound to the same object in the target Level Sequence. You can now also copy from one UMG animation to another.
Level Sequence Dynamic Transform Origin
You can now offset the Actors controlled by Sequence Transform tracks with a global offset at runtime. This allows you to also reuse a Level Sequence in different coordinate spaces. To use this feature, inside the Details panel for your Level Sequence, enable Override Instance Data then assign a Transform Origin Actor.
The Transform Origin section specifies a transform that all absolute transform sections inside the Sequence are added to (Scale is ignored). For the best results, keyframe your Actor's starting transform at 0,0,0 inside your Level Sequence and let the Transform Origin Actor you define drive the position from which to start in the world (depicted below).
Whenever we press a key (see above), the cube moves and our Actor continues to walk along a path, starting from the location of the Transform Origin Actor.
Sequencer Anim BP Weight Control
Sequencer weight blending now works with Animation Blueprints. You can use the same slot for the animation, and control weight by curve. Assign the same Slot Name under Properties for each animation to blend. Then keyframe the Weight values you desire. Inside your Anim BP, use the Slot Node. The AnimBP will use whatever Weight values you provided when blending.
Improved Sequencer Editor Performance
- Compile On The Fly Logic – Sequencer is now able to compile partially or completely out-of-date evaluation templates from the source data as it needs. This affords much more efficient compilation when working within Sequencer in the editor.
- Details Panel Updates – The details panel now defers updates while scrubbing/playing in the Sequencer Editor.
- New .ini setting for the default behavior of “When Finished”.
- For level sequences, it continues to default to RestoreState. For UMG, it’s now set to Keep State.
- Always create a camera cut track when adding a camera so that there's less confusion as to why a camera doesn't take control when Sequencer is activated.
- Rotations will no longer flip when autokeying/adding a new key.
- Option to show selected nodes in the tree view only. This can be useful when working with a lot of actors in your level sequence to limit scrolling up and down the view.
- Option to bake transform track to keys at frame intervals.
- Option to create a camera anim from a transform track.
- Ability to bind (path track, camera cut track or attach track) to spawnables in subsequences.
- Can now drag sections up a row, instead of only down.
New: Landscape Rendering Optimization
The Landscape level of detail (LOD) system has been changed so that instead of being distance-based it now uses screen size to determine detail for a component, similar to how the Static Mesh LOD system works. The LOD distribution now gives more coherent sizes on distant triangles based on their screen size, which can maintain detail that was not possible before.
(Left, old method; Right, new method)
You can visualize the Landscape LODs by going to Viewport > Visualizers > LOD.
The following settings are available to control where LODs transition based on screen size. Additionally, material tessellation can now be controlled, which gives an added performance benefit when using a Directional Light with dynamic shadowing enabled.
New: Proxy LOD System (Experimental)
The new Proxy LOD system is an experimental Plugin for producing low-poly LODs with baked materials for multiple meshes. The new system is used by HLOD and is a replacement for Simplygon.
Note: Currently, only Windows builds are supported.
To enable the tool, look for the “Hierarchical LOD Mesh Simplification” option under Editor in the Project Settings dialog. From the Hierarchical LOD Mesh Reduction Plugin, you should be able to select the new module “ProxyLODMeshReduction”. After the prompted editor restart, the Plugin will replace the third party Simplygon tool for static mesh merging LODs. This new Plugin is accessed in two ways: The HLOD Outliner, and the Merge Actors dialog.
Source Geometry: 11 geometry pieces, 38,360 tris, and 27,194 verts | Result Geometry Proxy: 1 geometry piece, 8,095 tris, and 5,663 verts
New: Material Parameters Editing and Saving
You can now save parameter values to a new Child Instance or Sibling Instance in the Material Editor and the Material Instance Editor!
- Save to Sibling Instance enables you to save the current parameters and overrides to a new Material Instance with the same Parent Material.
- Save to Child Instance enables you to save the current parameters and overrides to a new Material Instance with the current instance as the Parent Material.
The Material Editor now features a Parameter Defaults panel. Here, you’ll have access to the default values set in your Material Graph for any parameter. You can easily change any of your parameter default values here.
New: Material Layering (Experimental)
Material Layering enables you to combine your Materials in a stack, using the new Material Layer and Material Layer Blend assets! This enables you to build the correct Material Graph without building sections of nodes by hand. This functionality is similar to Material Functions, but supports the creation of child instances.
Material Layering is experimental and can be enabled in the Project Settings > Rendering > Experimental section by setting Support Material Layers to true.
For additional information, see our post on the forums to get started and leave us feedback!
New: Upgraded 'Auto Convex' Tool
You can now specify Max Hulls instead of accuracy when generating simplified collision for a Static Mesh using the Auto Convex Tool. Computation is now a background task in the Static Mesh Editor. The Auto Convex Tool also uses a newer version of V-HACD library that should give more accurate results.
New: Improved Facial Animation Sharing Using Curve-Only Animations
You can share facial animations between characters by animating curves to drive a pose without any bone transforms. A bone transform authored against one mesh's reference pose won't work for other meshes, so it is important to remove bone tracks and start from each mesh's own reference pose if you want to share curves between different meshes. This allows you to share facial curves between different faces.
We’ve added the ability to allow users to delete all bone tracks, leaving only animation curves. You can access the Remove All Bone Tracks option from the Asset menu under Animation.
You can also use an Animation Modifier to remove only certain joints if you want to share body animations but drive the face with curves. We added the function Remove Bone Animation to remove a bone track by name from the given Animation Sequence, and Remove All Bone Animation to remove all bone track data from the given Animation Sequence (Raw Animation Curves and their Names are not removed from the Skeleton).
We’ve also allowed the opting out of importing bone tracks (curve only animation) during the import process.
We’ve fixed Pose Preview to select weights that the asset contains. You can see what bone tracks the animation contains inside the Asset Details panel under Animation Track Names.
New: Cloth Updates
The new "Anim Drive" feature adds a mask target for "Anim Drive Multipliers" to clothing assets, along with runtime interaction for simulations. The multipliers drive springs that pull clothing toward its animated position, giving control to animators in cinematics or animation-driven scenes.
This can be driven at runtime using the “Interactor” object that Blueprints can access on skeletal mesh components. The value set at runtime is multiplicatively combined with the painted values to get the final spring and damper strengths.
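As a sketch of what driving this from code might look like: the interactor accessor and setter names below are our reading of the 4.19 clothing API and should be verified against `SkeletalMeshComponent.h` and `ClothingSimulationInteractor.h` in your engine version:

```cpp
// A sketch of adjusting Anim Drive at runtime via the clothing interactor.
// Engine-dependent; API names assumed from the 4.19 clothing interface.
#include "Components/SkeletalMeshComponent.h"
#include "ClothingSimulationInteractor.h"

void SetClothAnimDrive(USkeletalMeshComponent* MeshComp, float Spring, float Damper)
{
	if (UClothingSimulationInteractor* Interactor = MeshComp->GetClothingSimulationInteractor())
	{
		// Runtime values are multiplicatively combined with the painted
		// Anim Drive Multiplier mask to produce the final strengths.
		Interactor->SetAnimDriveSpringStiffness(Spring);
		Interactor->SetAnimDriveDamperStiffness(Damper);
	}
}
```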
Other improvements include:
- Gravity override support for clothing simulations – Using the interactor object on a skeletal mesh component, an arbitrary gravity override can be specified.
- Added auto-range feature to cloth view ranges – Also added a way for tools to extend those ranges when necessary
- Improved Clothing Visualizations – Can now be enabled when Clothing Paint Mode is active.
- Clothing Create Menu now defaults to use the Skeletal Mesh Physics Asset
New: Animation Blueprint Improvements
We have made several quality of life improvements to the Animation Blueprints system:
- You now have the ability to control whether or not the post process instance runs on a Skeletal Mesh Component from both Blueprints and the Animation Editor suite
- Copy Pose node now supports copying curves as well as Bone transform – When using this, make sure to enable bCopyCurves if you want to copy curves (by default this is false). When enabled, it will copy the curves that exist in the current Skeleton. If the curve list has changed, it will require the animation system to be reinitialized.
- Added additive warning when playing an animation on an additive pose – When a non-additive animation is connected to the additive pin of an additive animation node (such as Apply Additive), we will now output a warning in the message log with the Animation and Blueprint name.
- Native AnimInstance selection on SkeletalMeshComponent – Allow native Animation Instances to be set on Skeletal Mesh Components. Previously, only Anim Blueprints could be set on Skeletal Mesh Components.
New: Animation Tools Improvements
Added Pinnable Command List to animation editors
Shift-clicking a menu option in either the skeleton tree filter or the viewport menus will 'pin' the command outside of the menu. This allows for easier access to commands that are frequently used.
Added up to 4 separate viewports to all Animation Editor viewports – All animation-related editors now have the ability to open up to 4 separate viewports onto the same scene, each with their own settings.
Added the ability to follow and orbit specified bones
Additional Skeletal Mesh Component Debug Info
You can extend the viewport text seen in the Skeletal Mesh Editor. Lines displaying the current cloth values were also added.
Ability to change Skeletal Mesh section visibility at runtime
Exposed ShowMaterialSection function to hide/show Skeletal Mesh material sections at runtime to Blueprint. Added ShowAllMaterialSections and IsMaterialSectionShown functions. Removed internal 'disable' and 'highlight' flags from asset: always use per-component flags now.
Added "Hide Unconnected Pins" button for BreakStruct nodes
BreakStruct nodes now have a button to quickly hide unconnected pins
In addition, the Animation Tools improvements include:
- Fixed motion blur when changing skeletal mesh properties – The animation system owns the previous bone transform with revision number, so that it can send to the render thread when RenderState is recreated. Frame Number still exists because the clothing system still uses it. Revision number progresses when the component ticks. If it doesn't tick, it will keep the same revision number (keeps history).
- Exposed Sequence Recording Settings in Animation Editors
- Ability to name collision shapes – You can now name individual collision shapes when editing a Physics Asset.
- Improved bounds following in Animation Editor Viewports – Bounds following now follows without lag.
- Added ability to change the Retarget Source inside a Pose Asset – Pose Assets now save the original Source Pose so that it can recalculate a new pose when Retarget Source changes. Retarget Source is used to bake the pose asset delta transform, so you will need to make sure you have the right Retarget Source if you are using the same asset for different meshes.
- Configurable mesh paint vertex size – Added mesh paint settings to per-user settings, and added preview vertex size, so that it's configurable by the user for use on different-size meshes.
- Changed animation preview tooltips in Persona to stop loading the animation if it is not already loaded
- Preview buttons are now present where applicable in animation editors – If relevant, Preview Mesh, Preview Animation and Reference Pose are all editable via the toolbar.
- Added Search bar to Advanced Preview Scene settings panel
- Added Context Menu to HLOD level nodes in HLOD outliner treeview – This allows creating a new cluster from the current level viewport selection.
- Added a hotkey and command for toggling Post Processing in Preview Scenes
- Added a new shortcut for switching foreground and background colors in the Mesh Painter – By default the shortcut is bound to the ‘X’ Key when the Mesh Painter is active.
- Material Baking has been moved out of the experimental phase
- Added support for RGBA Masks in Mesh Painter – This allows the use of RGBA channels as masks while using the fill tool inside of the Mesh Painter.
- Added 'GetSectionFromProceduralMesh' utility function for ProceduralMeshComponent
- Notifies on follower animations in a sync group – Added flag to allow notifies to be triggered on follower animations in a Sync Group.
- Reduced Logging verbosity when cooking all assets – Only log 2 lines per asset instead of 4.
- Exposed Damping and Stiffness as pins on Spring AnimNode – Spring Anim Node now allows exposing the Spring Damping and Spring Stiffness as pins.
New: Motion Controller Component Visualization
Motion Controllers in Unreal Engine 4.19 now have a new Visualization category that enables you to quickly (and more easily) add a display model Static Mesh to your Motion Controllers. By default, the system attempts to load a Static Mesh model matching the device driving the Motion Controller.
- Motion Controller components now have new visualization fields that offer the following options: Display Device Model, Display Model Source, Custom Display Mesh, and Display Mesh Material Overrides.
- Custom Display Mesh is only available when Display Model Source is set to “Custom”.
- For more complicated, custom models, manually attach your own Static Mesh model.
- ‘Display Model Source’ can be manually set if the target XR system isn’t loaded.
- By default, when empty, ‘Display Model Source’ will attempt to load a model from the backing XR system.
- Distinct XR systems support this feature by implementing the IXRSystemAssets interface – currently, only Oculus and SteamVR support this functionality.
New: Vive Pro Support
All existing UE4 content that supports SteamVR is compatible with HTC’s newly-announced Vive Pro! While no modifications are necessary to use the Vive Pro with your existing UE4 project, you will need to check the performance of your UE4 project with the Vive Pro and optimize accordingly as the Vive Pro operates at a higher resolution.
New: Platform SDK Upgrades
In every release, we update the engine to support the latest SDK releases from platform partners.
- IDE Version the Build farm compiles against
  - Visual Studio: Visual Studio 2015 Update 3
  - Xcode: Xcode 9.2
- Android: NDK 12b (New CodeWorks for Android 1r6u1 installer will replace previous CodeWorks for Android 1R5 before release, still on NDK 12b)
- HTML5: emscripten toolchain 1.37.19
- Linux: v11_clang-5.0.0-centos7
- Steam: 1.39
- Oculus Runtime: 1.17
- SteamVR: 1.39
- Switch: 3.7.0 with NEX: 4.3.1 (optional), 4.5.0 with NEX 4.3.1 (optional); Firmware Version 3.0.1-1.0
- PlayStation 4:
  - Firmware Version 4.730.001
  - Supported IDE: Visual Studio 2015, Visual Studio 2013
- Xbox One (XB1, XB1-S, XB1-X):
  - XDK: June 2017 QFE9
  - Firmware Version: October 2017 Preview 3 (version 16291.1000.170913-1900)
  - Supported IDE: Visual Studio 2017
- macOS: SDK 10.13
- iOS: SDK 11
- tvOS: SDK 11
New: Low Input Latency Frame Syncing (Xbox One and PlayStation 4)
Low-latency frame syncing mode modifies the way thread syncing is performed between the game, rendering and RHI threads, and the GPU to greatly reduce and control input latency. The effects of Low-latency frame syncing can be controlled using the r.GTSyncType (game thread sync type) console variable. This console variable drives how the new frame syncing mechanism works and allows the following values:
- 0 – Game thread syncs with rendering thread (old behavior, and default)
- 1 – Game thread syncs to the RHI thread (equivalent to UE4 before parallel rendering)
- 2 – Game thread syncs with the swap chain present +/- an offset in milliseconds (only available on supported platforms, and only works when vsync is enabled)
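For example, opting into the lowest-latency mode described above might look like the following config sketch (mode 2 requires vsync, per the list above; verify cvar names against your build):

```
; DefaultEngine.ini – a sketch of enabling low-latency frame syncing
[SystemSettings]
r.GTSyncType=2   ; sync the game thread to swap chain present (supported platforms only)
r.VSync=1        ; mode 2 only works when vsync is enabled
```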
For additional information, see the Low Latency Frame Syncing page.
New: Encryption/Signing Key Generator
It is now possible to configure .pak file signing and encryption fully from the Editor via a single unified settings panel. These features can now be controlled in Project Settings, under the Project section, in the Crypto panel.
From this panel, it is possible to generate or clear keys for either signing or encryption, as well as enable or disable the various types of signing and encryption that are available. The following features can be activated or deactivated from this menu:
- Encrypt Pak .ini Files – Encrypts all .ini files that exist in your project’s .pak files. This will prevent easy mining or tampering with configuration data for your product, at minimal runtime cost.
- Encrypt Pak Index – Encrypts the pak file index, which prevents UnrealPak from opening, viewing, and unpacking your product’s pak files, at minimal runtime cost.
- Encrypt UAsset Files – Encrypts the uasset files in the pak file. uasset files contain header information about the assets inside, but not the actual asset data itself. Encrypting this data provides extra security to your data, but does add a small runtime cost and an increase in data entropy which can have a negative impact on patch sizes.
- Encrypt Assets – Fully encrypts all Assets within the pak file. Please note that this setting will have a measurable effect on run-time file I/O performance, and increases the amount of entropy in your final packaged data, making the distribution patching system less efficient.
New: Call Stack Display for Blueprint Debugging
A control that displays the current Blueprint Call Stack has been added to the Developer Tools menu. Double-clicking entries in the Call Stack will focus the corresponding node in the Blueprint Editor.
New: Single stepping improvements to Blueprint Debugging
The Engine’s Blueprint Debugger now supports Step Into (F11), Step Over (F10), and Step Out (Alt+Shift+F11) operations when stopped at a Blueprint Breakpoint, making debugging sessions quicker and easier.
In addition, we have added the ability to break directly on, and step into, over, or out of, macros and collapsed graph nodes:
New: Improved Tools For Optimizing Disk Size
Several tools have been added or improved to help analyze disk and download sizes for packaged games. The Asset Audit window has a new “Add Chunks” option which shows all packaged Chunk/Pak files, and the Reference Viewer and Size Map can be used from that view to analyze why Assets are being cooked into a specific chunk.
The Size Map tool has been changed to show Disk Size by default, which is more accurate than the estimated memory size. If a platform is selected in the Asset Audit window, the actual cooked size for that platform will be shown. The Size Map’s user interface has also been improved to support right-click operations and better performance.
New: Audio Improvements
Upgrade to Steam Audio Beta 10 (New Audio Engine Only)
Along with the update to Steam Audio (Beta 10 Release), improvements include:
- Android support for spatialization and occlusion
- Editor mode for easy access to Steam Audio functionality
- Buttons to add and remove Phonon Geometry Components to all Static Mesh Actors
- Button to export the scene to .obj for debugging
- Removed Phonon Scene actor in favor of export to file
- Probe generation and baking writes data to disk, to avoid slow loading and saving of map files
- Numerous information display improvements
- Numerous bug fixes
Oculus Audio Updates (New Audio Engine Only)
The Oculus Audio plugin has been updated with support for Android.
In addition to providing HRTF spatialization, we now support Oculus Audio’s room modeling reverb with volumetric parameters as well as Ambisonics Source playback.
Added Resonance Audio Plugin (New Audio Engine Only)
The Resonance Audio plugin for Unreal is now included as part of the Engine Plugins in UE 4.19. The Resonance Audio Plugin supports binaural spatialization, reverberation, sound directivity, and ambisonics playback on Windows, iOS, Android, Linux, and Mac.
First-Order Ambisonics File Support (Playback supported through plugins)
First-Order Ambisonics files can now be imported into Unreal projects and played back using select spatialization plugins. First-Order Ambisonics is a spherical surround format that captures the forward, right, and upward axes of a sound field, allowing designers to create surround sound experiences that include vertical sound orientation. This is useful for VR, where users experience natural changes in the vertical sound field by tilting their heads, or for any other interactive experience where user orientation can change in the vertical. Users can import four-channel First-Order Ambisonics files by including the FuMa or AmbiX tag (the two First-Order Ambisonics channel configurations) in the file name. Oculus Audio and Google Resonance both provide Ambisonics playback implementations.
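The FuMa and AmbiX tags matter because the two conventions order and scale the four channels differently: FuMa stores W, X, Y, Z with W attenuated by 1/√2, while AmbiX uses ACN ordering (W, Y, Z, X) with SN3D normalization. A minimal sketch of the first-order conversion (the helper name is illustrative, not engine API):

```cpp
#include <array>
#include <cmath>

// Hypothetical helper (not part of UE4): converts one first-order
// Ambisonics sample frame from FuMa (W,X,Y,Z; W scaled by 1/sqrt(2))
// to AmbiX (ACN order W,Y,Z,X with SN3D normalization).
std::array<float, 4> FuMaToAmbiX(const std::array<float, 4>& FuMa)
{
    const float W = FuMa[0], X = FuMa[1], Y = FuMa[2], Z = FuMa[3];
    // For first order, only W's gain differs between the two conventions;
    // the remaining channels just change position (WXYZ -> WYZX).
    return { W * std::sqrt(2.0f), Y, Z, X };
}
```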
Improvements to UE4's native VOIP implementation (New Audio Engine Only)
Our native VOIP implementation has been improved to provide better reliability, including improved silence detection on Windows. Additionally, VOIP sources can now be brought into the scene, allowing for:
- Distance Attenuation
- Custom Source Effects
All controlled using the UVoipTalker scene component.
New microphone capture component (New Audio Engine Only) (Windows Only)
We have implemented a Microphone Capture Component which allows:
- Feeding audio directly into a game for use with the Audio Mixer
- Driving in-game parameters using real-time amplitude data.
- Processing audio with Source Effects or Submix Effects.
Currently, this is only for local client microphone capture and is not connected to the VoIP system.
While it doesn't support recording microphone output directly to assets at the moment, there are opportunities for third-party plugin developers to take advantage of data from a microphone input.
Get real-time audio amplitude as Blueprint data (New Audio Engine Only)
Added the ability to get amplitude envelope data directly from Audio Components and Synth Components through Blueprint delegates.
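The envelope data delivered through those delegates is the kind of smoothed amplitude a simple envelope follower produces. A minimal sketch of that idea, assuming a standard attack/release one-pole smoother (class name and coefficients are illustrative, not engine code):

```cpp
#include <cmath>

// Minimal sketch of an amplitude envelope follower, similar in spirit to
// the per-source amplitude data the engine surfaces to Blueprints.
class FEnvelopeFollower
{
public:
    FEnvelopeFollower(float SampleRate, float AttackMs, float ReleaseMs)
        : AttackCoef(std::exp(-1.0f / (SampleRate * AttackMs * 0.001f)))
        , ReleaseCoef(std::exp(-1.0f / (SampleRate * ReleaseMs * 0.001f)))
    {
    }

    // Feed one audio sample; returns the current smoothed amplitude.
    float Process(float Sample)
    {
        const float Rectified = std::fabs(Sample);
        // Track rising input with the fast attack coefficient and
        // falling input with the slower release coefficient.
        const float Coef = (Rectified > Envelope) ? AttackCoef : ReleaseCoef;
        Envelope = Coef * Envelope + (1.0f - Coef) * Rectified;
        return Envelope;
    }

private:
    float AttackCoef;
    float ReleaseCoef;
    float Envelope = 0.0f;
};
```

In-engine, a delegate would hand you values like these per buffer; here the follower is fed sample by sample for clarity.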
Pre and Post Source Effect Bus Sends Types (New Audio Engine Only)
We have added Pre- and Post- Source Effect Bus Sends types allowing designers to send Source audio to Source Busses both before and after Source Effect and Attenuation processing.
Added support for Multichannel file import
We now support importing interleaved multichannel wav files directly into the Content Browser (in addition to the legacy method of using multiple mono files).
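"Interleaved" here means the file stores samples frame by frame (ch0, ch1, ..., chN-1, then the next frame), rather than one whole channel after another. A small sketch of splitting such a buffer into per-channel streams (helper name is illustrative, not engine API):

```cpp
#include <cstddef>
#include <vector>

// Splits an interleaved multichannel buffer into one vector per channel.
std::vector<std::vector<float>> Deinterleave(const std::vector<float>& Interleaved,
                                             std::size_t NumChannels)
{
    std::vector<std::vector<float>> Channels(NumChannels);
    const std::size_t NumFrames = Interleaved.size() / NumChannels;
    for (std::size_t Ch = 0; Ch < NumChannels; ++Ch)
    {
        Channels[Ch].reserve(NumFrames);
        for (std::size_t Frame = 0; Frame < NumFrames; ++Frame)
        {
            // Channel Ch's sample sits at offset Ch within each frame.
            Channels[Ch].push_back(Interleaved[Frame * NumChannels + Ch]);
        }
    }
    return Channels;
}
```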
Sample rate control on synths
We have added the ability for synthesizers to modify their requested sample rate on initialization allowing synth sources to specify their own sample rate.
Improved cross-platform audio plugins (New Audio Engine Only)
We now allow multiple audio plugin source settings, so users can have different settings for different plugins. This is especially useful for designers working on multi-SKU products where different plugins are used on different platforms.
Added a Panner Source Effect
This Source Effect allows designers to drive stereo panning values from Blueprints. It can change the pan of any mono or stereo, 2D or 3D source.
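A panner of this kind typically maps a pan value to left/right gains with an equal-power law, so perceived loudness stays constant as the sound moves across the stereo field. A hedged sketch of that mapping (names are illustrative, not engine API):

```cpp
#include <cmath>
#include <utility>

// Equal-power stereo pan law. Pan is in [-1, 1]:
// -1 = hard left, 0 = center, 1 = hard right.
// Returns {LeftGain, RightGain}.
std::pair<float, float> EqualPowerPan(float Pan)
{
    // Map pan to an angle in [0, pi/2] and use cos/sin so that
    // LeftGain^2 + RightGain^2 == 1 at every pan position.
    const float Angle = (Pan + 1.0f) * 0.25f * 3.14159265f;
    return { std::cos(Angle), std::sin(Angle) };
}
```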
New: Physics Improvements
Several improvements and optimizations have been made to the physics system to make simulating physical objects easier and more efficient!
- Added “Show Only Selected Constraints” to the physics asset editor
- Added Ctrl+T keyboard shortcut to toggle between Body and Constraint Selection
- Added “Set Use CCD” function for changing Continuous Collision Detection option at run-time
- Saved BodyInstance runtime memory – Removed FBodyInstance::ResponseToChannels_DEPRECATED property from non-Editor Builds to save memory.
- Added functions to SpringArmComponent to get relevant status of the collision test.
- Added the ability to profile scene query hitches – Using p.SQHitchDetection console variable
- Exposed some PhysicsAsset functionality to be more extensible in C++
- Improved information in HitResult in case of initial overlap – ImpactPoint will now contain information from PhysX on deepest overlap point.
- Engine now compiles cleanly without PhysX – Can set WITH_PHYSX define directive to 0 and compile without errors.
New: Automatic Code Signing on iOS
Automatic Code Signing (ACS) when using Xcode is now supported in UE 4.19. With ACS and a Team ID, a package can be created and signed for an iOS project even when no mobile provisions are pre-installed on the Mac, though the Mac must still contain the signing certificates. A development team will still need to add iOS devices to their Apple account in order to deploy their apps, but with this feature they no longer need to download and install mobile provisions onto the Mac.
New: Android 4.19 Improvements
- Added Proxy network support.
- Added a new pop-up box that informs you if the mobile project you are running is using an unsupported texture format, and tells you which texture formats are supported.
- Added NDK and SDK platform validation before you compile and package your project, so you find out sooner whether the build will fail.
New: Folder Favorites for the Content Browser
You can now mark any folder in your Content Browser as a “favorite” to make it easily accessible! When a folder is marked as a favorite, it will populate under the new “Favorites” section located above the folder tree. Simply right-click to add or remove folders from this section.
The “Favorites” section of the Content Browser can be enabled/disabled using the View Options for Show Favorites. This feature is enabled by default.
The full list of release notes is available on the forums.