How Evasion Pushes VR Shooters Forward with Innovative Combat and High Production Values

While there are many VR shooters, Evasion, by developer Archiact, incorporates numerous elements that make it stand out. The sci-fi space shooter offers an action-packed campaign with four playable character classes, innovative combat, full-body inverse kinematics, and high production values coupled with online co-op gameplay. Powered by Unreal Engine 4, Evasion does all of this with a level of polish that is rare in a VR game made by an indie studio. We got a chance to interview several members of Archiact to learn more about how they were able to create one of VR’s most compelling shooters.

In approaching the design of a VR title, the company thought hard about what makes the medium unique and how to leverage its strengths to make something gripping. Lead Game Designer Ian Rooke asserted, “The biggest difference is that in VR, you get to use your body to physically move in your space. You can dodge, duck and use all your reflexes instead of just your thumb dexterity. So developing a shooter in VR means you want to design for this gameplay. The more players get to move, the more immersive it becomes.”

While VR introduces a heightened sense of immersion coupled with new mechanics, Rooke notes that it poses new development hurdles: “There are also many challenges to overcome. You are always mindful of frame rate and camera motion to ensure players don’t get sick, and you want to try to make sure that players’ movements in game match one-to-one with their body movements. If they swing their arm, they expect that to match perfectly in game,” Rooke explains. Failing to do so can make combat feel clunky and break immersion. The lead game designer continues, “This can be tricky in situations where players are dual-wielding two controllers, but in-game, they’re holding a two-handed weapon, or in melee games, when a player slashes a solid object, nothing stops their real arm’s motion, but in-game you’d expect the blade to meet some resistance on impact.” Rooke adds, “So there’s lots of prototyping and trial and error. This is not that different than traditional console development, but it can be a longer process before you’re happy with your mechanics, and you might have to go back to the drawing board more often than you’d prefer.”

Infusing Influence

Combining time-tested gameplay with modern tech, Evasion draws inspiration from arcade classics like Galaga and Space Invaders. “It was the concept of dodging and blocking projectiles in VR that we liked. We didn’t want to simply soak up damage from instant-hit weapons. It’s really fun to navigate a hail of lasers flying your way. So we looked at old-school shooters as well as more modern bullet-hell games for inspiration,” Rooke stated, adding, “This gameplay marries well with high-intensity, fast-paced shooter combat featured in games like Doom and Destiny. The idea is to throw overwhelming odds at you while providing you with over-the-top weapons to fend off the swarms of enemies.”

Players will be able to wield these over-the-top sci-fi weapons as one of four “Vanguard” classes, which are basically elite super soldiers. As Rooke notes, “You’re almost unstoppable as most enemies on their own do not provide a big challenge,” but the adage “strength lies in numbers” certainly applies here with Rooke adding, “there are so many of them and they’re relentless.” 

Block Party

In prototyping the insect-like alien enemies, known as the Optera, Archiact borrowed a page from VR shooter Space Pirate Trainer by having a few flying drones shoot projectiles at players. Rooke adds, “Then we thought it would be fun to not only dodge them, but also block them with a shield.” Thus, the shield became a core defensive mechanic of the game. Rooke continues, “It seemed like a natural thing to try. The loop of dodging, ducking, blocking, and shooting was simple and fun.” Rooke expands on how the gunplay and weaponry evolved from there: “As we polished the mechanic, it became more and more fun. We decided to give the player a few weapon pickups as temporary power-ups. Players could grab weapon cores out of the air, similar to our [current] power cores and health cores, each one providing a more powerful weapon with limited ammo. Once the ammo is expended, your default weapon returns. The weapon power-ups included a spread shot, burst fire, auto fire, laser, chain lightning orbs, and a slow-moving nuke. This was our demo — one class with multiple weapon power-ups.”
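
That loop is simple enough to sketch in UE4-style C++. The snippet below is purely illustrative (all names are hypothetical, not Archiact’s code): grabbing a weapon core swaps in a limited-ammo weapon, and expending the ammo restores the default blaster.

// Hypothetical sketch of the demo's power-up loop: a weapon core grants a
// limited-ammo weapon; once its ammo is expended, the default weapon returns.
struct FWeaponSpec
{
    FString Name;
    int32   Ammo; // -1 means infinite (the default blaster)
};

class FVanguardLoadout
{
public:
    FVanguardLoadout()
        : DefaultWeapon{ TEXT("Blaster"), -1 }
        , Current(DefaultWeapon)
    {}

    // Called when the player grabs a weapon core out of the air.
    void PickUpWeaponCore(const FWeaponSpec& PowerUp) { Current = PowerUp; }

    void Fire()
    {
        // The actual projectile spawn would live here in a real implementation.
        if (Current.Ammo > 0 && --Current.Ammo == 0)
        {
            Current = DefaultWeapon; // ammo expended: default weapon returns
        }
    }

private:
    FWeaponSpec DefaultWeapon;
    FWeaponSpec Current;
};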

Stay Classy

While this prototype started with a single character, after demoing an early build, Archiact found that testers wanted different classes that would fit varying playstyles and archetypes. Rooke explains, “Some people said they wanted to be more of a support or healer class, while others still wanted to destroy everything in front of them. So we took what we liked best about the various weapons and used them as a starting point for the four classes. The spread shot turned into the Warden’s primary fire, while the nuke was nerfed down and used as his grenade launcher. The laser and default blaster inspired the Striker, while the burst fire inspired the Surgeon. And, of course, the chain lightning orbs gave birth to the Engineer class. Each class has a unique way to finish off the enemy with a Tether Lash mechanic, and each also has a unique support buff that’s applied while they’re healing their teammate [online].”

With four distinct character classes to choose from, Archiact had to ensure each of the Vanguard classes was fun and balanced. Rooke notes, “There’s the DPS (damage-per-second) output of each class to watch, while giving various shield sizes and health values to each class. The Warden has the most health and largest shield, and deals a ton of damage up close, but is less effective at long range. The Striker has fast and precise shots, and can strafe faster than the other classes, but her shield is the smallest, and she has the smallest health pool.”
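
Those balance knobs map naturally onto a small per-class tuning table. A hypothetical sketch (the numbers are invented; only the relative relationships Rooke describes are meaningful):

// Invented per-class tuning data reflecting the quoted relationships:
// the Warden gets the most health and the largest shield, the Striker the
// fastest strafe but the smallest shield and health pool.
struct FClassTuning
{
    float MaxHealth;
    float ShieldRadius; // centimeters, UE4's native unit
    float StrafeSpeed;  // cm/s
    float BaseDPS;      // damage per second at optimal range
};

static const FClassTuning WardenTuning  = { 200.f, 120.f, 300.f, 90.f };
static const FClassTuning StrikerTuning = { 120.f,  70.f, 450.f, 70.f };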

Regardless of which class players pick, they’ll be confronted with several campaign missions chock-full of enemies to overcome. Developer Archiact homed in on VR’s ability to provide player movement agency as a focal point for gameplay and challenge. “The way to succeed is to fight really hard like you would in a game of paintball. Once you get used to moving and dodging and being mindful of every projectile flying your way, it will click,” Rooke stated, adding “We made mission one exciting, but not overly challenging. Players can take their time to get used to their weapons and become accustomed to taking advantage of their charge shots and tether-lash mechanics to finish enemies off. Mastering the loop of destroying enemies and pulling in power cores to level up your weapon is key. By mission two, the action starts to get more intense. This mission is like graduation from training. If you can survive this mission, you should be ready for the rest of the campaign.” Rooke adds, “The enemies get progressively harder as the ‘elites’ are introduced in the later missions, and some boss battles add some tough spikes. With only one difficulty mode (at launch) the key is to get good at the game in the first couple of missions. Retrying them a few times is acceptable and expected until you get the hang of it.”

Adding to the immersion of the missions are the game’s destructible environments, built on UE4’s integration of the APEX destruction system. Archiact Software Engineer Thomas Edmunds noted the benefits of this approach, “[It] not only allowed us to heavily customize how destructibles look, but also to optimize them for different platforms and LODs (levels of detail).” Edmunds added, “This was important because destructibles can be very expensive and we did not want to sacrifice the ‘cool factor’ for performance.”

Prime Performance

While Evasion features high production values with great animations and detailed backgrounds, the road getting there wasn’t easy considering the indie developer had only five artists. The challenge was compounded by the fact that the studio needed to optimize the game to meet VR’s steep performance requirements. Not only do VR games need to be rendered at a high resolution, but they need to run silky smooth, or judder can occur, which can cause motion sickness for some players. Archiact Senior Modeler Austin Huntley elaborates, “We had to be very diligent about staying on [performance] budget. Running on the PS4 in VR at 60 FPS constant requires you to look closely at every aspect of your game in detail to cut down and minimize performance costs. You have to make trade-offs and find a lot of creative solutions to problems. Transparency is a good example. We created shields with thin faded grids to give the illusion of a transparent energy shield instead of a large plane.”

To meet VR’s steep performance demands, Archiact had to think outside the box. For instance, Evasion features a level with an open outdoor environment filled with bullets and enemies, which can create a draw-call nightmare. To overcome this, Huntley explains, “We used a lot of mesh instancing as well as shared atlas materials to reduce the number of both material and mesh draw calls.”
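
In UE4, that kind of batching typically goes through UInstancedStaticMeshComponent, which draws many copies of one mesh in a single draw call per material. A minimal sketch of the general technique, not Archiact’s actual code (the actor class and counts are invented):

// Sketch: render many copies of one mesh without paying one draw call each.
#include "Components/InstancedStaticMeshComponent.h"

void ADebrisField::BuildField(UStaticMesh* RockMesh) // ADebrisField is hypothetical
{
    UInstancedStaticMeshComponent* Instances =
        NewObject<UInstancedStaticMeshComponent>(this);
    Instances->SetStaticMesh(RockMesh); // one mesh + one material = one draw call
    Instances->RegisterComponent();

    for (int32 Index = 0; Index < 500; ++Index)
    {
        const FVector Location(FMath::FRandRange(-5000.f, 5000.f),
                               FMath::FRandRange(-5000.f, 5000.f), 0.f);
        Instances->AddInstance(FTransform(Location)); // no additional draw call
    }
}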

Archiact also wove optimization directly into its game design. Huntley elaborates, “Early on, we made targets for enemy performance and the cost of any combination of enemies on screen.” By thinking ahead in this regard, the senior modeler remarks, “This helped our enemy performance stay consistent and more predictable in any combat situation by limiting how many could be spawned based on this budget.”
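
That approach generalizes to a small spawn-budget guard: every enemy type is assigned a performance cost, and the spawner refuses to exceed a fixed on-screen budget. A hedged sketch with invented names, not Archiact’s implementation:

// Hypothetical per-enemy performance budget: each enemy type carries a cost,
// and spawning is refused once the combined on-screen cost hits the cap.
struct FEnemyCost { float Cost; };

class FEnemySpawnBudget
{
public:
    explicit FEnemySpawnBudget(float InMaxBudget) : MaxBudget(InMaxBudget) {}

    bool TrySpawn(const FEnemyCost& Enemy)
    {
        if (SpentBudget + Enemy.Cost > MaxBudget)
        {
            return false; // over budget: defer this spawn until something dies
        }
        SpentBudget += Enemy.Cost;
        return true;
    }

    void OnEnemyKilled(const FEnemyCost& Enemy) { SpentBudget -= Enemy.Cost; }

private:
    float MaxBudget;
    float SpentBudget = 0.f;
};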

The game’s visuals and immersion are enhanced by Evasion’s use of full-body avatars. This is noteworthy considering that, with only three tracked points (head and hands), many other VR games simply opt to render a virtual head and floating hands. To achieve a believable full body, Archiact leaned on inverse kinematics (IK) by IKINEMA, but Edmunds added that “UE4’s versatile animation Blueprints allowed us to layer and blend locomotion and detail animation, such as trigger pulls with the IK model.” Considering Evasion supports traditional VR motion controllers as well as dedicated peripherals like PlayStation VR’s Aim Controller, this implementation was particularly helpful, with Edmunds adding, “It also allowed us to support one-handed and two-handed animation sets for our different platforms.”

While maintaining a high, consistent framerate is paramount to mitigating simulation sickness, some players may still be nauseated by free movement, an undesirable effect of joystick locomotion that puts the eyes out of sync with the inner ear. Thankfully, Evasion offers numerous movement methods, both for those who want standard run-and-gun action and for those who have yet to get their “VR legs.” As Rooke notes, “Everybody is different and there’s no getting around that when it comes to VR. Some people have iron stomachs and some don’t. Instead of declaring that we’re catering to a specific crowd, we thought it would be best to provide robust accessibility options so everyone can feel comfortable and ‘at home’ in our game. More and more people want the authentic experience of running around in VR like they would in a traditional game, so of course we delivered a free movement option.” To make this method as stomach-friendly as possible, Archiact employed a few tricks: “The key to making this option comfortable is to keep the camera motion constant and smooth. Strafing and reversing is slower, which is what your brain naturally expects. Most important, this helps prevent nausea,” Rooke stated.

For those who can’t handle free motion at all, Archiact implemented an innovative dash-step option. “It works really well as an alternative,” Rooke says, adding, “It’s like little mini jumps forward instead of a gliding camera motion. Between these two options, most people can play the game comfortably.” As a more inventive, immersive option, the developer also incorporated a mode that allows players to jog in place. “It’s similar to free move, but requires an up and down motion from the player’s head as if they’re jogging on the spot.” This mechanic lets the inner ear more closely align with what the eyes see, and Rooke asserts, “This makes it feel like you’re actually running around in the world and further helps to reduce discomfort.” Rooke adds, “It’s also a fun way to get exercise.”
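
One plausible way to implement such a jog-in-place mode (not necessarily Archiact’s) is to accumulate the HMD’s vertical oscillation and convert sustained head bob into forward movement along the view yaw. A rough sketch; UJogLocomotion and its tuning members (DecayRate, JogThreshold) are invented:

// Hypothetical jog-in-place detector: sustained vertical head bob above a
// threshold drives forward locomotion along the horizontal view direction.
void UJogLocomotion::Tick(float DeltaSeconds, const FVector& HmdLocation,
                          const FRotator& HmdRotation, APawn& Pawn)
{
    const float VerticalDelta = HmdLocation.Z - LastHeadHeight;
    LastHeadHeight = HmdLocation.Z;

    // Decaying accumulator of recent up/down head motion.
    BobEnergy = BobEnergy * FMath::Exp(-DecayRate * DeltaSeconds)
              + FMath::Abs(VerticalDelta);

    if (BobEnergy > JogThreshold)
    {
        // Move along the yaw-only view direction so looking down doesn't
        // steer the player into the floor.
        const FVector Forward = FRotator(0.f, HmdRotation.Yaw, 0.f).Vector();
        Pawn.AddMovementInput(Forward, 1.0f);
    }
}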

Making It Unreal

Edmunds praised UE4 as an engine for virtual-reality production: “Unreal Engine 4 is a great choice for VR development, since it provides you with a complete VR framework to work within, while allowing you the freedom to change things to suit your project’s needs.” The software engineer continues, “Each VR platform’s subsystem is nicely contained, and totally open for changes once you hit the inevitable weird ‘edge case’ as your project progresses.”

Edmunds also highlighted how Blueprints, along with the consistency and extensibility of the engine’s integrated tools, eased development: “Having all sorts of tools integrated right in the engine makes workflows so much faster. Even the destruction assets and cloth assets have tools in the editor, which was incredibly helpful.”

The studio used Blueprints “extensively,” said Software Engineer Jake Moffatt: “Many of our systems are highly customizable within a Blueprint’s default values, using UPROPERTIES to surface complex data structures that are easy for designers to use.” The software engineer added, “We also made great use of Blueprints for scripting our missions. We have many custom nodes for stringing together mission-specific events, including many that use the Blueprint Async Action pattern, which we found kept our mission scripts very intuitive to read.”
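
The Blueprint Async Action pattern Moffatt mentions is typically implemented by subclassing the engine’s UBlueprintAsyncActionBase, which surfaces a latent Blueprint node whose delegate properties become output exec pins. A minimal sketch; the mission-specific names are invented for illustration:

// Minimal Blueprint Async Action: "Wait For Objective" appears in mission
// graphs as a latent node whose Completed pin fires when the event arrives.
#include "Kismet/BlueprintAsyncActionBase.h"
#include "WaitForObjective.generated.h"

DECLARE_DYNAMIC_MULTICAST_DELEGATE(FObjectiveCompleted);

UCLASS()
class UWaitForObjective : public UBlueprintAsyncActionBase
{
    GENERATED_BODY()

public:
    // Becomes an output exec pin on the Blueprint node.
    UPROPERTY(BlueprintAssignable)
    FObjectiveCompleted Completed;

    // BlueprintInternalUseOnly hides the raw factory and exposes the async node.
    UFUNCTION(BlueprintCallable, meta = (BlueprintInternalUseOnly = "true"),
              Category = "Missions")
    static UWaitForObjective* WaitForObjective()
    {
        return NewObject<UWaitForObjective>();
    }

    // Called by mission code (hypothetical) when the objective is finished.
    void NotifyObjectiveDone() { Completed.Broadcast(); }
};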

With online co-op being a major feature of the game, Archiact leaned heavily on Unreal Engine 4’s networking features: “Our team made great use of the UE4 Network Profiler tool during development to ensure that we weren’t using excessive amounts of bandwidth,” Moffatt stated.

Considering Evasion is available across PlayStation VR, Oculus, and Steam, Edmunds noted how UE4 made the game easier to port, “Unreal Engine 4 nicely abstracts away many of the platform differences. In VR development, however, some of these differences require different gameplay systems that translate to a need for ‘un-abstracting’ certain things. Handling all the different input systems, and each platform’s own requirements for VR, was a significant challenge that was made manageable by Unreal’s subsystem framework.” 

Interested in experiencing Evasion for yourself? The game is currently on sale in celebration of this week’s Steam sale event. It’s also available on the Oculus and PlayStation stores. For more information on the game, check out www.evasionvrgame.com and follow the title on Twitter and Facebook @evasionVR.

If you would like to experiment building your own VR game, download Unreal Engine for free today.

Unreal Engine 4.21 Released

What’s New

Unreal Engine 4.21 continues our relentless pursuit of greater efficiency, performance, and stability for every project on any platform. We made it easier to work smarter and create faster because we want your imagination to be the only limit when using our tools. And we battle-tested the engine on every platform until it met our developers’ high standards so your project will shine once it is ready for the masses.

We are always looking for ways to streamline everyday tasks so developers can focus on creating meaningful, exciting, and engaging experiences. Our industry-leading Niagara effects toolset is now even more powerful and easier to use, enabling you to dream up the next generation of real-time visual effects. You can build multiplayer experiences on a scale not previously possible using the now production-ready Replication Graph functionality. Iterate faster thanks to optimizations with up to a 60% speed increase when cooking content, run automated tests to find issues using the new Gauntlet automation framework, and speed up your day-to-day workflows with usability improvements to the Animation system, Blueprint Visual Scripting, Sequencer, and more.

We strive to make it possible for your creations to be enjoyed as you intended by everyone, everywhere regardless of the form factor they choose. Building on the previous release, we have added even more optimizations developed for Fortnite on Android and iOS to further improve the process for developing for mobile devices. Available in Early Access, Pixel Streaming opens a whole new avenue to deploy apps in a web browser with no barrier to entry and no compromise on rendering quality. We have also improved support for Linux as well as augmented, virtual, and mixed reality devices.

In addition to all of the updates from Epic, this release includes 121 improvements submitted by the incredible community of Unreal Engine developers on GitHub! Thanks to each of these contributors to Unreal Engine 4.21:

Aaron Franke (aaronfranke), Adam Rehn (adamrehn), Adrian Courrèges (acourreges), aladenberger, Alan Liu (PicaroonX), Cengiz Terzibas (yaakuro), Cerwi, Chris Conway (Koderz), cmp-, Damianno19, Deep Silver Dambuster Studios (DSDambuster), Dorgon Chang (dorgonman), DSCriErr, Dzuelu, Eliot (BillEliot), Erik Dubbelboer (erikdubbelboer), fieldsJacksonG, Franco Pulido (francoap), Frank Stähr (phisigma), George Erfesoglou (nonlin), Hao Wang (haowang1013), Henri Hyyryläinen (hhyyrylainen), Homer D Xing (homerhsing), IlinAleksey, Jacob Nelson (JacobNelsonGames), Jerry Richards (r2d2Proton), Jesse Fish (erebuswolf), Josef Gluyas (Josef-CL), Joshua Harrington (thejhnz), Kalle Hämäläinen (kallehamalainen), KelbyG, Layla (aylaylay), LizardThief, Lucas Wall (lucaswall), Mai Lavelle (maiself), malavon, Marat Radchenko (slonopotamus), Marat Yakupov (moadib), Marco Antonio Alvarez (surakin), Markus Breyer (pluranium), marshal-it, Martin Gerhardy (mgerhardy), Mathias Hübscher (user37337), Michael Kösel (TheCodez), Morva Kristóf (KristofMorva), Muhammad A. Moniem (mamoniem), Nick Edwards (nedwardsnae), nrd2001, Oliver (oliverzx), phoenxin, projectgheist, Rene Rivera (grafikrobot), Rick Yorgason (Skrapion), Riley Labrecque (rlabrecque), Sahil Dhanju (Vatyx), Sam Bonifacio (Acren), scahp, Sébastien Rombauts (SRombauts), Tom Kneiphof (tomix1024), Troy Tavis (ttavis), Truthkey, UristMcRainmaker, Wiktor Lawski (wlawski), yhase7, Zeno Ahn (zenoengine)

Major Features

Niagara Platform Support and Usability Improvements

In our continuing effort to provide industry-leading effects tools, Niagara has received an expanded feature set and substantial quality-of-life improvements, and Niagara effects are now supported on Nintendo Switch!

GPU-Only Texture Sampling in Niagara

You can now sample a 2D texture or a pseudo-volume 2D texture in your particle scripts! For example, you can capture the scene’s depth, color, and normal information using a Scene Capture Actor, then use that data to reconstruct the environment within a Niagara particle system, with the particles’ potential and kinetic energy visualized as emissive light.

Check out the Niagara level in the Content Examples project to see how this feature works!

Niagara Skeletal Mesh Data Interface Improvements

New functions in the Skeletal Mesh Data Interface enable direct sampling of a Skeletal Mesh’s vertex data, as well as access to specific Bones or Sockets on the Skeletal Mesh.

Ribbon Particle Performance Improvements

Ribbons now generate the ribbon geometry on the GPU instead of the CPU, improving overall performance.

GPU Simulation Support in Niagara

GPU simulation of Niagara effects is now supported on all non-mobile platforms.

Simplified System and Emitter Creation

Niagara now includes friendly dialogs that make creating systems and emitters easier than ever! You can create new emitters and systems from a curated set of templates to speed up development and ensure best practices.
Pendulum Constraint

This constraint is solved with physics forces, supports optional spring drivers, and includes a potential energy calculation. You can now create exciting, dynamic effects, such as spawning particles with inherited velocity when the energy exceeds a specified threshold.
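
The threshold test itself is ordinary mechanics: total energy is kinetic plus potential, E = ½mv² + mgh. A small illustrative helper (names invented; in practice the calculation happens inside the Niagara module):

// Spawn-on-energy sketch: emit a secondary burst when a pendulum particle's
// total mechanical energy (kinetic + potential) exceeds a threshold.
bool ShouldSpawnBurst(float Mass, const FVector& Velocity, float Height,
                      float EnergyThreshold)
{
    const float Gravity   = 980.f;                                 // cm/s^2
    const float Kinetic   = 0.5f * Mass * Velocity.SizeSquared();  // 1/2 m v^2
    const float Potential = Mass * Gravity * Height;               // m g h
    return (Kinetic + Potential) > EnergyThreshold;
}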

Module Additions and Improvements

  • Generate and receive death events
  • Now factoring mass into multiple modules
  • New SampleSkeletalMeshSkeleton, SampleSkeletalMeshSurface, SkeletalMeshSkeletonLocation and SkeletalMeshSurfaceLocation modules to complement enhancements to the Skeletal Mesh Data Interface
  • New AddVelocityInCone module
  • New Force modules: FindKineticAndPotentialEnergy, GravityForce, SpringForce and multiple usability tweaks to other forces
  • New KillParticlesInVolume module
  • New SpriteRotationRate module
  • New RecreateCameraProjection module for using render targets and camera transforms to turn scene captures into deformable particle systems
  • New modules for sampling textures: SamplePseudoVolumeTexture, SampleTexture, SubUV_TextureSample, and WorldAlignedTextureSample
  • New utility modules for temporal interpolation and frame counters
  • Many new dynamic inputs and functions

New: Replication Graph

The Replication Graph Plugin makes it possible to customize network replication in order to build large-scale multiplayer games that would not be viable with traditional replication strategies. For example, Epic’s own Fortnite Battle Royale starts each match with 100 players and roughly 50,000 replicated actors. If each replicated actor were to determine whether or not it should update across each client connection, the impact on the server’s CPU performance would be prohibitive.

The Replication Graph Plugin solves this problem by offering an alternate strategy geared specifically for high-volume multiplayer games. This works by assigning actors to Replication Nodes, which store precalculated information that clients can use to retrieve lists of actors that need to be updated, saving the CPU of recalculating the same data for many clients on every frame. In addition to the standard nodes that ship with the Engine, developers can write their own nodes to fit the specific needs of actors within their games.
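
As a rough sketch of what a custom graph can look like, the snippet below subclasses UReplicationGraph and registers a single grid-spatialization node, so per-connection gathering becomes a grid lookup instead of a per-actor relevancy test. This is illustrative only; a complete setup also routes each actor class to appropriate nodes, which is omitted here:

// Minimal custom Replication Graph: one global spatialization node that
// buckets actors into grid cells for cheap per-connection gathering.
#include "ReplicationGraph.h"
#include "MyReplicationGraph.generated.h"

UCLASS()
class UMyReplicationGraph : public UReplicationGraph
{
    GENERATED_BODY()

public:
    virtual void InitGlobalGraphNodes() override
    {
        UReplicationGraphNode_GridSpatialization2D* Grid =
            CreateNewNode<UReplicationGraphNode_GridSpatialization2D>();
        Grid->CellSize = 10000.f; // illustrative cell size, in Unreal units
        AddGlobalGraphNode(Grid);
        GridNode = Grid;
    }

private:
    UPROPERTY()
    UReplicationGraphNode_GridSpatialization2D* GridNode = nullptr;
};

The custom class is then assigned to the net driver, typically by setting ReplicationDriverClassName under [/Script/OnlineSubsystemUtils.IpNetDriver] in DefaultEngine.ini.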

New: Optimizations for Shipping on Mobile Platforms

The mobile development process gets even better thanks to all of the mobile optimizations that were developed for Fortnite’s initial release on Android, in addition to all of the iOS improvements from our ongoing updates!

Improved Vulkan Support on Android

With the help of Samsung, Unreal Engine 4.21 includes all of the Vulkan engineering and optimization work that was done to help ship Fortnite on the Samsung Galaxy Note 9, and the Vulkan renderer is 100% feature compatible with OpenGL ES 3.1. Projects that utilize Vulkan can run up to 20% faster than the same project using OpenGL ES.

Config Rules System for Android

The Android Config Rules system can now be used to help catch issues very early in a project’s startup process. This tool allows you to quickly check for device support and present either a warning or an error dialog to the user if issues are discovered, such as an out-of-date driver or an unsupported GPU. Any variables set may be queried later in C++ with FAndroidMisc::GetConfigRulesVariable(TEXT("variablename")).

To use this system, optionally place a configrules.txt rules file in your project’s Build/Android directory; UPL is then used to add a Gradle task that runs the ConfigRulesTool to compress (and optionally encrypt) the file while packaging the APK. More details can be found in the engine documentation.
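
Reading one of those variables back at runtime uses the call quoted above. A short sketch; the variable name is invented, and we assume the function returns a null pointer when the variable was never set:

// Sketch: query a configrules.txt variable from C++ on Android.
#if PLATFORM_ANDROID
#include "Android/AndroidPlatformMisc.h"

bool IsDeviceGpuUnsupported()
{
    // Assumed to return null when the rules file never set the variable.
    if (const FString* Value =
            FAndroidMisc::GetConfigRulesVariable(TEXT("gpuUnsupported")))
    {
        return Value->Equals(TEXT("true"), ESearchCase::IgnoreCase);
    }
    return false;
}
#endif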

Program Binary Cache for Android

The Program Binary cache can be used to help improve Shader loading performance and reduce hitching due to Shader loading on Android devices. It works by generating optimized binary representations of Shader programs on the device, which are used when loading Shaders during subsequent runs. Loading Shader programs from optimized binaries can dramatically decrease Shader loading times. The Program Binary cache must be used in conjunction with the Shader Pipeline cache tool, as the latter populates the Program Binary cache during the initial run of your application. To enable the Program Binary cache in your project, add the following setting to your AndroidEngine.ini or Device Profile.

  • r.ProgramBinaryCache.Enable=1

Note: Some devices do not support the required program binary extension; such devices will fall back to the previous behavior.

Emulated Uniform Buffers on Android

You can now use Emulated Uniform Buffers for projects that target the OpenGL ES3.1 feature level, significantly reducing memory usage and improving rendering performance, depending on your application’s complexity. Emulated Uniform Buffers have also been optimized to reduce the size of the data that needs to be transferred to the GPU when your project is packaged. To enable Emulated Uniform Buffers when using the OpenGL ES3.1 feature level, add the following line to your project’s DefaultEngine.ini under [/Script/Engine.RendererSettings]:

[/Script/Engine.RendererSettings]
OpenGL.UseEmulatedUBs=1

CPU Thread Affinity Control on Android

The ConfigRules system can register whether or not to use a supplied little-core affinity mask. If enabled, the following threads will use little cores, which improves battery life and evens out performance, since the threads won’t switch between big and little cores (a possible source of hitches): render, pool, taskgraph, stats, taskgraph background, async loading. For details on how to set this up, see the Config Rules documentation.

Improved GPU Particle Simulation Performance on Mobile

Mobile particle effects that utilize the GPU for particle simulation have been significantly improved. You now have the option of reducing the memory used for GPU particle simulation by limiting the maximum number of simulated particles. By default, the maximum number of GPU particles that can be simultaneously simulated is around one million, which uses around 32 MB of memory. You can adjust the maximum number of particles by adding the following to your project’s DefaultEngine.ini under [/Script/Engine.RendererSettings]:

[/Script/Engine.RendererSettings]
fx.GPUSimulationTextureSizeX=512
fx.GPUSimulationTextureSizeY=512
  • Setting the value from the default of 1024 to 512 reduces the memory footprint from around 32 MB to around 8 MB.
  • The simulation texture size must be a power of two.
  • These improvements are especially apparent on devices that use the ARM Mali GPU.

Dithered LOD Transitions

Dithered LOD transitions are now supported on mobile platforms. When enabled, objects with Materials that have the Dithered LOD Transitions option enabled will now fade from one Level Of Detail (LOD) to another almost seamlessly. By default, support for Dithered LOD transitions is disabled on mobile platforms. To enable it, go to Project Settings > Rendering > Mobile and then check the Allow Dithered LOD Transitions option.

Note: Materials that have Dithered LOD transitions enabled will be rendered as Masked Materials. This could have a negative performance impact on mobile platforms. We recommend enabling this effect only on Masked Materials.

New: Cooker Performance

The cooking process has been optimized resulting in up to 60% reductions in cook times! Low-level code now avoids performing unnecessary file system operations, and cooker timers have been streamlined. Handling of unsolicited Assets (with regard to Asset dependencies) has also been refactored to scale better. These changes are most pronounced on larger projects (projects with more than 100,000 Assets).

New: Pixel Streaming (Early Access)

Run a packaged Unreal Engine application on a desktop PC in the cloud, and stream the viewport directly to any modern web browser on any platform! Get the highest-quality rendering in lightweight apps, even on mobile devices, with zero download, zero install.

A viewport rendered by Unreal Engine, embedded within a web UI. Images and models courtesy of McLaren.

You can broadcast a single game session to multiple viewers by simply sharing a link, or send each connecting user to their own separate game session.

The web page that hosts the Engine viewport automatically sends keyboard, mouse and touch events back to the Engine. You can also customize the page with your own HTML5 UIs, using custom JavaScript events to trigger and respond to gameplay events over the wire.

For details, see Pixel Streaming.

New: Animation System Optimizations and Improvements

The Animation System continues to build on its best-in-class features thanks to new workflow improvements, better surfacing of information, new tools, and more!

Animation Compression Updates

Animation compression times are significantly reduced by using a whitelist of optimal codecs, which avoids trying permutations that are unlikely to be selected and greatly reduces the number of codecs we attempt to compress with. On multicore systems, most of the codecs now evaluate in parallel during automatic compression, further reducing the time it takes to compress an animation sequence.

The following updates were made to the Animation Compression Stat Dialog window:

  • Fixed bugs that caused the dialog to show incorrect results
  • Added a compression time stat
  • Added the number of compressed animations
  • Added tracking for the animation with the largest average error
  • Added tracking of the worst 10 items instead of just the worst one
  • Better labeling on the dialog
  • Pass the FBoneData array through more often instead of recalculating it

Please see Compression for more information.

Animation Notify Improvements

New Animation Notifies have been added that enable you to manage the state of dynamics and cloth simulations. We have also updated Notify add/replace menus to use class pickers for better searching of BP and native notifies. To add a Notify, right-click on a Notifies track, then under Add Notify, select the type of Notify you wish to add.

Please see Animation Notifications (Notifies) for more information.

Maintain Original Scale of Root Motion

Added a Use Normalized Root Motion Scale option to maintain the original scale of Root Motion. This option is on by default, which matches the behavior that existed prior to this Engine release. Disabling the option will use the scale of the final blended animation instead.

Please see Enabling Root Motion for more information.

Added Caching and Autocomplete for “Sync Marker” Names

When creating Sync Markers, you can now access any Sync Markers assigned to the Skeleton from the Existing Sync Markers menu option. Entering text into the search box will also filter out Sync Markers based on your text entry.

Animation Sequence Framerate

The framerate of Animation Sequences is now displayed in the Animation Tools viewport and Content Browser tooltip.

Enable Auto Blend Out on Anim Montages

Anim Montages now have the option to enable or disable Auto Blend Out. This option is enabled by default; disabling it prevents the Montage from automatically blending out, keeping the last pose instead.

Please see Montage Properties for more information.

CCDIK Skeletal Control Node

Use the new CCDIK (Cyclic Coordinate Descent Inverse Kinematics) Skeletal Control Node for a lightweight IK algorithm suited to real-time calculation of relatively short IK chains, such as shoulder to fingertip.

Please see CCDIK Skeletal Control Node for more information.

Set Master Pose Component Force Update

The Set Master Pose Component function has a second input pin, Force Update, which can be used either to skip updating all runtime info when that info is the same as the Master Component’s, or to force the runtime info to update. This only applies to the registration process, as that can be serialized, at which point all runtime data will need to be refreshed.

Please see Master Pose Component for more information.

Miscellaneous Improvements and Updates

  • Live Animation Blueprint Recompilation is now non-experimental
  • Local Space is now the default Coordinate Space for Animation Editors
  • A notification is now displayed in the Animation Tools viewport when a min LOD is being applied

New: Gauntlet Automation Framework (Early Access)

The new early access Gauntlet automation framework enables you to automate the process of deploying builds to devices, running one or more clients and/or servers, and processing the results.

You can create Gauntlet scripts that automatically profile points of interest, validate gameplay logic, check return values from backend APIs, and more! Gauntlet has been battle tested for months in the process of optimizing Fortnite and is a key part of ensuring it runs smoothly on all platforms.

Gauntlet provides new profiling helpers that can record critical performance values between two points in time in order to track missed-Vsyncs, hitches, CPU Time, GPU Time, RHI Time, Draw Calls and more. Gauntlet also provides helper functions to gather these from logs so you can generate warnings, store them in databases, or create trend lines. All of the info captured during the test is available to be output into reports any way you want.

Each Gauntlet test is a C# script that expresses a simple configuration for your test – how many clients, how many servers, and what parameters to pass. Gauntlet takes care of allocating machines from a pool, deploying and running builds, checking for common errors such as crashes, asserts, or timeouts, and collecting log files and other artifacts.

New: Submix Envelope Follower

Users of the new Unreal Audio Engine can now set an Envelope Follower Delegate on their Submixes, allowing amplitude analysis of individual channels for that Submix. This will help users power visualizations and Blueprint Events based on the amplitude characteristics of their submixed audio.

New: Filter Sound Submix Effect

Users of the new Unreal Audio Engine now have the option of adding a multimode filter to their Submixes, allowing dynamic filter effects on a single Submix.

New: Sound Submix Effect Reverb Dry Level

The Submix Effect Reverb in the new Unreal Audio Engine now supports parallel Wet and Dry Levels, allowing users to dial in specific Wet/Dry ratios and making the Effect usable as an Insert-Style Effect as well as a Send-Style Effect.

New: Optimizations to Source Effect API

The Source Effect API in the new Unreal Audio Engine has been optimized to process a full buffer of audio rather than frame-by-frame. This will allow Source Effects to process more efficiently than before.

New: Linux Defaults to Vulkan Renderer

Linux now uses Vulkan as the default renderer when available. In the event the API cannot be initialized, the Engine will fall back to OpenGL without notification.

From the Project Settings, you can use the Target RHIs list to add or disable a particular RHI, or use the command-line switches -vulkan and -opengl4 to force a specific renderer and bypass the fallback.

New: Linux Media Player

You can now use the bundled WebMMedia plugin to play back .webm VP8/VP9 videos on Linux platforms.

New: Linux Crash Report Client GUI

We’ve added support for the Crash Reporter GUI on Linux so you can help us continue to improve support for Linux platforms. Please submit reports when they occur, even repeated ones! It helps our engineers assess the frequency and learn what circumstances cause the crash to happen.

New: Professional Video I/O Improvements (Early Access)

We continue to make it easier to get video feeds into and out of the Unreal Editor over professional quality SDI video cards. You can now work with the same Unreal Engine Project across multiple computers with different hardware setups, without changing any configuration settings in the Project.

Create a MediaProfile on each machine, and set it up to handle the video card and formats that you need to use on that computer. You can also override the Project’s timecode and genlock sources from the same panel:

When you combine the Media Profile with the new Proxy Media Source and Proxy Media Output Asset types, you can automatically redirect input and output channels between the Project’s media content and the settings in your Media Profile. When you switch to a different Media Profile — for example, on a different computer with a different media card or different wiring setup — the input and output channels from that machine’s hardware are automatically routed through the proxies so that you don’t have to change any content in your Project.

For details, see Using Media Profiles and Proxies.

In addition, this release adds:

  • A dockable Timecode Provider panel (Window > Developer Tools > Timecode Provider) that shows the Unreal Engine’s current timecode and the source that timecode is coming from.
  • Support for 10-bit input, audio I/O, and interlaced/PsF inputs.
  • A new Blackmagic Media Player Plugin that supports SDI cards from Blackmagic Design. See the Blackmagic Video I/O Quick Start.

Note: The AJA Media Player and Blackmagic Media Player Plugins are now available through the Marketplace tab in the Epic Games Launcher, instead of being installed automatically with the Unreal Engine. Their source is freely available on GitHub, to give other developers a model of how to develop video I/O plugins on top of the Engine’s Media Core APIs.

New: Geographically Accurate Sun Positioning (Early Access)

In the real world, the sun’s position in the sky depends on the latitude and longitude of the viewer, the date, and the time of day. You can now use the same mathematical equations to govern the sun’s position in your Unreal Engine Level.

This is particularly effective any time you need to simulate the real-world lighting conditions for a specific place on the Earth, such as for a major architectural or construction project. However, this can also be useful for any Level that you want to exhibit realistic sun movements and positioning based on global position and time of day.

For details, see Geographically Accurate Sun Positioning.

New: Static Mesh Processing

We have added several new Static Mesh processing options inside the Unreal Editor. You can now save memory by removing unnecessary UV mappings from your Static Meshes.

In addition, further Static Mesh processing operations can be automated with Python and Blueprint scripts that you run inside the Unreal Editor.

New: Blueprint Usability Improvements

The Blueprint Graph editor now features “Quick Jump” navigation, enhancing the existing bookmark feature. Users can save their current location and zoom level in the Blueprint Editor with CTRL + [0-9], then quickly return to that graph at that location and zoom level by pressing SHIFT + [0-9] whenever the Blueprint editor window is open, even when working in a different Blueprint Asset. “Quick Jump” bookmarks persist across Editor sessions and are local to the user/machine.

Users now have the ability to insert pins before or after a target pin for Sequence nodes via the context menu, rather than only being able to add them onto the end.

Monolithic engine header file exclusion from nativized Blueprint class C++ code is now available as a Project Setting. This can help reduce the overall code size of the monolithic game EXE file if code size is an issue. The option can be found under Project Settings > Packaging, in the “Advanced” section under the “Blueprint Nativization Method” option. It is disabled by default to maintain compatibility with existing projects.

New: Improvements to HTML5 Templates

HTML5 projects now use separate HTML, JavaScript, and CSS templates, replacing the previous monolithic template file! Custom template files are also supported on a per-project basis:

Copy:

.../Engine/Build/HTML5/GameX.*.template

To:

[project]/Build/HTML5/.

The build process will automatically pick the project’s path, or otherwise fall back to the Engine’s version.

This is based on GitHub PR#4780.

New: HTML5 README Files Updated

The HTML5 README file has been split up into multiple README files based on category:

  • Building UE4 HTML5
    • Get Source Files
    • Compiling Support Programs
    • Compiling UE4 Editor
    • Run UE4 Editor
    • Package The Project For HTML5
    • Test The HTML5 Packaged Project
  • Debugging UE4 HTML5
    • How To Dump The Stack And Print From Cpp
    • BugHunting GLSL
  • Emscripten and UE4
    • EMSDK
    • Emscripten toolchain and Thirdparty libraries
    • UE4 C# scripts
    • Test Build, Checking In, and CIS

New: Improved IPv6 Support

Support for IPv4 and IPv6 has been merged into a single socket subsystem, where previously support for each protocol was isolated in its own subsystem. This allows platforms that used one of the BSD subsystems to support both IPv4 and IPv6 at the same time, and to do so transparently to the calling code.

New: DDoS Detection and Mitigation

DDoS (distributed denial of service) attacks typically hinder game servers by flooding them with so many packets that they are unable to process them all without locking up and/or drowning out other players’ packets, causing players to time out or suffer severe packet loss that hinders gameplay.

Typically these attacks use spoofed UDP packets, where the source IP is unverifiable. This optional DDoS detection focuses specifically on this situation, detecting/mitigating DDoS attacks based on configurable thresholds of spoofed UDP packets, which do not originate from an existing, known client connection. This is not a guarantee that servers will be safe from all attacks, since it’s still possible that a heavy attack can overwhelm the hardware or OS running the server.

New: Physics Interface Updates

The Physics Interface has been refactored to give higher-level systems increased ownership of physics objects. As a consequence of these changes, we have deprecated the Async Scene, which was only recommended for use with APEX Destruction. You can still achieve the same visual results using the Sync Scene.

As a result of these changes, much of the physics-related C++ API has changed. Functionally, the API is the same, and you should be able to use it much as you do today. We’ve made changes to the Physics Interface with the goal of a) reorganizing dependencies into one controlled place, and b) creating a common model for physics interactions when interacting with Unreal.

Please see 4.21 Physics Technical Notes for more information.

New: Pipeline State Object (PSO) Caching

We now support Pipeline State Object (PSO) Caching on Metal (iOS/Mac), DX12, and Vulkan platforms. PSO caching helps reduce hitches that can occur when a Material requires a new Shader to be compiled. It works by creating a list of all the Shaders required by your Materials, which is then used to speed up compilation when those Shaders are first encountered by your project. PSO Caching can be enabled in the Project Settings > Packaging section.

To find out more about how to set up and use PSO caching in your UE4 project, make sure to check out the PSO Caching documents.

New: Physical Lighting Units Updates

We have improved the workflow and usability for Physical Lighting Units based on feedback provided by the community. As part of these updates, the following changes have been made:

  • All light types now display their units type next to the Intensity value.
  • Directional Lights are now displayed in lux with an increased intensity range.
  • Sky Light intensity is now displayed in cd/m² with an increased intensity range.
  • Post Process Auto-Exposure settings can be expressed in EV100 for an extended range of scene luminance. This can be enabled via Project Settings.
  • The Pixel Inspector can now display pre-exposure for Scene Color. This can be enabled via Project Settings.
  • HDR (Eye Adaptation) Visualization has been refactored in the following ways:
    • HDR Analysis picture-in-picture display over the current scene view, allowing adjustments with instant feedback.
    • Visualization is now expressed in EV100.
    • Pixel Inspector-like feedback has been removed.

For additional information, see Physical Lighting Units.

New: Sequencer Event Track

The Sequencer Event Track has been completely refactored so that Events are now more tightly coupled to Blueprint graphs, making for a much more familiar and robust user experience. By utilizing Blueprints and Interfaces, the new implementation offers better control and stability than the previous one, which used struct payloads and anonymous named events.

Please see Event Track Overview and Calling Events through Sequencer for more information.

New: Geometry Cache Track (Experimental)

The new (and experimental) Geometry Cache Track allows you to scrub through a Geometry Cache and render it out with frame accuracy.

Please see Using the Geometry Cache Track for more information.

New: Sequencer Audio Bakedown (Early Access)

You can now bake down the audio into a Master Audio Submix from the Render Movie Settings window. The process of baking audio occurs in a separate render pass and exports the audio in the sequence to a single file when you render a movie.

Please see Render Movie Settings for more information.

New: Sequencer Guide Marks

You can now lay down vertical guide marks on the timeline to use for snapping or identifying key points in your timeline.

Please see Using Frame Markers in Sequencer for more information.

New: Windows Mixed Reality Support

Unreal Engine 4 now natively supports the Windows Mixed Reality (WMR) platform and headsets, such as the HP Mixed Reality headset and the Samsung HMD Odyssey headset. To use our native WMR support, you must be on the April 2018 Windows 10 update or later and have a supported headset. For more information on how to get up and running, see Windows Mixed Reality Development.

Image courtesy of HP

New: Magic Leap Qualified Developer Release Support

Unreal Engine 4 now supports all the features needed to develop complete applications on Magic Leap’s Lumin-based devices. We support rendering, controller support, gesture recognition, audio input/output, media, and more. For more information on how to become a developer, please check out https://www.magicleap.com/.

New: Oculus Avatars

The Oculus Avatar SDK includes an Unreal package to assist developers in implementing first-person hand presence for the Rift and Touch controllers. The package includes avatar hand and body assets that are viewable by other users in social applications. The first-person hand models and third-person hand and body models supported by the Avatar SDK automatically pull the avatar configuration choices the user has made in Oculus Home to provide a consistent sense of identity across applications. For more information, see the Avatar SDK Developer Guide .

New: Round Robin Occlusions

Unreal Engine 4 now supports Round Robin Occlusions. With the newly added vr.RoundRobinOcclusion flag enabled, stereoscopic frames will kick off occlusion queries for one eye per frame using an alternating scheme (i.e. odd frames only kick off queries for the left eye, and even frames only kick off queries for the right). This approach cuts the number of occlusion draw calls per frame by half. In some situations, this improves performance significantly.
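
Because the feature is exposed as a console variable, it can also be toggled from game code through the standard console-variable API. A minimal sketch:

// Enable round-robin occlusion queries at runtime via the console variable.
#include "HAL/IConsoleManager.h"

void EnableRoundRobinOcclusion()
{
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("vr.RoundRobinOcclusion")))
    {
        CVar->Set(1); // alternate per-eye occlusion queries on odd/even frames
    }
}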

New: Platform SDK Upgrades

In every release, we update the Engine to support the latest SDK releases from platform partners.

  • IDE Version the Build farm compiles against
    • Visual Studio: Visual Studio 2017 v15.6.3 toolchain (14.13.26128) and Windows 10 SDK (10.0.16299.0)
      • Minimum supported versions
        • Visual Studio 2017 v15.6
        • Visual Studio 2015 Update 3
    • Xcode: Xcode 9.4
  • Android:
    • Android NDK r14b (New CodeWorks for Android 1r7u1 installer will replace previous CodeWorks on Windows and Mac; Linux will use 1r6u1 plus modifications)
  • HTML5: Emscripten 1.37.19
  • Linux “SDK” (cross-toolchain):
    • v12_clang-6.0.1-centos7
  • Lumin: 0.16.0
  • Steam: 1.39
  • SteamVR: 1.39
  • Oculus Runtime: 1.28
  • Switch:
    • SDK 5.3.0 + optional NEX 4.4.2 (Firmware 5.0.0-4.0)
    • SDK 6.4.0 + optional NEX 4.6.2 (Firmware 6.0.0-5.0)
    • Supported IDE: Visual Studio 2017, Visual Studio 2015
  • PS4:
    • 6.008.001
    • Firmware Version 6.008.021
    • Supported IDE: Visual Studio 2017, Visual Studio 2015
  • Xbox One:
    • XDK: June 2018 QFE-4
    • Firmware Version: June 2018 (version 10.0.17134.4056)
    • Supported IDE: Visual Studio 2017
  • macOS: SDK 10.14
  • iOS: SDK 12
  • tvOS: SDK 12
To view the full list of release notes, visit our forum or documentation pages.
