HTC VIVE™, the leader in room-scale Virtual Reality (VR), today announced new hardware, software, and content offerings that redefine how VR is experienced. With VIVE Pro Eye, a new headset with built-in eye tracking, the high-end VR experience from VIVE Pro just got even better. In addition, VIVEPORT, HTC’s global app store for VR content, announced unlimited subscription access.
When it comes to using animations for marketing and brand engagement, many VR film projects currently on the market focus on providing an immersive one-off experience to captivate viewers in the moment. Rather than a mere afterthought, replayability is an essential ingredient for global VFX and animation studio Redrover, who is exploring fresh ways to engage viewers on a deeper level by combining story, gameplay, and greater interactivity.
Buddy VR – the team’s recent VR film spinoff of its Hollywood animated blockbuster, The Nut Job – recently took home the Best VR Experience Award at the Venice International Film Festival this fall. The project is part of Redrover’s vision to create a new genre of interactive animation, and what makes Buddy VR especially unique is the way it bridges the gap between animated short films and video game experiences.
A virtual interactive friendship
Starring “Buddy,” the loveable blue rat from The Nut Job, this vibrant interactive animation short has you meeting and befriending the little critter in a whimsical story that balances plot and gameplay elements. “We wanted to lead the story through intimacy between the player and character,” explains Chuck Chae, Director for Buddy VR.
Players get to know Buddy through a series of non-verbal interactions like exchanging names, petting, playing musical instruments, and more. It’s a humorous, heartwarming 16-minute interactive experience, and the response from those who have played it is overwhelmingly positive, he adds.
“Simply experiencing VR offers the player an extraordinary experience, and provides deep immersion while wearing VR equipment. However, many VR titles on the market are difficult to enjoy again once they have been played through the first time,” says Chae. “Our goal is to break away from this approach and produce titles that can maintain their replayability throughout lengthy and multiple playthroughs by combining Redrover’s IP and VR technology with interactive elements.”
Optimizing creative potential with Unreal Engine
For this project, overcoming the challenge of creating cohesive story interaction through speechless communication required that the team weave in extra layers of detail and nuance to Buddy’s facial expressions, physical actions, and eye movements. Using Unreal Engine gave the team the tools and additional programming flexibility to craft a level of real-time interactivity and realism that could foster a believable relationship-building experience between players and the furry protagonist, says Chae.
“High-quality graphics and animations are essential for creating speechless interaction, which is the highlight of our product. It was amazing enough that Unreal Engine easily fulfilled our graphical standards, but it also had unbelievable real-time functionalities, allowing us to apply desired animations, edit unnatural or incorrect aspects, and then reapply to view the results all in one sitting,” says Chae, adding that the team was able to minimize production time using real-time rendering.
Optimizing their production workflows using real-time rendering also helped free up more of the team’s time and energy for creativity. “The greatest strengths of Unreal Engine are the ability to quickly make a prototype using codeless Blueprints and the ability to create high-quality graphic productions through real-time rendering,” he says. “By minimizing the workflow of realizing the designs and animations in your head to an actual render, there can be more time to concentrate on the creative aspects.”
Ready to get started with Unreal Engine and Unreal Studio to enhance your creativity today? Download them for free right here.
Survios has been one of the most successful pioneers in the VR space. The Los Angeles-based developer has an impressive resume of critically-acclaimed VR games such as Raw Data and Sprint Vector. With its most recent release, Creed: Rise to Glory, garnering a ton of praise, the studio has not only created one of the best VR boxing games, but one of the best boxing games period. We recently had the chance to interview several members from the team, and in this post, Survios explains how they were able to solve many hard VR problems while producing a knockout title.
Feeling the Friction
Even though VR can offer unparalleled levels of immersion, watching your fists clip through an opponent can shatter the illusion. It’s a difficult problem to solve; after all, your opponents aren’t really in front of you to provide friction and resistance. This is why many other VR games avoid melee mechanics and instead rely on gunplay and archery for combat.
To overcome this issue, Survios needed to revolutionize melee for VR. Setting the stage, Lead Survios Engineer Eugene Elkin stated, “In our initial prototype, we set out two goals for ourselves: punching had to feel great and getting punched had to be impactful. We decided right away that the game would not be a straight boxing simulator, but a cinematic-inspired boxing experience. Despite a relatively compressed prototyping timeline, we were still able to create multiple gameplay iterations. The result of that investigation stage was the set of technological rules and techniques we dubbed ‘Phantom Melee Technology’.”
Explaining how the system overcomes melee clipping issues, Elkin elaborated, “At all times, there are essentially two separate player avatars that are contextually synced/desynced. One avatar is the representation of the player’s character and is bound by in-game physics like collision, hit reactions, and knockdowns. The second avatar—codenamed ‘Phantom’—always represents the player’s true position.” This separation is quite ingenious as it allows players to punch through opponents without ever feeling like you’re awkwardly clipping through them.
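The core of the two-avatar idea can be sketched in a few lines of standalone C++. This is a toy model, not Survios’ actual code: the opponent is reduced to a sphere, and `ResolveAvatarHand` is a hypothetical stand-in for the engine’s real collision response.

```cpp
#include <cmath>

// Two-avatar sketch: the "phantom" always mirrors the player's tracked
// hand, while the visible avatar's hand is clamped so it cannot pass
// through an opponent.
struct Vec3 {
    float x, y, z;
};

// Opponent modeled as a sphere for simplicity (an assumption for this sketch).
struct Opponent {
    Vec3 center;
    float radius;
};

float Distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// The phantom position is the player's true tracked hand; the rendered
// avatar hand is pushed back to the opponent's surface if the phantom
// clips inside.
Vec3 ResolveAvatarHand(const Vec3& phantomHand, const Opponent& foe) {
    float dist = Distance(phantomHand, foe.center);
    if (dist >= foe.radius || dist == 0.0f) {
        return phantomHand;  // no clipping: avatar and phantom stay in sync
    }
    // Desync: project the hand back onto the opponent's surface.
    float scale = foe.radius / dist;
    return Vec3{foe.center.x + (phantomHand.x - foe.center.x) * scale,
                foe.center.y + (phantomHand.y - foe.center.y) * scale,
                foe.center.z + (phantomHand.z - foe.center.z) * scale};
}
```

In a shipping title the desynced avatar hand would also drive hit reactions and haptics, but the principle is the same: the phantom tracks the player, the avatar obeys the world.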
Lead Designer Valerie Allen was inspired to develop this system after reading a sci-fi manga. As Allen explains, “There was a scene in one of the Battle Angel Alita volumes that involved her brain getting overclocked. In that scene, she zipped forward to deliver a punch, only to find herself crashing to the floor because her mental projection of what she was doing was so far ahead of what her body could handle. This is largely how Phantom Melee Technology works.”
Despite having separate avatars, combat never feels disjointed. Allen explains, “After playing around enough, players quickly start to acclimate, and rather than wasting their real-world effort punching through things, they start to treat the avatar’s arms more like their own, and thus react to the position of their virtual opponent like a real-life one.”
While Phantom Melee Tech solves one major VR issue, Survios still needed to deal with players who might try to “break” the game by constantly flailing their arms about, which is neither fun nor true to the sport, but may be effective. To solve this problem, Survios incorporates limited stamina. Allen elaborates on this design philosophy, “Throwing a lot of rapid punches leads to the avatar getting tired, so the player must focus on defending until the avatar’s stamina recovers.” The lead designer added, “The more we tested and tweaked stamina tuning, the more our gameplay started to look and feel like an actual boxing match.” Those with outstanding real-life endurance may balk at the inclusion of a virtual stamina system, but Allen explains, “While the avatar may tire out more quickly than the player does, the player isn’t the one experiencing the debilitating effects of being punched in the face and gut.”
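A stamina model along the lines Allen describes might look something like this minimal sketch. All costs and regeneration rates here are illustrative guesses, not Survios’ actual tuning.

```cpp
#include <algorithm>

// Minimal stamina sketch: each punch drains stamina, rapid bursts drain
// it faster, and it only recovers while the player is defending.
class Stamina {
public:
    explicit Stamina(float maxStamina = 100.0f)
        : max_(maxStamina), current_(maxStamina) {}

    // A punch costs more when thrown in quick succession (hypothetical rule).
    void OnPunch(float baseCost, int punchesThisSecond) {
        float cost = baseCost * (1.0f + 0.5f * punchesThisSecond);
        current_ = std::max(0.0f, current_ - cost);
    }

    // Stamina regenerates only while blocking/defending.
    void Tick(float deltaSeconds, bool isDefending) {
        if (isDefending) {
            current_ = std::min(max_, current_ + 20.0f * deltaSeconds);
        }
    }

    bool IsExhausted() const { return current_ <= 0.0f; }
    float Current() const { return current_; }

private:
    float max_;
    float current_;
};
```

Because each successive punch in a burst costs more, flailing exhausts the avatar quickly, while blocking restores the resource, nudging play toward real boxing rhythms.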
This inclusion of limited endurance also made online PvP more enjoyable. Elkin notes, “The stamina system became a very important tool to encourage players to block and defend, deploy their punches strategically, and treat it like an actual boxing match.”

Even though Rise to Glory is not the first VR boxing game, it is the first to feature online play. Early on in development, Creed wasn’t going to include multiplayer, but Survios knew the package wouldn’t feel complete without it. Adding online PvP to a melee-focused VR game while keeping it immersive and fun is extremely difficult. Elkin elaborates, “Unlike traditional fighting games where moves and abilities are predetermined, it’s extremely hard to predict how real-life players will behave in a PvP setting.” This challenge is heightened when you consider that Rise to Glory features full-body avatars. On the networking front, Multiplayer Engineer Eva Hopps added, “The biggest challenge we immediately knew we had to deal with was network lag. Since we couldn’t rely on the usual fighting game tricks to mask or compensate for it, we tweaked Phantom Melee’s fatigue-triggered slowing effect as our way of concealing lag from players.” Even though incorporating online play while creating a revolutionary new VR combat system was no small task, Hopps mentioned that, “for the most part, Unreal made this pretty easy for us.”
To ensure that the boxing felt realistic, Survios enlisted the help of professional boxers early on in development. Not only did the team heed their advice, but they signed up for boxing lessons. “To this day, we have boxing coaches come twice a week to our office for lessons, and that experience was invaluable for our designers and engineers in crafting a realistic boxing experience,” Elkin explained, elaborating, “Our marketing team also worked with Buzzfeed to have an Olympic-level boxer, Mikaela Mayer, play the Career mode on the hardest difficulty setting, and she was blown away at how similar the mechanics were to the real sport.”
To take the game’s realism to the next level and to reward players who really get into the action, Rise to Glory leverages VR’s accelerometers and motion sensors to track how hard players hit. Allen adds, “We check both the distance and speed of the player’s hand movements, and in some cases the angles as well. We tuned the values to require a reasonable amount of force to throw a punch, but not so much that players feel like they always have to punch as hard and fast as they possibly can for maximum impact.” As a result, Creed ends up being a good workout that takes “shadow boxing” to the next level. Even recovering from a knockdown requires players to exert physical energy to get back into the fight. While many past boxing games would often force players to quickly tap a button to recover, Rise to Glory does something wholly unique that gives players more agency than ever before. Allen explains, “I really liked the concept of the player getting hit so hard that they experience a sort of barely-conscious tunnel vision, and that it would require physical effort to run back to their body.” Taking a page from the studio’s past VR racing game Sprint Vector, once players are knocked down for the count, they get shot out into a dark tunnel and must swing their arms to run back to their bodies. Considering you can hear the ref audibly count to 10 in the background, it provides a tremendous sense of urgency and heightens the experience in a fun way that only VR can deliver.
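The distance-and-speed check Allen mentions can be approximated with a simple threshold function. This is a hedged sketch with made-up constants; the actual game also considers angles and per-punch context.

```cpp
#include <algorithm>
#include <cmath>

// Sketch of a punch-strength check: combine the hand's travel distance
// and speed, and only register a meaningful hit above tuned thresholds.
// All constants here are illustrative guesses, not the shipped tuning.
struct PunchSample {
    float distanceMeters;     // how far the hand traveled during the swing
    float speedMetersPerSec;  // peak hand speed during the swing
};

// Returns a 0..1 impact strength; 0 means the motion was too weak to count.
float PunchStrength(const PunchSample& p) {
    const float minDistance = 0.15f;  // hypothetical minimum travel
    const float minSpeed = 1.0f;      // hypothetical minimum speed
    const float maxSpeed = 6.0f;      // speed that maps to full power
    if (p.distanceMeters < minDistance || p.speedMetersPerSec < minSpeed) {
        return 0.0f;  // a twitch, not a punch
    }
    float t = (p.speedMetersPerSec - minSpeed) / (maxSpeed - minSpeed);
    return std::clamp(t, 0.0f, 1.0f);
}
```

Tuning the two minimums is what keeps wild micro-flailing from registering while still rewarding a committed, full-arm punch.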
Most VR games opt to render just virtual hands, because with only three points of tracking it is very hard to determine where the player’s elbows, torso, and feet are. To overcome this limitation and render players’ full bodies, Survios wrote a substantial amount of code to create believable inverse kinematics (IK). As Elkin notes, “The body IK system has gone through several iterations here at Survios. The current system is in active development and is shared among most of our projects, but is currently a custom animation solver that determines optimal joint location from three tracked inputs.” The lead engineer added, “Luckily, Unreal makes it pretty simple to create new IK Solvers, which is extremely powerful for us.”
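For readers curious what an animation solver that recovers joint locations from tracked inputs involves, here is the textbook building block such systems start from: an analytic two-bone IK solve, shown in 2D. This is purely illustrative and not the Survios solver.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative two-bone IK in 2D: given an upper-arm and forearm length
// and a target wrist position relative to the shoulder, recover the elbow
// bend with the law of cosines.
struct ArmPose {
    float shoulderAngle;  // radians from the x-axis
    float elbowAngle;     // interior elbow bend, radians (pi = straight)
    bool reachable;       // false if the target had to be clamped
};

ArmPose SolveTwoBoneIK(float upperLen, float forearmLen,
                       float targetX, float targetY) {
    float dist = std::sqrt(targetX * targetX + targetY * targetY);
    // Clamp to the reachable annulus so the solve never produces NaNs.
    float clamped = std::clamp(dist,
                               std::fabs(upperLen - forearmLen) + 1e-4f,
                               upperLen + forearmLen - 1e-4f);
    bool reachable = std::fabs(clamped - dist) < 1e-3f;

    // Law of cosines: c^2 = a^2 + b^2 - 2ab*cos(C)
    float cosElbow = (upperLen * upperLen + forearmLen * forearmLen -
                      clamped * clamped) / (2.0f * upperLen * forearmLen);
    float elbow = std::acos(std::clamp(cosElbow, -1.0f, 1.0f));

    float cosInner = (upperLen * upperLen + clamped * clamped -
                      forearmLen * forearmLen) / (2.0f * upperLen * clamped);
    float shoulder = std::atan2(targetY, targetX) -
                     std::acos(std::clamp(cosInner, -1.0f, 1.0f));
    return ArmPose{shoulder, elbow, reachable};
}
```

A production full-body solver layers many of these solves with joint limits, pole vectors, and heuristics for the untracked torso, which is where the real engineering effort goes.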
Considering the core Rise to Glory development team consisted of roughly nine developers, Survios has impressively managed to solve many difficult challenges with relatively few resources. Elkin attributes much of the success to the passion of his associates, “I’ve never worked on a more self-motivated team: everybody loved the project from the initial prototype and only wished we could have more time to continue working on it. While it may sound cliché, we truly were making a game that we wanted to play.”
With this being Survios’ third Unreal Engine 4-powered VR game, Elkin also attributes much of the studio’s success to improving upon an already strong UE4 foundation, “It’s extremely important for us to reuse technology between projects. Throughout the years and over the course of several projects, we have continued building out our custom toolset in UE4, and that has significantly sped up our dev cycles on new projects. Having access and the ability to change engine source code is also invaluable.” Hopps added, “UE4 also makes it easy to move from project to product with a consistent set of tools, so nothing on our end feels like it’s changing drastically as the project evolves from prototype to retail-ready.”
In terms of specific tools, Elkin had high praise for UE4’s debug stomp allocator, “Unreal has great tools for tracking nasty memory stomps and analyzing performance.” As a multiplayer engineer, Hopps praised UE4’s Profiler and noted how “amazingly easy it is to network things in Unreal.”
Continuing the Fight
While Rise to Glory marks Survios’ fourth VR title, Elkin asserts that we’re just at the frontier of VR gaming. “I think that we’re experiencing a time similar to the pioneering days of the ’80s when game developers were exploring, experimenting, and trying basically everything for the first time,” adding, “Right now, developers just don’t know for sure what will work or not in VR; all of the traditional knowledge of game development that has been acquired over decades is just not valid most of the time in VR development. VR is a unique beast and we’re just beginning to scratch its surface–but every day we discover something new, and it’s definitely a very exciting time to be developing in VR.”
Survios recently announced Creed: Rise to Glory’s first content update. Releasing November 27, the free update will feature two new free-play and PvP opponents: Danny “Stuntman” Wheeler and Viktor Drago. Both characters are the two primary antagonists from the upcoming Creed II film. For more information on Creed: Rise to Glory, make sure to visit survios.com/creed.
If you’re interested in creating your own VR game, you can download Unreal Engine for free today.
The National Academy of Television Arts and Sciences has awarded Epic Games with the first Technology and Engineering Emmy for Unreal Engine in the 2017-2018 category, “3D Engine Software for the Production of Animation.” We couldn’t be more thrilled.
Today Epic Games announced the latest recipients of Unreal Dev Grants, a $5 million fund supporting developers working with Unreal Engine 4 (UE4). This new round awards $800,000 to more than 30 individuals and teams, with no restrictions or obligations to Epic Games. As with previous rounds, these recipients illustrate the wide variety of use cases for UE4, including independent games, interactive visualizations, virtual reality surgical simulators and online learning resources.
“The Unreal Dev Grants program has a simple goal: to help talented developers succeed by letting them focus more on their project and less on their bills,” said Chance Ivey, Partnership Manager at Epic Games. “We’re continually amazed by the range of applications built with UE4 and the potential of so many of these projects; this round includes standouts such as Sojourn by Tierceron, Crab Rave by Noisestorm, and VR Cataract Training Solution by Surgical Mind. Congrats to all of these folks for their vision and persistence!”
The latest round of Unreal Dev Grants recipients includes:
FILM / CINEMA: 100 Flowers of God (working title) by 3rd World Studios – Website
3rd World Studios is the Pakistan-based creator of the first animated feature-length film rendered entirely in UE4, Allahyar and the Legend of Markhor, which was released in February to critical acclaim. This Unreal Dev Grant is meant to accelerate 3rd World’s future film projects.
TOOL / PLUGIN: Anomotion Motion Composer and Anomotion BIK – Website
Anomotion maintains two animation solutions for UE4: Motion Composer, a task-based motion planner which automatically generates precise motion sequences from unstructured animation data; and BIK, an inverse-kinematics system that can model various joint types and define custom constraints for VR avatars, virtual humans and creatures. Anomotion’s solutions have practical applications, from film previs to architectural visualizations. For industrial simulation and shared virtual environments, for example, Anomotion’s technology can be used to populate interactive, adaptive training environments with task-directed virtual characters.
FILM / CINEMA / VR: Awake: Episode One by Start VR – Trailer
Created by Start VR, Awake: Episode One is an interactive cinematic virtual reality experience for HTC Vive and Vive Pro. Awake: Episode One, which uses the latest volumetric capture techniques to bring real-life human performances into VR, officially premiered at SXSW and has been touring the festival circuit ever since. It’s coming soon to Steam.
INDEPENDENT GAME: Black Iris by Hexa Game Studio – Website
From Brazilian indie team Hexa Game Studio, Black Iris is an action RPG that takes inspiration from the Dark Souls series of games and Bloodborne. Black Iris is in development for PC and console.
INDEPENDENT GAME / AR: BOT-NET by Calvin Labs – Website
BOT-NET is a game that turns physical space into a first-person battlefield using a mobile device’s AR features. Massive robots fight while the player engages in ground combat with smaller robots. BOT-NET is available in the App Store.
FILM / CINEMA: Cine Tracer by Matt Workman – Steam
Developed by Matt Workman of Cinematography Database, Cine Tracer is a realistic cinematography simulator in which the player operates real world-based cameras, sets up lights, and directs talent within UE4 environments. Matt frequently livestreams Cine Tracer development at https://www.twitch.tv/cinegamedev. Creatives can use Cine Tracer to communicate lighting, cameras and storyboarding, and it’s available in Early Access on Steam.
INDEPENDENT GAME: Close to the Sun by Storm in a Teacup – Website
Developed by Rome-based Storm in a Teacup, Close to the Sun is a first-person horror game set in an alternate 1890s aboard a mysterious ship complex created by Nikola Tesla, where things are not as they seem. With numerous indie game accolades already under its belt, Close to the Sun is coming to PC and console in 2019.
TOOL / PLUGIN: coreDS Unreal by ds.tools – Website
coreDS Unreal facilitates integration of both High-Level Architecture (HLA) and Distributed Interactive Simulation (DIS) in UE4 games and applications. Users can integrate once and support HLA and DIS without any other modifications to their UE4 application. coreDS Unreal provides an extensive feature set that eases the integration process, allowing for reduced implementation time, flexibility and highly customizable simulation behavior.
INDEPENDENT GAME: Farm Folks by Overgrown – Trailer
Farm Folks is a successfully crowdfunded farming simulator game with a nod to the classic Harvest Moon series. Players can explore Softshoal Island, grow crops, raise livestock, build relationships and more – all the while uncovering the island’s mysteries. Farm Folks, coming to PC, is available for pre-order on Crytivo.
INDEPENDENT GAME / VR: Jupiter & Mars by Tigertron – Website
Jupiter & Mars is an underwater adventure game for PlayStation 4 and PlayStation VR with a powerful message about climate change, set in a shocking future world inspired by ecological events happening now. The player controls Jupiter, a dolphin with enhanced echolocation powers, traveling around the world with AI companion Mars to disable the man-made machinery disrupting marine life, while solving puzzles and encountering magnificent creatures along the way.
INDEPENDENT GAME / VR: Kaisuo by USC Games – Trailer
Kaisuo is a VR puzzle game in which players use fine motor dexterity to solve enigmatic Chinese puzzle boxes and unlock surreal, extraordinary spaces. Originally founded as an undergraduate student project named Lantern (now the name of the development team) at the University of Southern California, Kaisuo has been showcased at events such as the USC Games Expo and Indiecade @ E3, and is in development through the USC Bridge incubator program for full release on the Oculus and Steam stores.
INDEPENDENT GAME: Koral by Pantumaca Barcelona – Steam
Developed by Carlos Coronado, one of Barcelona’s leading UE4 experts, this beautiful PC game takes players on a dive through the underwater world where they play as the current on a mission to revive coral reefs. Solving puzzles heals the reefs and replenishes the ocean’s magic. In addition, Carlos’ new training materials on going from zero to expert in UE4 have marked Udemy’s most successful launch of a Spanish game development course in the site’s history.
FINE ARTS / VR: Lemieux Pilon 4D Art – Website
The renowned duo of Michel Lemieux and Victor Pilon (4D Art) are creating an immersive museum art piece for virtual reality using UE4.
INDEPENDENT GAME / VR: Mini World VR by Scaena Studios – Website
From Korea’s award-winning Chung Ang University 3D VR Lab, Scaena Studios’ Mini World VR is an immersive storytelling experience featuring elaborate hand-animated characters, game-based elements and intuitive interactivity. A cross between a game and a film, Mini World VR can be experienced from the perspective of both player and audience.
INDEPENDENT GAME: Mowin’ & Throwin’ by House Pixel Games – Steam
Available via Steam Early Access, Mowin’ & Throwin’ is a local multiplayer mashup of Bomberman meets Splatoon with a dash of Overcooked. Players control lawn gnomes in a race to wreck their opponent’s yard while keeping their own pristine. Victory goes to the best looking lawn! Mowin’ & Throwin’ is coming to party game collections for Nintendo Switch, PlayStation 4 and Xbox One in 2019.
FILM / CINEMA: Music Videos by Noisestorm – SoundCloud
Irish music producer and artist Noisestorm uses UE4 to create incredibly striking videos to accompany his musical tracks, which are often associated with trap, drum and bass, electro and dubstep. Now with nearly 10 million views, Crab Rave features thousands of CG crabs gathering after a tropical storm to dance it out. Noisestorm’s latest release, Breakout (feat. Foreign Beggars), depicts a tactical prison break with intense firefights, massive explosions, a high-energy helicopter chase and an amazing sniper shot.
TOOL / PLUGIN: Optim by Theia Interactive – Website
Currently in alpha, the Optim plugin applies an accelerated workflow methodology to Unreal Engine’s Datasmith suite of tools and services for enterprise markets. Leveraging the efficiency of Datasmith and the power of Python, artists and designers can use Optim to visualize and customize their Datasmith import process for further optimization.
INDEPENDENT GAME / VR: Planetrism VR by Planetrism Team – Gameplay
The future of humankind leads to the distant stars in this VR and PC adventure developed by Finnish duo Kimmo Kaunela and Mike Laaksonen. In Planetrism, players follow the opportunity of a lifetime to lead colonization on an uncharted planet, encountering untold mysteries while building a future for generations to come.
ARCHITECTURE / VR: Real Estate in Virtual Reality by REinVR – Website
The real estate technology team at REinVR is focused on using UE4 to build advanced immersive consumer buying experiences using digital humans, AI and VR.
INDEPENDENT GAME: Risk One’s Neck by Royce Games – Website
Developed by Korean indie team Royce Games for PC and consoles, Risk One’s Neck is a vintage arcade-style beat ’em up game set in a brutal, realistic urban environment. An homage to the Capcom arcade fighters of the 1980s, Risk One’s Neck channels thrilling gameplay for players of all skill levels.
FILM / CINEMA: Robots’ Design Academy by Eric Liu – Blog
A student film by Eric Liu, this 12-minute cinematic highlights the art of the possible when a single person sets out to do something wonderful. Powered by the drive and passion to create something spectacular, Eric created a wordless tale about creativity and daring to be different. It follows a robot student learning to design after most of humanity has become extinct from some unknown apocalypse. Dismayed by the institution’s insistence on strictly copying human creations perfectly, the droid protagonist sets out to design something bold and unique with the help of a newfound human pal.
LEARNING RESOURCE: Russian UE4 Lessons and Community – Website – YouTube
This incredible volunteer-driven resource for the Russian development community has been in operation since the public launch of UE4 in 2014. Featuring translations of exhaustive release notes for dozens of major engine updates, along with hundreds of localized tutorials — all created independently, and freely shared online — the group has well over 50,000 members across their networks, which also include popular Unreal Engine Discord and VK channels.
INDEPENDENT GAME: S.O.N by RedG Studios – Website
S.O.N is a modern-day psychological survival horror game in which a father searches for his son who has gone missing deep in the Pennsylvania forest, better known as South Of Nowhere. In a world where fear takes control and the past is never erased, questions linger about what demons must be faced to get back a loved one. S.O.N is coming to PlayStation 4.
INDEPENDENT GAME: Spellbreak by Proletariat Inc. – Website
With talent from game studios such as Harmonix, Turbine and Insomniac, Proletariat is bringing a magical twist to battle royale. Currently in pre-alpha on PC, Spellbreak puts a new spin on the genre with its fantasy art style and powerful magic spells that can be explosive when used in combat.
FILM / CINEMA: The Abyss by Kemal Günel – Video
This real-time short film depicts an ominous scenario aboard a desolate spaceship. Built using Kemal’s assets that are available on the Unreal Engine Marketplace, the project is also the basis for his popular UE4 Lighting tutorial series, which has 35 videos and counting.
INDEPENDENT GAME: The Cycle by YAGER – Website
Currently in Closed Alpha, The Cycle is the latest FPS game from Berlin-based YAGER. Up to 20 players go head to head to fulfill contracts during matches about 20 minutes in length. The Cycle is planned for PC release in early 2019 with support for consoles to follow.
AR / VR: The Hydrous presents: Immerse – Website
Jason McGuigan and his team at Horizon Productions have been on the bleeding edge of XR for several years, with a library of AR and VR projects built with UE4 under their belt. A pre-release version of Immerse took the stage at the recent Trojan Horse Was a Unicorn gathering in Malta presented by Dr. Erika Woolsey, CEO of the Hydrous. The Hydrous’ mission is to create open access oceans by bringing conservation education to the masses. Horizon also presented a high-fidelity VR art gallery created in Unreal Engine that featured almost 100 paintings by some of the world’s leading digital artists.
FINE ARTS / VR: The Kremer Collection Virtual Museum – Website
Designed by architect Johan van Lierop, Founder of Architales and Principal at Studio Libeskind, the Kremer Museum features 17th Century Dutch and Flemish Old Master paintings from the Kremer Collection and is accessible through Viveport, Steam and Oculus.
TOOL / PLUGIN: Tools and Plugins by VR Professionals – Video – Website
Russia-based VR Professionals are on a mission to create more affordable and accessible “out of the box” solutions for VR training and education using UE4. Having identified a desire for UE4 apps to be more deeply integrated into enterprise ecosystems, e.g., SQL databases, analytics, reports, LMS and CRM systems, VR Professionals are developing UE4 tools and plugins to help organizations deploy B2B apps faster and at lower cost.
FILM / CINEMA: Unannounced project by Kite & Lightning – Website
The recipient of the 2018 SIGGRAPH Best Real-Time Graphics and Interactivity Award at the recent Real-Time Live! showcase, Kite & Lightning wowed audiences with the presentation “Democratizing Mocap: Real-Time Full Performance Motion Capture with an iPhone X, Xsens, IKINEMA and Unreal Engine.” This Unreal Dev Grant is given in support of new breakthroughs in live performance-driven entertainment.
INDEPENDENT GAME: Unbound: Worlds Apart by Alien Pixel Studios – Steam
Unbound: Worlds Apart is an atmospheric 2D puzzle platformer in which the player can conjure magic portals to travel between different realities and learn more about a catastrophe that has ravaged his world. Inside certain portals, the physical properties of the character or world elements can change, offering new gameplay possibilities. A dark fairy tale with a cartoonish style, Unbound: Worlds Apart is planned for release on PC and consoles in 2020.
TOOL / PLUGIN: VR Cataract Training Solution by Surgical Mind – Video
Surgical Mind, a branch of Korea-based Mania Mind, is developing a cutting-edge VR simulator for cataract surgery to enable medical residents to better hone their skills before getting near an eye. Their team maintains that VR simulation training improves performance, minimizes risk and provides greater detail around potential scenarios more efficiently than expensive physical simulators.
To learn more about Unreal Dev Grants and to apply, visit: http://unrealengine.com/unrealdevgrants
Unreal Engine 4.21 continues our relentless pursuit of greater efficiency, performance, and stability for every project on any platform. We made it easier to work smarter and create faster because we want your imagination to be the only limit when using our tools. And we battle-tested the engine on every platform until it met our developers’ high standards so your project will shine once it is ready for the masses.
We are always looking for ways to streamline everyday tasks so developers can focus on creating meaningful, exciting, and engaging experiences. Our industry-leading Niagara effects toolset is now even more powerful and easier to use, enabling you to dream up the next generation of real-time visual effects. You can build multiplayer experiences on a scale not previously possible using the now production-ready Replication Graph functionality. Iterate faster thanks to optimizations with up to a 60% speed increase when cooking content, run automated tests to find issues using the new Gauntlet automation framework, and speed up your day-to-day workflows with usability improvements to the Animation system, Blueprint Visual Scripting, Sequencer, and more.
We strive to make it possible for your creations to be enjoyed as you intended by everyone, everywhere regardless of the form factor they choose. Building on the previous release, we have added even more optimizations developed for Fortnite on Android and iOS to further improve the process for developing for mobile devices. Available in Early Access, Pixel Streaming opens a whole new avenue to deploy apps in a web browser with no barrier to entry and no compromise on rendering quality. We have also improved support for Linux as well as augmented, virtual, and mixed reality devices.
In addition to all of the updates from Epic, this release includes 121 improvements submitted by the incredible community of Unreal Engine developers on GitHub! Thanks to each of these contributors to Unreal Engine 4.21:
Aaron Franke (aaronfranke), Adam Rehn (adamrehn), Adrian Courrèges (acourreges), aladenberger, Alan Liu (PicaroonX), Cengiz Terzibas (yaakuro), Cerwi, Chris Conway (Koderz), cmp-, Damianno19, Deep Silver Dambuster Studios (DSDambuster), Dorgon Chang (dorgonman), DSCriErr, Dzuelu, Eliot (BillEliot), Erik Dubbelboer (erikdubbelboer), fieldsJacksonG, Franco Pulido (francoap), Frank Stähr (phisigma), George Erfesoglou (nonlin), Hao Wang (haowang1013), Henri Hyyryläinen (hhyyrylainen), Homer D Xing (homerhsing), IlinAleksey, Jacob Nelson (JacobNelsonGames), Jerry Richards (r2d2Proton), Jesse Fish (erebuswolf), Josef Gluyas (Josef-CL), Joshua Harrington (thejhnz), Kalle Hämäläinen (kallehamalainen), KelbyG, Layla (aylaylay), LizardThief, Lucas Wall (lucaswall), Mai Lavelle (maiself), malavon, Marat Radchenko (slonopotamus), Marat Yakupov (moadib), Marco Antonio Alvarez (surakin), Markus Breyer (pluranium), marshal-it, Martin Gerhardy (mgerhardy), Mathias Hübscher (user37337), Michael Kösel (TheCodez), Morva Kristóf (KristofMorva), Muhammad A. Moniem (mamoniem), Nick Edwards (nedwardsnae), nrd2001, Oliver (oliverzx), phoenxin, projectgheist, Rene Rivera (grafikrobot), Rick Yorgason (Skrapion), Riley Labrecque (rlabrecque), Sahil Dhanju (Vatyx), Sam Bonifacio (Acren), scahp, Sébastien Rombauts (SRombauts), Tom Kneiphof (tomix1024), Troy Tavis (ttavis), Truthkey, UristMcRainmaker, Wiktor Lawski (wlawski), yhase7, Zeno Ahn (zenoengine)
Niagara Platform Support and Usability Improvements
In our continuing effort to provide industry-leading effects tools, Niagara has received an expanded feature set and substantial quality-of-life improvements, and Niagara effects are now supported on Nintendo Switch!
GPU-Only Texture Sampling in Niagara
You can now sample a 2D texture or a pseudo-volume 2D texture in your particle scripts! For example, you can capture the scene’s depth, color, and normal information using a Scene Capture Actor, then use that data to reconstruct the environment within a Niagara particle system, visualizing the particles’ potential and kinetic energy as emissive light.
Check out the Niagara level in the Content Examples project to see how this feature works!
Niagara Skeletal Mesh Data Interface Improvements
The Skeletal Mesh Data Interface has new functions that enable direct sampling of a Skeletal Mesh’s vertex data, as well as access to specific Bones or Sockets on the Skeletal Mesh.
Ribbon Particle Performance Improvements
Ribbons now generate the ribbon geometry on the GPU instead of the CPU, improving overall performance.
GPU Simulation Support in Niagara
GPU simulation of Niagara effects is now supported on all non-mobile platforms.
Simplified System and Emitter Creation
Niagara now includes friendly dialogs that make creating systems and emitters easier than ever! You can create new emitters and systems from a curated set of templates to speed up development and ensure best practices.
A new constraint solver works with physics forces, supports optional spring drivers, and includes potential energy calculation. You can use it to create exciting, dynamic effects, such as spawning particles with inherited velocity when the energy exceeds a specified threshold.
Module Additions and Improvements
- Generate and receive death events
- Now factoring mass into multiple modules
- New SampleSkeletalMeshSkeleton, SampleSkeletalMeshSurface, SkeletalMeshSkeletonLocation and SkeletalMeshSurfaceLocation modules to complement enhancements to the Skeletal Mesh Data Interface
- New AddVelocityInCone module
- New Force modules: FindKineticAndPotentialEnergy, GravityForce, SpringForce and multiple usability tweaks to other forces
- New KillParticlesInVolume module
- New SpriteRotationRate module
- New RecreateCameraProjection module for using render targets and camera transforms to turn scene captures into deformable particle systems
- New modules for sampling textures: SamplePseudoVolumeTexture, SampleTexture, SubUV_TextureSample, and WorldAlignedTextureSample
- New utility modules for temporal interpolation and frame counters
- Many new dynamic inputs and functions
New: Replication Graph
The Replication Graph Plugin makes it possible to customize network replication in order to build large-scale multiplayer games that would not be viable with traditional replication strategies. For example, Epic’s own Fortnite Battle Royale starts each match with 100 players and roughly 50,000 replicated actors. If each replicated actor were to determine whether or not it should update across each client connection, the impact on the server’s CPU performance would be prohibitive.
The Replication Graph Plugin solves this problem by offering an alternate strategy geared specifically for high-volume multiplayer games. This works by assigning actors to Replication Nodes, which store precalculated information that clients can use to retrieve lists of actors that need to be updated, saving the CPU of recalculating the same data for many clients on every frame. In addition to the standard nodes that ship with the Engine, developers can write their own nodes to fit the specific needs of actors within their games.
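The spatial bucketing idea behind Replication Nodes can be illustrated with a small standalone sketch. This is plain C++ and purely conceptual, not the engine's UReplicationGraph API: actors are assigned to grid cells once, and each connection retrieves the precomputed lists for nearby cells instead of testing every actor every frame.

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <utility>
#include <vector>

// Conceptual sketch (not the engine API): actors are bucketed into spatial
// cells, and a connection gathers only the lists for cells near its view point.
struct Actor { int id; double x, y; };

class ReplicationGrid {
public:
    explicit ReplicationGrid(double cellSize) : cellSize_(cellSize) {}

    void addActor(const Actor& a) {
        cells_[cellOf(a.x, a.y)].push_back(a.id);
    }

    // Gather actor ids in the 3x3 block of cells around a connection's view point.
    std::vector<int> gatherForConnection(double viewX, double viewY) const {
        std::vector<int> result;
        std::pair<int, int> center = cellOf(viewX, viewY);
        for (int dx = -1; dx <= 1; ++dx)
            for (int dy = -1; dy <= 1; ++dy) {
                auto it = cells_.find({center.first + dx, center.second + dy});
                if (it != cells_.end())
                    result.insert(result.end(), it->second.begin(), it->second.end());
            }
        return result;
    }

private:
    std::pair<int, int> cellOf(double x, double y) const {
        return { (int)std::floor(x / cellSize_), (int)std::floor(y / cellSize_) };
    }
    double cellSize_;
    std::map<std::pair<int, int>, std::vector<int>> cells_;
};
```

The per-cell lists play the role of the precalculated data described above: they are built once per actor move rather than recomputed per connection per frame.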
New: Optimizations for Shipping on Mobile Platforms
The mobile development process gets even better thanks to all of the mobile optimizations that were developed for Fortnite’s initial release on Android, in addition to all of the iOS improvements from our ongoing updates!
Improved Vulkan Support on Android
With the help of Samsung, Unreal Engine 4.21 includes all of the Vulkan engineering and optimization work that was done to help ship Fortnite on the Samsung Galaxy Note 9 and is 100% feature compatible with OpenGL ES 3.1. Projects that utilize Vulkan can run up to 20% faster than the same project that uses OpenGL ES.
Config Rules System for Android
The Android Config Rules system can now be used to catch issues early in a project's startup process. This tool quickly checks for device support and shows the user a warning or error dialog if problems are found, such as an out-of-date driver or an unsupported GPU. Any variables set can be queried later in C++ with FAndroidMisc::GetConfigRulesVariable(TEXT("variablename")).
To use this system, optionally place a configrules.txt rules file in your project's Build/Android directory; UPL is used to add a Gradle task that runs the ConfigRulesTool to compress (and optionally encrypt) it while packaging the APK. More details can be found in the engine documentation.
Program Binary Cache for Android
The Program Binary cache improves Shader loading performance and reduces hitching caused by Shader loading on Android devices. It works by generating optimized binary representations of Shader programs on the device, which are then used when loading Shaders during subsequent runs, dramatically decreasing load times. The Program Binary cache must be used in conjunction with the Shader Pipeline cache tool, which populates the Program Binary cache during the initial run of your application. To enable the Program Binary cache in your project, add the corresponding setting to your AndroidEngine.ini or Device Profile.
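The exact setting is not shown above. Based on the engine's console-variable naming for this feature (r.ProgramBinaryCache.Enable), the entry likely looks like the following; confirm the variable name against your engine version's documentation:

```ini
; Hypothetical example -- verify against your engine version before using.
[SystemSettings]
r.ProgramBinaryCache.Enable=1
```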
Note: Some devices do not support the required program binary extension; those devices will fall back to the previous behavior.
Emulated Uniform Buffers on Android
You can now use Emulated Uniform Buffers for projects that target the OpenGL ES3.1 feature level, significantly reducing memory usage and improving rendering performance depending on your application's complexity. Emulated Uniform Buffers have also been optimized to reduce the amount of data that needs to be transferred to the GPU when your project is packaged. To enable Emulated Uniform Buffers when using the OpenGL ES3.1 feature level, add the corresponding line to your project's configuration file.
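The line itself is not reproduced above. In UE4 this feature is controlled by the OpenGL.UseEmulatedUBs console variable, so the entry likely looks like the following; confirm against your engine version:

```ini
; Hypothetical example -- verify against your engine version before using.
[SystemSettings]
OpenGL.UseEmulatedUBs=1
```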
CPU Thread Affinity Control on Android
The ConfigRules system can register whether or not to use a supplied little-core affinity mask. If enabled, the following threads run on little cores: render, pool, taskgraph, stats, taskgraph background, and async loading. This improves battery life and evens out performance, since those threads no longer switch between big and little cores, which could cause hitches. For details on how to set this up, see the Config Rules documentation.
Improved GPU Particle Simulation Performance on Mobile
Mobile particle effects that utilize the GPU for particle simulation have been significantly improved. You now have the option of reducing memory usage by limiting the maximum number of simulated GPU particles. By default, up to around one million GPU particles can be simulated simultaneously, using around 32 MB of memory. You can adjust the maximum number of particles by adding the following settings to your project's configuration file:
[/Script/Engine.RendererSettings]
fx.GPUSimulationTextureSizeX=512
fx.GPUSimulationTextureSizeY=512
- Reducing the value from the default of 1024 to 512, as shown above, cuts the memory footprint to around 8 MB.
- The simulation texture size must be a power of two.
- These improvements are especially apparent on devices that use the ARM Mali GPU.
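The memory figures above follow from the simulation texture dimensions: particle capacity is width × height, at roughly 32 bytes of state per particle (derived from the stated ~1 million particles in ~32 MB). A quick, illustrative sketch of the arithmetic in plain C++:

```cpp
#include <cassert>

// Capacity of the GPU particle simulation for a square texture of the given size.
constexpr long long gpuParticleCapacity(int textureSize) {
    return (long long)textureSize * textureSize;
}

// Approximate memory use, assuming ~32 bytes of state per particle
// (derived from the ~1M particles / ~32 MB figures quoted above).
constexpr long long gpuParticleMemoryBytes(int textureSize) {
    return gpuParticleCapacity(textureSize) * 32;
}
```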
Dithered LOD Transitions
Dithered LOD transitions are now supported on mobile platforms. When enabled, objects whose Materials have the Dithered LOD Transitions option enabled will fade from one Level of Detail (LOD) to the next almost seamlessly. Support for Dithered LOD transitions is disabled by default on mobile platforms; to enable it, go to Project Settings > Rendering > Mobile and check the Allow Dithered LOD Transitions option.
Note: Materials with Dithered LOD transitions enabled are rendered as Masked Materials, which can have a negative performance impact on mobile platforms. We recommend enabling this effect only on Masked Materials.
New: Cooker Performance
The cooking process has been optimized resulting in up to 60% reductions in cook times! Low-level code now avoids performing unnecessary file system operations, and cooker timers have been streamlined. Handling of unsolicited Assets (with regard to Asset dependencies) has also been refactored to scale better. These changes are most pronounced on larger projects (projects with more than 100,000 Assets).
New: Pixel Streaming (Early Access)
Run a packaged Unreal Engine application on a desktop PC in the cloud, and stream the viewport directly to any modern web browser on any platform! Get the highest-quality rendering in lightweight apps, even on mobile devices, with zero download, zero install.
A viewport rendered by Unreal Engine, embedded within a web UI. Images and models courtesy of McLaren.
You can broadcast a single game session to multiple viewers by simply sharing a link, or send each connecting user to their own separate game session.
For details, see Pixel Streaming.
New: Animation System Optimizations and Improvements
The Animation System continues to build on its best-in-class features thanks to new workflow improvements, better surfacing of information, new tools, and more!
Animation Compression Updates
Animation compression times are significantly reduced by using a whitelist of optimal codecs, which avoids trying permutations that are unlikely to be selected and greatly reduces the number of codecs we attempt to compress with. On multicore systems, most of the codecs now evaluate in parallel during automatic compression, further reducing the time it takes to compress an animation sequence.
The following updates were made to the Animation Compression Stat Dialog window:
- Fixed bugs that caused the dialog to show incorrect results
- Added a compression time stat
- Added the number of compressed animations
- Added tracking for the animation with the largest average error
- Added tracking of the worst 10 items instead of just the single worst
- Better labeling on the dialog
- Pass the FBoneData array through more often instead of recalculating it
Please see Compression for more information.
Animation Notify Improvements
New Animation Notifies have been added that enable you to manage the state of dynamics and cloth simulations. We have also updated Notify add/replace menus to use class pickers for better searching of BP and native notifies. To add a Notify, right-click on a Notifies track, then under Add Notify, select the type of Notify you wish to add.
Please see Animation Notifications (Notifies) for more information.
Maintain Original Scale of Root Motion
Added the Use Normalized Root Motion Scale option to maintain the original scale of Root Motion. This option is enabled by default, matching the behavior that existed prior to this release; disabling it will use the final blended animation instead.
Please see Enabling Root Motion for more information.
Added Caching and Autocomplete for “Sync Marker” Names
When creating Sync Markers, you can now access any Sync Markers assigned to the Skeleton from the Existing Sync Markers menu option. Entering text into the search box will also filter out Sync Markers based on your text entry.
Animation Sequence Framerate
The framerate of Animation Sequences is now displayed in the Animation Tools viewport and Content Browser tooltip.
Enable Auto Blend Out on Anim Montages
Anim Montages now have the option to enable or disable Auto Blend Out. The option is enabled by default; disabling it prevents the Montage from automatically blending out, keeping the last pose instead.
Please see Montage Properties for more information.
CCDIK Skeletal Control Node
Use the new CCDIK (Cyclic Coordinate Descent Inverse Kinematics) Skeletal Control Node for a lightweight IK algorithm suited to real-time calculation of relatively short IK chains, such as shoulder to fingertip.
Please see CCDIK Skeletal Control Node for more information.
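To illustrate why CCD suits short chains, here is a minimal standalone 2D CCD solver in plain C++. This is a conceptual sketch of the algorithm only, not the engine's Skeletal Control Node: each pass rotates the joints from tip to root so the end effector swings toward the target.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Minimal 2D CCD IK sketch: each joint is an angle, all bones share length `len`.
struct Vec2 { double x, y; };

// Forward kinematics: end-effector position for the given joint angles.
static Vec2 endEffector(const std::vector<double>& angles, double len) {
    Vec2 p{0, 0};
    double a = 0;
    for (double ang : angles) {
        a += ang;
        p.x += len * std::cos(a);
        p.y += len * std::sin(a);
    }
    return p;
}

// Position of joint i (the base of bone i).
static Vec2 jointPos(const std::vector<double>& angles, double len, size_t i) {
    Vec2 p{0, 0};
    double a = 0;
    for (size_t j = 0; j < i; ++j) {
        a += angles[j];
        p.x += len * std::cos(a);
        p.y += len * std::sin(a);
    }
    return p;
}

// CCD: repeatedly rotate each joint (tip to root) so the effector
// moves toward the target.
void ccdSolve(std::vector<double>& angles, double len, Vec2 target, int iterations) {
    for (int it = 0; it < iterations; ++it) {
        for (size_t i = angles.size(); i-- > 0; ) {
            Vec2 j = jointPos(angles, len, i);
            Vec2 e = endEffector(angles, len);
            double toEffector = std::atan2(e.y - j.y, e.x - j.x);
            double toTarget = std::atan2(target.y - j.y, target.x - j.x);
            angles[i] += toTarget - toEffector;
        }
    }
}
```

Because each joint update is a single rotation with no matrix inversion, the cost per iteration is tiny, which is what makes CCD attractive for short real-time chains.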
Set Master Pose Component Force Update
The Set Master Pose Component function now has a second input pin, Force Update. When disabled, updating all runtime info is skipped if that info is the same as the Master Component's; when enabled, the runtime info update is forced. This only applies to the registration process, since registration can be serialized, at which point all runtime data will need to be refreshed.
Please see Master Pose Component for more information.
Miscellaneous Improvements and Updates
- Live Animation Blueprint Recompilation is now non-experimental
- Local Space is now the default Coordinate Space for Animation Editors
- A notification is now displayed in the Animation Tools viewport when a min LOD is being applied.
New: Gauntlet Automation Framework (Early Access)
The new early access Gauntlet automation framework enables you to automate the process of deploying builds to devices, running one or more clients and/or servers, and processing the results.
You can create Gauntlet scripts that automatically profile points of interest, validate gameplay logic, check return values from backend APIs, and more! Gauntlet has been battle tested for months in the process of optimizing Fortnite and is a key part of ensuring it runs smoothly on all platforms.
Gauntlet provides new profiling helpers that can record critical performance values between two points in time in order to track missed-Vsyncs, hitches, CPU Time, GPU Time, RHI Time, Draw Calls and more. Gauntlet also provides helper functions to gather these from logs so you can generate warnings, store them in databases, or create trend lines. All of the info captured during the test is available to be output into reports any way you want.
An example of a simple report is shown below:
Each Gauntlet test is a C# script that expresses a simple configuration for your test – how many clients, how many servers, and what parameters to pass. Gauntlet takes care of allocating machines from a pool, deploying and running builds, checking for common errors such as crashes, asserts, or timeouts, and collecting log files and other artifacts.
New: Submix Envelope Follower
Users of the new Unreal Audio Engine can now set an Envelope Follower Delegate on their Submixes, allowing amplitude analysis of the individual channels of that Submix. This will help users power visualizations and Blueprint events based on the amplitude characteristics of their submixed audio.
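Conceptually, an envelope follower smooths the absolute signal level with separate attack and release coefficients. A minimal standalone sketch in plain C++ (an illustration of the technique, not the engine's implementation):

```cpp
#include <cassert>
#include <cmath>

// One-pole envelope follower: tracks signal amplitude with separate
// attack and release smoothing time constants.
class EnvelopeFollower {
public:
    EnvelopeFollower(double sampleRate, double attackMs, double releaseMs)
        : attackCoef_(std::exp(-1.0 / (sampleRate * attackMs * 0.001))),
          releaseCoef_(std::exp(-1.0 / (sampleRate * releaseMs * 0.001))) {}

    // Feed one sample; returns the current envelope level.
    double process(double sample) {
        double level = std::fabs(sample);
        // Rise quickly (attack) when the signal exceeds the envelope,
        // decay slowly (release) when it falls below.
        double coef = level > env_ ? attackCoef_ : releaseCoef_;
        env_ = coef * env_ + (1.0 - coef) * level;
        return env_;
    }

private:
    double attackCoef_, releaseCoef_;
    double env_ = 0.0;
};
```

A delegate-based design like the one described above would periodically hand the per-channel envelope values to a callback, which is what makes them usable from Blueprints for visualization.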
New: Filter Sound Submix Effect
Users of the new Unreal Audio Engine now have the option of adding a multimode filter to their Submixes, allowing dynamic filter effects on a single Submix.
New: Sound Submix Effect Reverb Dry Level
The Submix Effect Reverb in the new Unreal Audio Engine now supports parallel Wet and Dry Levels, allowing users to dial in specific Wet/Dry ratios and making the effect usable as an Insert-Style Effect as well as a Send-Style Effect.
New: Optimizations to Source Effect API
The Source Effect API in the new Unreal Audio Engine has been optimized to process a full buffer of audio rather than frame-by-frame. This will allow Source Effects to process more efficiently than before.
New: Linux Defaults to Vulkan Renderer
Linux now uses Vulkan as the default renderer when available. If the API cannot be initialized, the Engine will fall back to OpenGL without notification.
From the Project Settings, you can use the Target RHIs list to add or disable a particular RHI, or use the command-line switches -vulkan and -opengl4 to disable the fallback.
New: Linux Media Player
You can now use the bundled WebMMedia plugin to play back .webm VP8/VP9 videos on Linux platforms.
New: Linux Crash Report Client GUI
We’ve added support for the Crash Reporter GUI on Linux so you can help us continue to improve support for Linux platforms. Please submit reports when they occur, even repeated ones! It helps our engineers assess the frequency and learn what circumstances cause the crash to happen.
New: Professional Video I/O Improvements (Early Access)
We continue to make it easier to get video feeds into and out of the Unreal Editor over professional quality SDI video cards. You can now work with the same Unreal Engine Project across multiple computers with different hardware setups, without changing any configuration settings in the Project.
Create a MediaProfile on each machine, and set it up to handle the video card and formats that you need to use on that computer. You can also override the Project’s timecode and genlock sources from the same panel:
When you combine the Media Profile with the new Proxy Media Source and Proxy Media Output Asset types, you can automatically redirect input and output channels between the Project’s media content and the settings in your Media Profile. When you switch to a different Media Profile — for example, on a different computer with a different media card or different wiring setup — the input and output channels from that machine’s hardware are automatically routed through the proxies so that you don’t have to change any content in your Project.
For details, see Using Media Profiles and Proxies.
In addition, this release adds:
- A dockable Timecode Provider panel (Window > Developer Tools > Timecode Provider) that shows the Unreal Engine’s current timecode and the source that timecode is coming from:
- Support for 10-bit input, audio I/O and interlaced/PsF inputs.
- A new Blackmagic Media Player Plugin that supports SDI cards from Blackmagic Design. See the Blackmagic Video I/O Quick Start.
Note: The AJA Media Player and Blackmagic Media Player Plugins are now available through the Marketplace tab in the Epic Games Launcher, instead of being installed automatically with the Unreal Engine. Their source is freely available on GitHub, to give other developers a model of how to develop video I/O plugins on top of the Engine’s Media Core APIs.
New: Geographically Accurate Sun Positioning (Early Access)
In the real world, the sun’s position in the sky depends on the latitude and longitude of the viewer, the date, and the time of day. You can now use the same mathematical equations to govern the sun’s position in your Unreal Engine Level.
This is particularly effective any time you need to simulate the real-world lighting conditions for a specific place on the Earth, such as for a major architectural or construction project. However, this can also be useful for any Level that you want to exhibit realistic sun movements and positioning based on global position and time of day.
For details, see Geographically Accurate Sun Positioning.
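The underlying math is the standard declination/hour-angle model. A simplified standalone sketch in plain C++ (an approximation for illustration, not the engine's implementation):

```cpp
#include <cassert>
#include <cmath>

// Approximate solar elevation in degrees from latitude, day of year, and
// local solar time, using the standard declination / hour-angle model.
double solarElevationDeg(double latitudeDeg, int dayOfYear, double solarHour) {
    const double kPi = 3.14159265358979323846;
    const double d2r = kPi / 180.0;
    // Solar declination (Cooper's approximation), peaks at +/-23.44 degrees.
    double decl = -23.44 * std::cos(d2r * 360.0 / 365.0 * (dayOfYear + 10));
    // Hour angle: the sun moves 15 degrees per hour relative to solar noon.
    double hourAngle = 15.0 * (solarHour - 12.0);
    double sinEl = std::sin(d2r * latitudeDeg) * std::sin(d2r * decl) +
                   std::cos(d2r * latitudeDeg) * std::cos(d2r * decl) *
                   std::cos(d2r * hourAngle);
    return std::asin(sinEl) / d2r;
}
```

Converting the elevation (and the matching azimuth formula) into a light direction is then just a spherical-to-Cartesian transform applied to the sun's directional light.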
New: Static Mesh Processing
We have added several new Static Mesh processing options inside the Unreal Editor. You can now save memory by removing unnecessary UV mappings from your Static Meshes.
In addition, using Python and Blueprint scripts that you run inside the Unreal Editor, you can now:
- Create UV mappings with planar, box, and cylindrical projections. See Working with UV Channels.
- Run the Proxy Geometry tool to merge and simplify groups of Static Meshes in a Level. See Using the Proxy Geometry Tool in Blueprints and Python.
- Reuse an LOD from one Static Mesh as an LOD for another Static Mesh. See Creating Levels of Detail in Blueprints and Python.
New: Blueprint Usability Improvements
The Blueprint Graph editor now features “Quick Jump” navigation, enhancing the existing bookmark feature: press CTRL + [0-9] to save your current location and zoom level in the Blueprint Editor, then press SHIFT + [0-9] to return to that graph at that location and zoom level whenever a Blueprint editor window is open, even when working in a different Blueprint Asset. “Quick Jump” bookmarks persist across Editor sessions and are local to the user/machine.
Users now have the ability to insert pins before or after a target pin for Sequence nodes via the context menu, rather than only being able to add them onto the end.
Monolithic engine header file exclusion from nativized Blueprint class C++ code is now available as a Project Setting. This can help reduce the overall code size of the monolithic game EXE file if code size is an issue. The option can be found under Project Settings > Packaging, in the Advanced section under the Blueprint Nativization Method option. It is disabled by default to maintain compatibility with existing projects.
New: Improvements to HTML5 Templates
The build process will automatically pick the project's template path, or otherwise fall back to the Engine's version.
This is based on GitHub PR#4780.
New: HTML5 README Files Updated
The HTML5 README file has been split up into multiple README files based on category:
- Building UE4 HTML5
- Get Source Files
- Compiling Support Programs
- Compiling UE4 Editor
- Run UE4 Editor
- Package The Project For HTML5
- Test The HTML5 Packaged Project
- Debugging UE4 HTML5
- How To Dump The Stack And Print From Cpp
- BugHunting GLSL
- Emscripten and UE4
- Emscripten toolchain and Thirdparty libraries
- UE4 C# scripts
- Test Build, Checking In, and CIS
New: Improved IPv6 Support
Support for IPv4 and IPv6 has been merged into a single socket subsystem, where previously support for each protocol was isolated in its own subsystem. This allows platforms that used one of the BSD subsystems to support both IPv4 and IPv6 at the same time, transparently to the calling code.
New: DDoS Detection and Mitigation
DDoS (distributed denial of service) attacks typically hinder game servers by flooding them with so many packets that they cannot process them all without locking up and/or drowning out other players’ packets, causing players to time out or suffer severe packet loss that hinders gameplay.
Typically these attacks use spoofed UDP packets, where the source IP is unverifiable. This optional DDoS detection focuses specifically on this situation, detecting/mitigating DDoS attacks based on configurable thresholds of spoofed UDP packets, which do not originate from an existing, known client connection. This is not a guarantee that servers will be safe from all attacks, since it’s still possible that a heavy attack can overwhelm the hardware or OS running the server.
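The threshold-based detection described above can be sketched as follows. This is a conceptual illustration in plain C++ with hypothetical names, not the engine's implementation: packets from unknown sources are counted per second, and an attack is flagged when the count crosses a configurable threshold.

```cpp
#include <cassert>
#include <set>
#include <string>

// Conceptual sketch of spoofed-UDP DDoS detection: packets from addresses
// without a known client connection count toward a per-second threshold.
class DDoSDetector {
public:
    explicit DDoSDetector(int packetsPerSecThreshold)
        : threshold_(packetsPerSecThreshold) {}

    // Record an established, verified client connection.
    void registerClient(const std::string& addr) { known_.insert(addr); }

    // Returns true once the current second's unknown-source packet count
    // exceeds the threshold.
    bool onPacket(const std::string& srcAddr, int second) {
        if (known_.count(srcAddr)) return false;  // verified connection, not counted
        if (second != currentSecond_) {           // new second: reset the window
            currentSecond_ = second;
            count_ = 0;
        }
        return ++count_ > threshold_;
    }

private:
    int threshold_;
    std::set<std::string> known_;
    int currentSecond_ = -1;
    int count_ = 0;
};
```

As the text notes, a scheme like this mitigates spoofed floods but cannot protect against attacks that saturate the hardware or OS before the server ever sees the packets.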
New: Physics Interface Updates
The Physics Interface has been refactored to give the high-level engine code greater ownership of physics objects. As a consequence, we have deprecated the Async Scene, which was only recommended for use with APEX Destruction; you can still achieve the same visual results using the Sync Scene.
As a result of these changes, much of the physics-related C++ API has changed, but it is functionally equivalent and you should be able to use it much as you do today. The goals of the refactor were to a) reorganize dependencies into one controlled place, and b) create a common model for physics interactions when interacting with Unreal.
Please see 4.21 Physics Technical Notes for more information.
New: Pipeline State Object (PSO) Caching
We now support Pipeline State Object (PSO) Caching on Metal (iOS/Mac), DX12, and Vulkan platforms. PSO caching helps reduce hitches that occur when a Material requires a new Shader to be compiled: it builds a list of all Shaders required by your Materials, which is then used to speed up compilation when those Shaders are first encountered by your project. PSO Caching can be enabled in the Project Settings > Packaging section.
To find out more about how to set up and use PSO caching in your UE4 project, check out the PSO Caching documentation.
New: Physical Lighting Units Updates
We have improved the workflow and usability for Physical Lighting Units based on feedback provided by the community. As part of these updates, the following changes have been made:
- All light types now display their units type next to the Intensity value.
- Directional Lights are now displayed in LUX with increased intensity range.
- Sky Light intensity is now displayed in cd/m² with increased intensity range.
- Post Process Auto-Exposure settings can be expressed in EV-100 for an extended range of scene luminance. This can be enabled via Project Settings.
- The Pixel Inspector can now display pre-exposure for Scene Color. This can be enabled via Project Settings.
- HDR (Eye Adaptation) Visualization has been refactored in the following ways:
- HDR Analysis picture-in-picture display over the current scene view allowing adjustments with instant feedback.
- Visualization is now expressed in EV100.
- Pixel Inspector-like feedback has been removed.
For additional information, see Physical Lighting Units.
New: Sequencer Event Track
The Sequencer Event Track has been completely refactored so that Events are now more tightly coupled to Blueprint graphs, making the user experience more familiar and more robust. By utilizing Blueprints and Interfaces, the new implementation offers better control and stability than the previous one, which used struct payloads and anonymous named events.
New: Geometry Cache Track (Experimental)
The new (and experimental) Geometry Cache Track allows you to scrub through a Geometry Cache and render it out with frame accuracy.
Please see Using the Geometry Cache Track for more information.
New: Sequencer Audio Bakedown (Early Access)
You can now bake down the audio into a Master Audio Submix from the Render Movie Settings window. The process of baking audio occurs in a separate render pass and exports the audio in the sequence to a single file when you render a movie.
Please see Render Movie Settings for more information.
New: Sequencer Guide Marks
You can now lay down vertical guide marks on the timeline to use for snapping or identifying key points in your timeline.
Please see Using Frame Markers in Sequencer for more information.
New: Windows Mixed Reality Support
Unreal Engine 4 now natively supports the Windows Mixed Reality (WMR) platform and headsets, such as the HP Mixed Reality headset and the Samsung HMD Odyssey headset. To use our native WMR support, you must be on the April 2018 Windows 10 update or later, and have a supported headset. For more information on how to get up and running, see Windows Mixed Reality Development.
Image courtesy of HP
New: Magic Leap Qualified Developer Release Support
Unreal Engine 4 now supports all the features needed to develop complete applications on Magic Leap’s Lumin-based devices. We support rendering, controllers, gesture recognition, audio input/output, media, and more. For more information on how to become a developer, please check out https://www.magicleap.com/.
New: Oculus Avatars
The Oculus Avatar SDK includes an Unreal package to assist developers in implementing first-person hand presence for the Rift and Touch controllers. The package includes avatar hand and body assets that are viewable by other users in social applications. The first-person hand models and third-person hand and body models supported by the Avatar SDK automatically pull the avatar configuration choices the user has made in Oculus Home to provide a consistent sense of identity across applications. For more information, see the Avatar SDK Developer Guide.
New: Round Robin Occlusions
Unreal Engine 4 now supports Round Robin Occlusions. With the newly added vr.RoundRobinOcclusion flag enabled, stereoscopic frames will kick off occlusion queries for one eye per frame using an alternating scheme (i.e. odd frames only kick off queries for the left eye, and even frames only kick off queries for the right). This approach cuts the number of occlusion draw calls per frame by half. In some situations, this improves performance significantly.
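The alternating scheme described above can be expressed as a tiny sketch in plain C++ (conceptual only, not engine code): on odd frames only the left eye's occlusion queries are issued, on even frames only the right eye's, halving the occlusion draw calls per frame.

```cpp
#include <cassert>

// Round-robin occlusion sketch: which eye issues occlusion queries this frame.
enum class Eye { Left, Right };

Eye eyeToQuery(unsigned frameNumber) {
    // Odd frames query the left eye, even frames the right.
    return (frameNumber % 2 == 1) ? Eye::Left : Eye::Right;
}
```

Each eye's visibility results are therefore one frame staler than with per-frame queries, which is why the trade-off helps most when occlusion query cost dominates.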
New: Platform SDK Upgrades
In every release, we update the Engine to support the latest SDK releases from platform partners.
- IDE Version the Build farm compiles against
- Visual Studio: Visual Studio 2017 v15.6.3 toolchain (14.13.26128) and Windows 10 SDK (10.0.12699.0)
- Minimum supported versions
- Visual Studio 2017 v15.6
- Visual Studio 2015 Update 3
- Minimum supported versions
- Xcode: Xcode 9.4
- Visual Studio: Visual Studio 2017 v15.6.3 toolchain (14.13.26128) and Windows 10 SDK (10.0.12699.0)
- Android NDK r14b (New CodeWorks for Android 1r7u1 installer will replace previous CodeWorks on Windows and Mac; Linux will use 1r6u1 plus modifications)
- HTML5: Emscripten 1.37.19
- Linux “SDK” (cross-toolchain):
- Lumin: 0.16.0
- Steam: 1.39
- SteamVR: 1.39
- Oculus Runtime: 1.28
- SDK 5.3.0 + optional NEX 4.4.2 (Firmware 5.0.0-4.0)
- SDK 6.4.0 + optional NEX 4.6.2 (Firmware 6.0.0-5.0)
- Supported IDE: Visual Studio 2017, Visual Studio 2015
- Firmware Version 6.008.021
- Supported IDE: Visual Studio 2017, Visual Studio 2015
- Xbox One:
- XDK: June 2018 QFE-4
- Firmware Version: June 2018 (version 10.0.17134.4056)
- Supported IDE: Visual Studio 2017
- macOS: SDK 10.14
- iOS: SDK 12
- tvOS: SDK 12