Ncam helps deliver seamless integration of live action and real-time 3D graphics in UE4

Although it can be easy to combine live action with real-time game engine elements in the Unreal Engine, complex virtual production often demands more data. For high-end projects, it is important not only to layer the live action with keying, but also to have access to detailed information about the exact camera position, the lens, and a depth map of the scene. 
Ncam offers a complete and customizable platform that enables virtual sets and graphics in real time. At its core is the company’s unique camera tracking solution, which delivers virtual and augmented graphics technology using a special camera hardware add-on and sophisticated software. The system uses a lightweight sensor bar attached to a camera to track natural features in the environment, allowing the camera to move freely in any location while generating a continuous stream of extremely precise positional, rotational, and lens information. This feeds into a UE4 plugin via Ncam’s powerful SDK.
The system can be used on any type of production, from indoor and outdoor shoots to wire rigs and handheld camera configurations. Ncam’s products are used worldwide, with credits including Aquaman (Warner Bros.), Solo: A Star Wars Story (Walt Disney Studios), Deadpool 2 (Marvel), Game of Thrones Season 8 (HBO), Super Bowl LIII (CBS), UEFA Champions League (BT Sport), NFC Championship Game (Fox Sports), and Monday Night Football (ESPN). 

Specialized hardware for accurate tracking

The heart of the system is a specialized piece of camera-mounted hardware. This small, lightweight sensor bar combines a suite of sensors. Most visible are the two stereo computer vision cameras; less obvious are the 12 additional sensors inside the bar, including accelerometers and gyroscopes. Together with the stereo camera pair, these let Ncam see the set in spatial depth, building a real-time 3D point cloud. 
The same hardware unit also interfaces with the various controls on the lens, such as a Preston follow-focus control, which means Ncam knows where the lens is, what it is looking at, what the focus and field of view are, and, importantly, where everything in front of the lens is located. The props, set, actors, camera, and lensing are all mapped and understood in real time. It is an extraordinary way to make the UE4 engine aware of the real world, integrating live graphical elements, characters, and sets into one seamless production while you watch.

Gathering data for predictive movement
    
From the outset, Ncam has relied on a fusion of techniques, including visual tracking, odometry, and inertial navigation, to solve the problem of camera tracking. The software does more than gather data, though: it interprets it, using the combined sensor stream for predictive movement and robust redundancy. It knows where the camera was and where it thinks it is going, and it handles any loss of useful signal from the cameras. If an actor blocks one of the stereo lenses, or even both, the system continues uninterrupted based on the remaining aggregate sensor data. 
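
As a rough sketch of how that predictive fallback can work in principle (this is illustrative C++, not Ncam’s algorithm; a production solver would use a proper sensor-fusion filter rather than simple constant velocity):

```cpp
// Minimal sketch: predictive tracking with sensor-dropout fallback.
// Illustrative only -- Ncam's real solver fuses many more sensors.
#include <array>

struct Pose {
    std::array<double, 3> position{};   // XYZ in meters
    std::array<double, 3> rotation{};   // pan/tilt/roll in degrees
};

class PosePredictor {
public:
    // Called whenever a fresh optical/inertial solve is available.
    void OnMeasurement(const Pose& measured, double dt) {
        if (hasPrevious_ && dt > 0.0) {
            for (int i = 0; i < 3; ++i) {
                velocity_.position[i] = (measured.position[i] - last_.position[i]) / dt;
                velocity_.rotation[i] = (measured.rotation[i] - last_.rotation[i]) / dt;
            }
        }
        last_ = measured;
        hasPrevious_ = true;
    }

    // Called when the vision cameras are blocked: extrapolate from the
    // last known pose and velocity so the output stream never stops.
    Pose Predict(double dt) const {
        Pose p = last_;
        for (int i = 0; i < 3; ++i) {
            p.position[i] += velocity_.position[i] * dt;
            p.rotation[i] += velocity_.rotation[i] * dt;
        }
        return p;
    }

private:
    Pose last_{};
    Pose velocity_{};
    bool hasPrevious_ = false;
};
```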

The software integrates all this data into one useful input to UE4. For example, while the computer vision cameras can run at up to 120 fps, the other sensors run at 250 fps, so the various data streams are retimed and normalized into one coherent, stable output that is clocked to the timecode of the primary production camera. 
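
A minimal sketch of the retiming idea, assuming time-stamped sensor readings resampled at each camera frame time by linear interpolation (the types and function here are hypothetical, not part of Ncam’s SDK):

```cpp
// Minimal sketch: resampling an asynchronous sensor stream onto the
// production camera's timecode. Hypothetical example code.
#include <algorithm>
#include <vector>

struct Sample { double t; double value; };  // timestamped sensor reading

// Interpolate a 250 fps (or 120 fps) stream at an exact frame time.
// Assumes a non-empty stream sorted by ascending timestamp.
double SampleAt(const std::vector<Sample>& stream, double frameTime) {
    auto hi = std::lower_bound(stream.begin(), stream.end(), frameTime,
        [](const Sample& s, double t) { return s.t < t; });
    if (hi == stream.begin()) return stream.front().value;  // before first reading
    if (hi == stream.end())   return stream.back().value;   // after last reading
    auto lo = hi - 1;
    double a = (frameTime - lo->t) / (hi->t - lo->t);
    return lo->value + a * (hi->value - lo->value);
}
```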

Some sets have very challenging lighting, so Ncam offers an option to run the cameras in infrared mode for strobing or flashing-light scenes. The system is also designed for low latency, so a camera operator can watch the composited output of the live action and the UE4 graphics as a combined shot, for much more accurate framing and blocking. It is much easier to line up the shot of the knight and the dragon if you can see the whole scene, and not just a guy in armor alone on a green soundstage.

Precise lens calibration and matching

The camera tracking resolves to six degrees of freedom: XYZ position plus three axes of rotation. Added to this is the production camera’s lens data. Beyond focus, iris, and zoom, Ncam has to know the correct lens curvature, or distortion, across all possible zoom, focus, and iris adjustments for the UE4 graphics to match the live action perfectly. Any wide lens clearly bends the image, producing curved lines that would otherwise be straight in the real world. All the real-time graphics have to match this frame by frame, so the lens properties are mapped on a per-serial-number basis. Every lens is different, so while a production may start with a template for, say, a Cooke 32mm S4/i lens, Ncam provides a lens calibration system that compensates for individual variations. 
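
For a concrete picture of what “matching the distortion” means, here is a minimal sketch of the widely used Brown-Conrady radial model; the k1/k2 coefficients would come from a per-serial-number calibration, and this is illustrative rather than Ncam’s implementation:

```cpp
// Minimal sketch of radial lens distortion (Brown-Conrady model).
// The k1/k2 coefficients would come from per-lens calibration data;
// nothing here is Ncam-specific.
#include <cmath>

struct Point2D { double x, y; };

// Distort an ideal (pinhole) image point so CG renders bend the same
// way the physical lens does. Coordinates are normalized, with the
// optical center at the origin.
Point2D ApplyRadialDistortion(Point2D p, double k1, double k2) {
    double r2 = p.x * p.x + p.y * p.y;          // squared radius from center
    double scale = 1.0 + k1 * r2 + k2 * r2 * r2; // radial polynomial
    return { p.x * scale, p.y * scale };
}
```

In practice the coefficients also vary with zoom and focus, which is why the calibration has to cover the lens’s whole optical range rather than a single setting.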

Ncam is compatible with systems such as ARRI’s Lens Data System (LDS), but those systems typically don’t provide image distortion over the entire optical range of the lens. At the start of a project, productions can calibrate their own lenses with Ncam’s proprietary system of charts and tools to map each lens’s barrel and pincushion distortion, and then simply reference the results by serial number.  

In the end, the system produces stable, smooth, accurate information that can perfectly align real-time graphics with live-action material. “We spent a lot of time working to fuse the various technologies of all those different sensors,” explains Ncam founder Nic Hatch. “I guess that’s sort of our secret sauce and why it works so well.” 

Integrating CG elements with Real Depth

The other huge benefit of Ncam is depth understanding. When elements are combined in UE4, the engine knows where the live action sits relative to the UE4 camera, thanks to Ncam’s “Real Depth”. This allows someone to be filmed walking in front of or behind UE4 graphical elements or virtual sets. Without depth information, video can only sit like a flat card in UE4. With Ncam, as the actor walks forward on set, they walk forward in UE4, passing objects at the correct distance. This adds enormous production value and integrates the live action and real-time graphics in a dramatically more believable way. This one feature alone transforms what Ncam makes possible in motion graphics, explanatory news sequences, and narrative sequences.
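
Conceptually, depth-aware compositing comes down to a per-pixel nearness test. The sketch below is an illustrative stand-in for what Real Depth enables, not Ncam’s actual pipeline (it also ignores keying and soft edges):

```cpp
// Minimal sketch of depth-based compositing: for each pixel, show the
// live-action plate only where it is nearer to camera than the CG.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Layer {
    std::vector<std::uint32_t> color;  // packed RGBA per pixel
    std::vector<float>         depth;  // distance from camera per pixel
};

void CompositeByDepth(const Layer& liveAction, const Layer& cg,
                      std::vector<std::uint32_t>& out) {
    out.resize(cg.color.size());
    for (std::size_t i = 0; i < out.size(); ++i) {
        // Nearer surface wins -- so an actor can pass in front of or
        // behind virtual set pieces, decided pixel by pixel.
        out[i] = (liveAction.depth[i] < cg.depth[i]) ? liveAction.color[i]
                                                     : cg.color[i];
    }
}
```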
“Game engine integration has always been very important to us,” says Hatch. “At a tradeshow in 2016 we showed, I think, the first prototype of this kind of live action integrated with the Unreal Engine, so we have a pretty close relationship.” The company has doubled its staff in the last year, and the biggest proportion of Ncam’s staff is involved in R&D. A key part of their development effort is building APIs and links into software such as UE4 for the most efficient virtual production pipeline. “The complexity of what we are doing requires a large amount of R&D,” he adds.

Advanced real-world lighting matching with Real Light

While the primary focus so far has been on Ncam understanding the space in front of the camera and what the camera is doing, the company also has an advanced tool for understanding the lighting of the scene. Its “Real Light” project places a live light probe in the scene to inform the UE4 engine of changing light levels and directions. 

Real Light is designed to solve the challenge of making virtual assets look like they are part of the real-world scene. It captures real-world lighting in terms of direction, color, intensity, and HDR maps, allowing the Unreal Engine to adapt to each and every lighting change. Importantly, it also understands the depth and position of the light sources in the scene, so the two worlds interact correctly. This means digital assets can fit technically and look correctly lit, a major advance in integrating game assets with live action. 
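
As a hypothetical sketch of that data flow (the types are invented for illustration, and the HDR map capture Real Light also performs is omitted), each captured probe sample could drive a matching engine light every frame:

```cpp
// Minimal sketch: feeding live light-probe samples to an engine light.
// Hypothetical types; not Ncam's API, and HDR environment maps are omitted.
#include <array>

struct ProbeSample {
    std::array<float, 3> direction;  // dominant light direction on set
    std::array<float, 3> color;      // linear RGB
    float                intensity;  // captured brightness
    std::array<float, 3> position;   // where the source sits in the scene
};

struct VirtualLight {
    std::array<float, 3> direction, color, position;
    float                intensity;
};

// Each frame, push the latest captured lighting onto the CG light so
// virtual assets react to on-set lighting changes as they happen.
void UpdateLight(VirtualLight& light, const ProbeSample& sample) {
    light.direction = sample.direction;
    light.color     = sample.color;
    light.intensity = sample.intensity;
    light.position  = sample.position;
}
```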

Interested in finding out about more new technology, techniques, and best practices that are changing the game for on-set production? Head on over to our Virtual Production hub, or check out our other posts relating to broadcast.
 

Buddy VR Pioneers A New Genre of Interactive Animation

When it comes to using animations for marketing and brand engagement, many VR film projects currently on the market focus on providing an immersive one-off experience to captivate viewers in the moment. Rather than a mere afterthought, replayability is an essential ingredient for global VFX and animation studio Redrover, which is exploring fresh ways to engage viewers on a deeper level by combining story, gameplay, and greater interactivity.
  
Buddy VR – the team’s VR film spinoff of its Hollywood animated blockbuster, The Nut Job – recently took home the Best VR Experience Award at the Venice International Film Festival this fall. The project is part of Redrover’s vision to create a new genre of interactive animation, and what makes Buddy VR especially unique is the way it bridges the gap between animated short films and video game experiences.
 

A virtual interactive friendship

Starring “Buddy,” the lovable blue rat from The Nut Job, this vibrant interactive animated short has you meeting and befriending the little critter in a whimsical story that balances plot and gameplay elements. “We wanted to lead the story through intimacy between the player and character,” explains Chuck Chae, Director for Buddy VR. 

Players get to know Buddy through a series of non-verbal interactions like exchanging names, petting, playing musical instruments, and more. It’s a humorous, heartwarming 16-minute interactive experience, and the response from those who have played it is overwhelmingly positive, he adds.

“Simply experiencing VR offers the player an extraordinary experience, and provides deep immersion while wearing VR equipment. However, many VR titles on the market are difficult to enjoy again once they have been played through the first time,” says Chae. “Our goal is to break away from this approach and produce titles that can maintain their replayability throughout lengthy and multiple playthroughs by combining Redrover’s IP and VR technology with interactive elements.”

Optimizing creative potential with Unreal Engine

For this project, creating cohesive story interaction through speechless communication required the team to weave extra layers of detail and nuance into Buddy’s facial expressions, physical actions, and eye movements. Using Unreal Engine gave the team the tools and additional programming flexibility to craft a level of real-time interactivity and realism that could foster a believable relationship-building experience between players and the furry protagonist, says Chae.

“High-quality graphics and animations are essential for creating speechless interaction, which is the highlight of our product. It was amazing enough that Unreal Engine easily fulfilled our graphical standards, but it also had unbelievable real-time functionalities, allowing us to apply desired animations, edit unnatural or incorrect aspects, and then reapply to view the results all in one sitting,” says Chae, adding that the team was able to minimize production time using real-time rendering.

Optimizing their production workflows using real-time rendering also helped free up more of the team’s time and energy for creativity. “The greatest strengths of Unreal Engine are the ability to quickly make a prototype using codeless Blueprints and the ability to create high-quality graphic productions through real-time rendering,” he says. “By minimizing the workflow of realizing the designs and animations in your head to an actual render, there can be more time to concentrate on the creative aspects.” 

Ready to get started with Unreal Engine and Unreal Studio to enhance your creativity today? Download them for free right here.

Epic Games Announces over $800K in Unreal Dev Grants

Today Epic Games announced the latest recipients of Unreal Dev Grants, a $5 million fund supporting developers working with Unreal Engine 4 (UE4). This new round awards $800,000 to more than 30 individuals and teams, with no restrictions or obligations to Epic Games. As with previous rounds, these recipients illustrate the wide variety of use cases for UE4, including independent games, interactive visualizations, virtual reality surgical simulators and online learning resources.

“The Unreal Dev Grants program has a simple goal: to help talented developers succeed by letting them focus more on their project and less on their bills,” said Chance Ivey, Partnership Manager at Epic Games. “We’re continually amazed by the range of applications built with UE4 and the potential of so many of these projects; this round includes standouts such as Sojourn by Tierceron, Crab Rave by Noisestorm, and VR Cataract Training Solution by Surgical Mind. Congrats to all of these folks for their vision and persistence!”

The latest round of Unreal Dev Grants recipients includes:

FILM / CINEMA: 100 Flowers of God (working title) by 3rd World Studios – Website 
3rd World Studios is the Pakistan-based creator of the first animated feature-length film rendered entirely in UE4, Allahyar and the Legend of Markhor, which was released in February to critical acclaim. This Unreal Dev Grant is meant to accelerate 3rd World’s future film projects.

TOOL / PLUGIN: Anomotion Motion Composer and Anomotion BIK – Website
Anomotion maintains two animation solutions for UE4: Motion Composer, a task-based motion planner which automatically generates precise motion sequences from unstructured animation data; and BIK, an inverse-kinematics system that can model various joint types and define custom constraints for VR avatars, virtual humans and creatures. Anomotion’s solutions have practical applications, from film previs to architectural visualizations. For industrial simulation and shared virtual environments, for example, Anomotion’s technology can be used to populate interactive, adaptive training environments with task-directed virtual characters.

FILM / CINEMA / VR: Awake: Episode One by Start VR – Trailer 
Created by Start VR, Awake: Episode One is an interactive cinematic virtual reality experience for HTC Vive and Vive Pro. Awake: Episode One, which uses the latest volumetric capture techniques to bring real-life human performances into VR, officially premiered at SXSW and has been touring the festival circuit ever since. It’s coming soon to Steam.

INDEPENDENT GAME: Black Iris by Hexa Game Studio – Website
From Brazilian indie team Hexa Game Studio, Black Iris is an action RPG that takes inspiration from the Dark Souls series and Bloodborne. Black Iris is in development for PC and console. 

INDEPENDENT GAME / AR: BOT-NET by Calvin Labs – Website
BOT-NET is a game that turns physical space into a first-person battlefield using a mobile device’s AR features. Massive robots fight while the player engages in ground combat with smaller robots. BOT-NET is available in the App Store.

FILM / CINEMA: Cine Tracer by Matt Workman – Steam
Developed by Matt Workman of Cinematography Database, Cine Tracer is a realistic cinematography simulator in which the player operates cameras modeled on real-world hardware, sets up lights, and directs talent within UE4 environments. Matt frequently livestreams Cine Tracer development at https://www.twitch.tv/cinegamedev. Creatives can use Cine Tracer to communicate lighting, cameras, and storyboarding, and it’s available in Early Access on Steam.

INDEPENDENT GAME: Close to the Sun by Storm in a Teacup – Website
Developed by Rome-based Storm in a Teacup, Close to the Sun is a first-person horror game that takes place in an alternate version of history in the 1890s aboard a mysterious ship complex created by Nikola Tesla where things are not as they seem. With numerous indie game accolades already under its belt, Close to the Sun is coming to PC and console in 2019.

TOOL / PLUGIN: coreDS Unreal by ds.tools – Website 
coreDS Unreal facilitates integration of both High-Level Architecture (HLA) and Distributed Interactive Simulation (DIS) in UE4 games and applications. Users can integrate once and support HLA and DIS without any other modifications to their UE4 application. coreDS Unreal provides an extensive feature set that eases the integration process, allowing for reduced implementation time, flexibility and highly customizable simulation behavior.

INDEPENDENT GAME: Farm Folks by Overgrown – Trailer
Farm Folks is a successfully crowdfunded farming simulator game with a nod to the classic Harvest Moon series. Players can explore Softshoal Island, grow crops, raise livestock, build relationships and more – all the while uncovering the island’s mysteries. Farm Folks, coming to PC, is available for pre-order on Crytivo.

INDEPENDENT GAME / VR: Jupiter & Mars by Tigertron – Website
Jupiter & Mars is an underwater adventure game for PlayStation 4 and PlayStation VR with a powerful message about climate change, set in a shocking future world inspired by ecological events happening now. The player controls Jupiter, a dolphin with enhanced echolocation powers, traveling around the world with AI companion Mars to disable the man-made machinery disrupting marine life, solving puzzles and encountering magnificent creatures along the way. 

INDEPENDENT GAME / VR: Kaisuo by USC Games – Trailer
Kaisuo is a VR puzzle game in which players use fine motor dexterity to solve enigmatic Chinese puzzle boxes and unlock surreal, extraordinary spaces. Originally founded as an undergraduate student project named Lantern (now the name of the development team) at the University of Southern California, Kaisuo has been showcased at events such as the USC Games Expo and Indiecade @ E3, and is in development through the USC Bridge incubator program for full release on the Oculus and Steam stores.

INDEPENDENT GAME: Koral by Pantumaca Barcelona – Steam 
Developed by Carlos Coronado, one of Barcelona’s leading UE4 experts, this beautiful PC game takes players on a dive through the underwater world where they play as the current on a mission to revive coral reefs. Solving puzzles heals the reefs and replenishes the ocean’s magic. In addition, Carlos’ new training materials on going from zero to expert in UE4 have marked Udemy’s most successful launch of a Spanish game development course in the site’s history.

FINE ARTS / VR: Lemieux Pilon 4D Art – Website
The renowned duo of Michel Lemieux and Victor Pilon (4D Art) are creating an immersive museum art piece for virtual reality using UE4. 

INDEPENDENT GAME / VR: Mini World VR by Scaena Studios – Website
From Korea’s award-winning Chung Ang University 3D VR Lab, Scaena Studios’ Mini World VR is an immersive storytelling experience featuring elaborate hand-animated characters, game-based elements and intuitive interactivity. A cross between a game and a film, Mini World VR can be experienced from the perspective of both player and audience.

INDEPENDENT GAME: Mowin’ & Throwin’ by House Pixel Games – Steam
Available via Steam Early Access, Mowin’ & Throwin’ is a local multiplayer mashup of Bomberman meets Splatoon with a dash of Overcooked. Players control lawn gnomes in a race to wreck their opponent’s yard while keeping their own pristine. Victory goes to the best-looking lawn! Mowin’ & Throwin’ is coming to party game collections for Nintendo Switch, PlayStation 4, and Xbox One in 2019.

FILM / CINEMA: Music Videos by Noisestorm – SoundCloud
Irish music producer and artist Noisestorm uses UE4 to create incredibly striking videos to accompany his musical tracks, which are often associated with trap, drum and bass, electro and dubstep. Now with nearly 10 million views, Crab Rave features thousands of CG crabs gathering after a tropical storm to dance it out. Noisestorm’s latest release, Breakout (feat. Foreign Beggars), depicts a tactical prison break with intense firefights, massive explosions, a high-energy helicopter chase and an amazing sniper shot. 

TOOL / PLUGIN: Optim by Theia Interactive – Website
Currently in alpha, the Optim plugin applies an accelerated workflow methodology to Unreal Engine’s Datasmith suite of tools and services for enterprise markets. Leveraging the efficiency of Datasmith and the power of Python, artists and designers can use Optim to visualize and customize their Datasmith import process for further optimization.

INDEPENDENT GAME / VR: Planetrism VR by Planetrism Team – Gameplay
The future of humankind lies among the distant stars in this VR and PC adventure developed by Finnish duo Kimmo Kaunela and Mike Laaksonen. In Planetrism, players pursue the opportunity of a lifetime to lead colonization of an uncharted planet, encountering untold mysteries while building a future for generations to come.

ARCHITECTURE / VR: Real Estate in Virtual Reality by REinVR – Website
The real estate technology team at REinVR is focused on using UE4 to build advanced immersive consumer buying experiences using digital humans, AI and VR.

INDEPENDENT GAME: Risk One’s Neck by Royce Games – Website
Developed by Korean indie team Royce Games for PC and consoles, Risk One’s Neck is a vintage arcade-style beat ’em up game set in a brutal, realistic urban environment. An homage to the Capcom arcade fighters of the 1980s, Risk One’s Neck channels thrilling gameplay for players of all skill levels.

FILM / CINEMA: Robots’ Design Academy by Eric Liu – Blog
A student film by Eric Liu, this 12-minute cinematic highlights the art of the possible when a single person sets out to do something wonderful. Powered by the drive and passion to create something spectacular, Eric created a wordless tale about creativity and daring to be different. It follows a robot student learning to design after most of humanity has become extinct from some unknown apocalypse. Dismayed by the institution’s insistence on strictly copying human creations perfectly, the droid protagonist sets out to design something bold and unique with the help of a newfound human pal.

LEARNING RESOURCE: Russian UE4 Lessons and Community – Website, YouTube 
This incredible volunteer-driven resource for the Russian development community has been in operation since the public launch of UE4 in 2014. Featuring translations of exhaustive release notes for dozens of major engine updates, along with hundreds of localized tutorials — all created independently, and freely shared online — the group has well over 50,000 members across their networks, which also include popular Unreal Engine Discord and VK channels.

INDEPENDENT GAME: S.O.N by RedG Studios – Website 
S.O.N is a modern-day psychological survival horror game in which a father searches for his son, who has gone missing deep in the Pennsylvania forest better known as South Of Nowhere. In a world where fear takes control and the past is never erased, questions linger around what demons must be faced to get back a loved one. S.O.N is coming to PlayStation 4.

INDEPENDENT GAME: Spellbreak by Proletariat Inc. – Website
With talent from game studios such as Harmonix, Turbine and Insomniac, Proletariat is bringing a magical twist to battle royale. Currently in pre-alpha on PC, Spellbreak puts a new spin on the genre with its fantasy art style and powerful magic spells that can be explosive when used in combat.

FILM / CINEMA: The Abyss by Kemal Günel – Video
This real-time short film depicts an ominous scenario aboard a desolate spaceship. Built using Kemal’s assets that are available on the Unreal Engine Marketplace, the project is also the basis for his popular UE4 Lighting tutorial series, which has 35 videos and counting.

INDEPENDENT GAME: The Cycle by YAGER – Website
Currently in Closed Alpha, The Cycle is the latest FPS game from Berlin-based YAGER. Up to 20 players go head to head to fulfill contracts during matches about 20 minutes in length. The Cycle is planned for PC release in early 2019 with support for consoles to follow.

AR / VR: The Hydrous presents: Immerse – Website
Jason McGuigan and his team at Horizon Productions have been on the bleeding edge of XR for several years, with a library of AR and VR projects built with UE4 under their belt. A pre-release version of Immerse took the stage at the recent Trojan Horse Was a Unicorn gathering in Malta presented by Dr. Erika Woolsey, CEO of the Hydrous. The Hydrous’ mission is to create open access oceans by bringing conservation education to the masses. Horizon also presented a high-fidelity VR art gallery created in Unreal Engine that featured almost 100 paintings by some of the world’s leading digital artists.

FINE ARTS / VR: The Kremer Collection Virtual Museum – Website
Designed by architect Johan van Lierop, Founder of Architales and Principal at Studio Libeskind, the Kremer Museum features 17th Century Dutch and Flemish Old Master paintings from the Kremer Collection and is accessible through Viveport, Steam and Oculus. 

TOOL / PLUGIN: Tools and Plugins by VR Professionals – Video, Website
Russia-based VR Professionals are on a mission to create more affordable and accessible “out of the box” solutions for VR training and education using UE4. Having identified a desire for UE4 apps to be more deeply integrated into enterprise ecosystems, e.g., SQL databases, analytics, reports, LMS and CRM systems, VR Professionals are developing UE4 tools and plugins to help organizations adopt B2B apps faster and at lower cost. 
 

FILM / CINEMA: Unannounced project by Kite & Lightning – Website
The recipient of the 2018 SIGGRAPH Best Real-Time Graphics and Interactivity Award at the recent Real-Time Live! showcase, Kite & Lightning wowed audiences with the presentation “Democratizing Mocap: Real-Time Full Performance Motion Capture with an iPhone X, Xsens, IKINEMA and Unreal Engine.” This Unreal Dev Grant is given in support of new breakthroughs in live performance-driven entertainment.

INDEPENDENT GAME: Unbound: Worlds Apart by Alien Pixel Studios – Steam
Unbound: Worlds Apart is an atmospheric 2D puzzle platformer in which the player can conjure magic portals to travel between different realities and learn more about a catastrophe that has ravaged his world. Inside certain portals, the physical properties of the character or world elements can change, offering new gameplay possibilities. A dark fairy tale with a cartoonish style, Unbound: Worlds Apart is planned for release on PC and consoles in 2020.

TOOL / PLUGIN: VR Cataract Training Solution by Surgical Mind – Video 
Surgical Mind, a branch of Korea-based Mania Mind, is developing a cutting-edge VR simulator for cataract surgery to enable medical residents to better hone their skills before getting near an eye. Their team maintains that VR simulation training improves performance, minimizes risk and provides greater detail around potential scenarios more efficiently than expensive physical simulators.  

To learn more about Unreal Dev Grants and to apply, visit: http://unrealengine.com/unrealdevgrants
 

Creating Realistic Digital Humans Using UE4 Livestream Recap

The graphics industry is continually striving to achieve photorealistic visuals. While we’ve largely been able to capture realistic environments, creating believable faces is orders of magnitude more challenging.

That’s because if a digital human looks even slightly off, it can fall into what is known as the “uncanny valley,” making the face look eerie. That is why many developers opt to create highly stylized faces, which circumvents the issue but doesn’t solve it. 

At Epic, we are tackling this difficult problem head-on. To do so, we had to gain a deeper understanding of human anatomy. We then built new tools that allowed us to create believable digital humans the likes of which the world had never seen in real time. Our Virtual Mike, Siren, and Andy Serkis demos exemplify our major graphical breakthroughs. 

In a recent livestream, Senior Character Artist Adam Skutt demonstrated how we created Virtual Mike, which is available for you to download and experiment with through the Epic Games Launcher. 
 

Improvements to the way we render skin represent one of the largest leaps we’ve made in advancing real-time digital humans. In the livestream, Skutt walks you through how we took a facial scan of researcher and writer Mike Seymour, crafted skin that realistically captures the roughness and oils of the human face, and utilized two normal maps to drive detailed facial animations. Highlighting the updated Subsurface Profile shading model, which allows skin in Unreal Engine to interact with light more believably, Skutt also shares how to solve the fake-looking “CG grey” shadow effect that typically plagues digital humans by leveraging a Screen Space Irradiance post-process material. 
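
To make the two-normal-map idea concrete, here is a generic “whiteout”-style blend of a base normal with an animation-driven wrinkle normal; this is an illustrative sketch, not the exact material setup shown in the livestream:

```cpp
// Minimal sketch: combining a base normal map with a second,
// animation-driven wrinkle/detail normal ("whiteout"-style blend).
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

Vec3 Normalize(Vec3 v) {
    float len = std::sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    return { v[0] / len, v[1] / len, v[2] / len };
}

// Both inputs are tangent-space normals already unpacked to [-1, 1].
// Sum the XY perturbations, multiply Z, then renormalize.
Vec3 BlendNormals(const Vec3& base, const Vec3& detail) {
    return Normalize({ base[0] + detail[0],
                       base[1] + detail[1],
                       base[2] * detail[2] });
}
```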

Advanced hair is created using a mixture of individual splines coupled with innovative techniques for blending follicles into the scalp. Finally, Skutt delves into how we create extremely convincing eyes. 

The culmination of all these new rendering tools is a digital face so realistic that, during the livestream, even Skutt had a hard time distinguishing the in-engine render from the real-life reference photo. 

For an in-depth look at how we were able to create lifelike digital humans, make sure to watch the embedded livestream above. For additional information, dig deeper with our technical documentation and stay tuned for more content on the subject. 

If you would like to experiment with our Digital Human engine feature sample yourself, make sure to download Unreal Engine 4.20 for free today and check it out through the Learn tab within the Epic Games Launcher.