Unreal Powers On-the-Fly Compositing with Handheld Camera

At NAB 2018 in Las Vegas, AMD partnered with ARwall to showcase a new use for real-time rendering in virtual production: an augmented reality wall that responds to changes in camera perspective, thus eliminating the need for compositing green screen footage with rendered CG in post-production. The ARwall Effects (ARFX) System, running on AMD hardware, was on display for all NAB participants to demo.

With the ARFX System, real-time images, computed with Unreal Engine, are displayed on a screen large enough to serve as a scene’s backdrop, and the images update in real time in response to the camera’s movement. When live actors stand in front of the wall, the result is an instantly “composited” shot — a real-time 3D virtual set extension.
 

AMD showcases ARwall technology at NAB 2018

“This is akin to what you’d do with a green screen, but now there’s no green screen,” says Frank Vitz, Director of the Immersive Technology Team at AMD.  “The camera is free to move around. You see your composite happening right there in the camera.”
 
With the ARwall system, the artifacts that can affect a green screen shoot are no longer a concern. “You don’t have green spill. You don’t have problems with hair,” says Vitz.
 
The ARFX System is an evolution of rear projection techniques, where a filmmaker projects a previously recorded film of the environment on a screen behind an actor during live filming. However, such techniques can only support a viewing angle perpendicular to the screen—any other angles immediately ruin the effect.
 
With the ARFX System, the camera can move and shoot at any angle, roaming anywhere in the virtual set, even handheld, for a fully immersive filmmaking experience. The perspective of near and far elements in the CG environment responds as expected in real time, with none of the limitations of a 2D projection. In addition, actors can see and react to the environment, helping to enhance their performances.
A sample use case: the practical set is a walkable platform placed in front of a blank video wall. Projection on the video wall provides virtual set extensions (railings, staircases) calibrated to line up with the practical set; when the camera moves, the perspective of those extensions changes so they always remain “attached” to the practical set at the appropriate angle. The rest of the environment is added to the projection, providing a full-screen backdrop that updates in real time, even with a moving or handheld camera. Actors on the practical set can then be shot live against the video wall for an in-camera composite, with background angles updating as the camera moves.

“It’s kind of the Holy Grail of virtual production to be able to make a seamless composite on the fly,” says Vitz. “This is amazing new technology that adds another tool to the toolkit for virtual production.”

To generate real-time updates to the background, ARFX uses Unreal Engine in conjunction with a set of trackers attached to the camera. The system can also employ AMD’s Advanced Media Framework (AMF) and IRT sensor to provide lighting information to Unreal Engine. A Zcam 360 camera collects lighting information which Unreal Engine uses to light the virtual set extension to match the practical set and respond to any lighting changes in real time. The system shown at NAB is powered by the AMD Ryzen Threadripper and two Radeon Pro WX 9100 GPUs. 
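ARwall has not published its implementation, but the core idea is straightforward: feed the physical camera’s tracked pose into the engine’s virtual camera every frame so the rendered backdrop always matches the real camera’s viewpoint. The sketch below shows that loop in Unreal Engine C++; the GetTrackedCameraPose() call is a hypothetical stand-in for whatever tracker SDK supplies position and rotation, and none of this reflects ARwall’s actual code.

```cpp
// Minimal sketch: drive Unreal's virtual camera from an external camera tracker.
// GetTrackedCameraPose() is a hypothetical placeholder for the tracker SDK.
// Requires the CinematicCamera module for ACineCameraActor.
#include "CineCameraActor.h"
#include "GameFramework/Actor.h"
#include "TrackedCameraDriver.generated.h"

UCLASS()
class ATrackedCameraDriver : public AActor
{
    GENERATED_BODY()

public:
    ATrackedCameraDriver()
    {
        PrimaryActorTick.bCanEverTick = true; // update the pose every frame
    }

    // The CineCamera that renders the backdrop shown on the video wall.
    UPROPERTY(EditAnywhere, Category = "ARFX")
    ACineCameraActor* VirtualCamera = nullptr;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        FVector TrackedLocation;
        FRotator TrackedRotation;
        if (VirtualCamera && GetTrackedCameraPose(TrackedLocation, TrackedRotation))
        {
            // Mirror the physical camera's movement so the wall's perspective
            // stays "attached" to the practical set.
            VirtualCamera->SetActorLocationAndRotation(TrackedLocation, TrackedRotation);
        }
    }

private:
    // Placeholder for the tracker hardware; replace with the vendor SDK call.
    bool GetTrackedCameraPose(FVector& OutLocation, FRotator& OutRotation) const
    {
        OutLocation = FVector::ZeroVector;
        OutRotation = FRotator::ZeroRotator;
        return false; // no tracker connected in this sketch
    }
};
```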
 
“Filmmakers are the most demanding creators when it comes to realism and fidelity,” says Rene Amador, CEO of ARwall. “We knew that in order to make ARwall successful, we would need an engine that could handle whatever we threw at it, as well as the rigor and demands of a production environment. Unreal Engine has met these needs and more. Every update that shows up is like Christmas morning for the team.”

Want to try out Unreal Engine? Join the Unreal Studio beta today and get access to Unreal Engine plus export/import tools, learning videos, and more!

Unreal Engine Drives Monster Puppet for The Mill and Monster.com

When employment website Monster.com needed a new commercial spot, they hired The Mill, an international VFX and creative house with countless commercials to their credit. The spot features a giant, purple, hairy monster who rescues an unhappy employee and carries her, King Kong-style, to a new employment situation. The 1:30 spot, called Opportunity Roars, garnered numerous accolades for The Mill, including a Cannes Lion award for Visual Effects.
 

Award-winning commercial for Monster.com, “Opportunity Roars”

 
But there was little time for The Mill to rest on their laurels. After the success of that spot, Monster.com came back to the production house with a request for more than two dozen 15-second animated spots featuring the purple monster. 

The only problem was the turnaround time—a mere three weeks.

While The Mill used traditional techniques to produce the original commercial, this workflow wasn’t an option for such a short time frame. “What do you do with that?” says Boo Wong, Group Director of Emerging Technology at The Mill. “We realized there was no way to go down the traditional route.”
 

A faster way to animate: real-time motion capture and rendering 

That’s when The Mill came up with a clever solution: they would use a Leap Motion system to control the monster with finger motions, a kind of virtual puppeteering. The motions would drive the rig and generate real-time finished output of the monster, purple fur and all, with Unreal Engine.
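The Mill hasn’t detailed how the finger motions were mapped onto the monster rig, but the general pattern of virtual puppeteering in Unreal Engine is to push tracked joint rotations onto a skeletal mesh every frame. Below is a minimal sketch under that assumption; GetTrackedFingerRotation() stands in for the hand-tracking SDK, and the bone names are purely illustrative.

```cpp
// Sketch of virtual puppeteering: apply hand-tracking joint rotations to a
// character rig each frame. The tracking call and bone names are hypothetical.
#include "Components/PoseableMeshComponent.h"
#include "GameFramework/Actor.h"
#include "MonsterPuppetActor.generated.h"

UCLASS()
class AMonsterPuppetActor : public AActor
{
    GENERATED_BODY()

public:
    AMonsterPuppetActor()
    {
        PrimaryActorTick.bCanEverTick = true;
        PuppetMesh = CreateDefaultSubobject<UPoseableMeshComponent>(TEXT("PuppetMesh"));
        RootComponent = PuppetMesh;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Example mapping: one tracked finger drives the jaw, another drives an arm.
        // A production setup would remap every tracked joint to a rig control.
        PuppetMesh->SetBoneRotationByName(
            TEXT("jaw"), GetTrackedFingerRotation(/*FingerIndex=*/1), EBoneSpaces::ComponentSpace);
        PuppetMesh->SetBoneRotationByName(
            TEXT("arm_l"), GetTrackedFingerRotation(/*FingerIndex=*/0), EBoneSpaces::ComponentSpace);
    }

private:
    UPROPERTY(VisibleAnywhere)
    UPoseableMeshComponent* PuppetMesh = nullptr;

    // Placeholder for the hand-tracking SDK; returns a neutral pose in this sketch.
    FRotator GetTrackedFingerRotation(int32 FingerIndex) const
    {
        return FRotator::ZeroRotator;
    }
};
```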

Using the Leap Motion system to drive the rig  
 

“We pitched it to Monster.com, and they were blown away by the capabilities,” says Jeffrey Dates, Creative Director at The Mill. 

The team quickly put together a system and brought in the agency and clients for a live puppeteering and recording session. “They would give us notes on animation in real time,” says Dates. “As fast as they could say it, we then would make these adjustments and re-perform the next take.”

Performing live takes with finger motions and Unreal Engine
 
For the output, The Mill set up real-time post effects in Unreal Engine that encompassed everything they would typically do to finish a shot. As a result, the animation recorded in real time was ready for use without further processing. “The entire animation pipeline was happening in that span of just a few minutes,” says Dates.
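The exact finishing pass The Mill built isn’t documented, but the mechanism of baking a final look into the engine’s post-processing, so every recorded frame comes out already graded, can be sketched like this. The specific settings and values below are placeholders, not The Mill’s actual grade.

```cpp
// Sketch: bake the "finishing" pass into the camera's post-process settings so
// recorded takes need no further grading. Values are illustrative placeholders.
#include "Camera/CameraComponent.h"

void ApplyFinishingLook(UCameraComponent* Camera)
{
    if (!Camera)
    {
        return;
    }

    FPostProcessSettings& PP = Camera->PostProcessSettings;

    // Soft bloom for fur highlights.
    PP.bOverride_BloomIntensity = true;
    PP.BloomIntensity = 0.4f;

    // Gentle vignette to focus attention on the character.
    PP.bOverride_VignetteIntensity = true;
    PP.VignetteIntensity = 0.35f;

    // Overall grade: slightly desaturated.
    PP.bOverride_ColorSaturation = true;
    PP.ColorSaturation = FVector4(0.95f, 0.95f, 0.95f, 1.0f);

    Camera->PostProcessBlendWeight = 1.0f; // apply the look at full strength
}
```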

“The client walked away with hours of finished quality work, basically final pixels,” says Joji Tsuruga, Real-time Supervisor at The Mill. The monster animation was used in videos for social media such as Touchdown Dance and Meet Your Purple Fuzzy Career Coach on Monster.com’s Facebook page.

The process set a new bar for character animation output. “It’s really unheard of in animation to get multiple takes of a performance,” says Wong. “For an editor to basically walk away with selects was really groundbreaking.”

Exploring the future with real-time rendering

Inspired by the success of the Monster.com project, The Mill sees real-time animation as an important new paradigm. “Integrating game engines into your production workflow is critical,” says Wong. “It’s essential in storytelling today.”

A few of the many monster motions generated in real time with Unreal Engine
 
They also recognize the practical aspect of real-time rendering with Unreal Engine for short turnaround times. “This project answers the question, how can we generate a lot of animation cost-effectively for social media?” says Dates. “I’m not rendering, I’m not watching it render.

“Now, it’s more like experimenting. I want to do it and do it fast and have fun doing it.”

Making your own animation magic

Want to try out real-time rendering for your own projects? Join the Unreal Studio beta today and start creating!
 

Vive Teams up with STEM Innovators at Nanome

This is a guest blog written by Adam Simon and James Jarrell from Nanome, with illustration by Kendra Black. I’m not a scientist, and neither are most of my friends. But why not? Most of them are plenty smart, have successful careers in something else, and many are even interested in STEM fields. What’s going on here? What keeps intelligent,

The post Vive Teams up with STEM Innovators at Nanome appeared first on VIVE Blog.

REWIND Uses Unreal Engine to Bring VR Hacker Hostel to Life

When UK-based company REWIND set out to create a virtual reality experience for HBO’s hit comedy series Silicon Valley, their goal was to make something that brought the fans themselves into the show. The result was Silicon Valley: Inside the Hacker H…

Exploring the Process Behind Polyarc Games’ Moss

With the critically acclaimed Moss becoming available on the HTC VIVE earlier this month, we took some time out to chat with Danny Bulla, the Co-founder and Design Director from Polyarc Games, to talk about the game and the process of bringing their hugely successful title to PC based VR. Firstly, can you tell our audience a little about Moss

The post Exploring the Process Behind Polyarc Games’ Moss appeared first on VIVE Blog.

Epic Games Announces $1 Million in Unreal Dev Grants

Epic Games has announced the recipients of its latest round of Unreal Dev Grants, with 37 teams and creators receiving a total of $1 million in no-strings-attached funding for games, tools, broadcast and beyond. The Unreal Dev Grants program was established in February 2015 as a $5 million fund for promising developers working with Unreal Engine 4; awards range from $5,000 to $50,000 with no restrictions or obligations to Epic Games. This latest round of Unreal Dev Grants underscores the variety of applications for Unreal Engine, with software plugins, VR games, AI-driven educational platforms, and healthcare tools all receiving financial assistance.

 
“The Unreal Dev Grants program was designed to give studios and other developers a boost to bring their promising working prototypes to market, and to give back to the wider Unreal developer community as they use the engine in interesting ways,” said Chance Ivey, Partnership Manager at Epic Games. “This new round is our biggest yet, and we are blown away by the potential of projects like Kara Education, an AI-driven online educational platform for kids with hearing difficulties; VStore, a VR tool for quick and portable early dementia screenings; and Anima, a robust crowd simulation plugin for Unreal that can be leveraged by any game designer or VFX artist working with Unreal. Congratulations to all of these recipients on their exciting work; we can’t wait to see what’s next!”
 
The new round of grant recipients includes:
 

 
Amid Evil by New Blood Interactive – Steam Page
In this 90s-inspired FPS adventure, players take on evil hordes with sacred magical weaponry and spells. Travel through seven distinct episodes to hone your skills against evil and reclaim your world.
 
Anima Plugin for UE4 by AXYZ Design – Website
anima© 3.0 by AXYZ Design is a robust crowd simulation plugin for Unreal Engine that lets architects, designers, and visualization professionals populate Unreal Engine 4 scenes with hundreds of 3D characters in just a few mouse clicks.
 

Atomic Heart by Mundfish – Website
Atomic Heart is a first-person shooter that takes players inside an alternate universe during the high noon of the Soviet Union. Follow a Soviet special agent to unlock thrilling secrets and show the Motherland what you’re made of.
 

Brief Battles by Juicy Cupcake – Website
A hilarious platform fighter, Brief Battles uses underpants as a weapon. Players can choose from a variety of super-powered underpants and different game modes as they compete to see who has the mightiest buns.
 
Project “Caillou” (Working title) by Awaceb – Video, Website
Awaceb is a small indie team of two childhood friends from New Caledonia, a French territory in the Pacific. Project “Caillou” tells touching, poetic stories in an open, physics-based and dynamic 3D world.
 
Desolate Sands by Blacksmith Studios – Website
Desolate Sands is a VR puzzle game set in an ancient pyramid. The objective is to reach the bottom by leveraging nearby movable objects and levers to solve a series of complex puzzles.
 
ELLI by Bandana Kid – Website
In this puzzle platforming game, players travel as ancient time guardian Elli on a journey to recover the stolen sands of time.
 

 
Feri Vincze – Website
Feri Vincze is a freelance 3D artist and videographer using Unreal Engine 4 to render elegant animated films such as “The Chest.”
 

Ghost by Sky Machine Studios – Website
Ghost is a sandbox stealth game set in a Victorian era world, where players seek to uncover the truth about Arthur Artorias’ past. In this game, light is the enemy – stay in the shadows and explore the darkness to survive.
 
Guntastic by Ludicrous Games – Website
Guntastic is an arcade-inspired game involving shooting and mayhem. Battle with up to four local or online players in fast-paced, one-shot one-kill, nonstop matches over continuously changing levels.
 

 
Headsnatchers by Iguanabee – Steam Page
Headsnatchers is a multiplayer party game where players each try to keep their heads on their shoulders while trying to remove everyone else’s. Play across four different game modes in over 25 unique environments, each with different rules and attributes.
 
Hear No Evil by Rockodile Games – Website
Hear No Evil tells the story of humanity’s last remnants, who return to earth to fight for their future. This game is a visually spectacular top-down action shooter inspired by Alienation and Helldivers.
 

 
Hellbound by Saibot Studios – Website
Created by an independent team of developers in Argentina, Hellbound is a demonic first-person shooter inspired by classic 90s games like DOOM and Quake.
 

 
Kara Education by Kara Technologies – Website
Kara is an online education platform for children with hearing difficulties. The platform delivers educational material using an avatar, driven by Kara’s AI and machine learning algorithms and Unreal Engine.
 

 
Kine by Gwen Frey – Steam Page
Kine is a 3D puzzle game about three whimsical machines that aspire to be musicians. Players embark across a theatrically rendered cityscape and solve increasingly difficult 3D puzzles to help the machines form a band and catch their big break.
 

 
Little Devil Inside by Neostream – Website
Little Devil Inside is an engaging 3D action adventure RPG game where players are thrown into a surreal environment with elements that challenge their survival instincts. Explore, adapt, and fight to survive.
 

 
M.A.S.S Builder by Vermillion Digital Co., Ltd. – Facebook
M.A.S.S. Builder pits humans against invading aliens in the fight for the Earth. Players will build, customize and control the ultimate M.A.S.S. (Mechanical Assault Skeleton Suit) in an effort to save the world.
 

 
Midnight Ghost Hunt by Mellowsoft – Website
Midnight Ghost Hunt is a multiplayer hide-and-seek game that pits a team of Ghosts against a team of Ghost Hunters. The hunt begins at the stroke of midnight, and it’s a race against the clock to either stay hidden or uncover the ghosts.
 

 
NanoSpace by Synthetic Systems – Steam Page
Nanospace is a 3D platformer with elements of real-time strategy. Players take control of three “nano-mites” in levels full of riddles, monsters, inventory and more.
 
Neon Giant – Unannounced Title – Website
Neon Giant is a group of game veterans with experience in some of the world’s biggest action game franchises. The studio is hard at work on its first title, set in a brand new cyberpunk world.
 
New Reality Co. – Unannounced Project – Website
New Reality Co is a creative studio by Milica Zec and Winslow Porter, dedicated to synthesizing storytelling, art, and technology into groundbreaking and emotional projects. New Reality are the creators behind the award-winning Giant and Tree VR experiences, both of which are built with Unreal and previously benefitted from Unreal Dev Grants.
 
NotMyCar by NMC Studios – Video, Website
NotMyCar is a white-knuckle, lead-footed massive multiplayer vehicular combat battle royale game. Drop into the battleground and use cool weapons and abilities to fight your way through single-elimination combat and become the ultimate survivor. Customize your ride to make it a beast of a vehicle to take on anyone, anytime.
 

 
Oceanhorn 2 by Cornfox and Bros. – Website
Oceanhorn 2, which was showcased in the Unreal Engine booth at GDC 2018, is the upcoming sequel to the action-adventure mobile game featuring exploration in a colorful world with items, puzzles and battles.
 

 
Origin Zero by Black Amber Digital – Website
Origin Zero is an episode-based, sci-fi animation project lovingly crafted by a small, dedicated team using Unreal Engine 4.
 

 
Paradise Lost by PolyAmorous – Website
Paradise Lost is a non-linear, narrative-driven adventure game with meaningful, kinesthetic interactions that dynamically change both the environment and the story you are experiencing.
 

 
The Path of Calydra by Finalboss – Website
The Path of Calydra is a 3D adventure platformer set in the fantastic world of Calygore. Explore as suburban teenager Matheus, who has been transported to Calygore and must rely on an unusual entity named Calydra to seek out four powerful crystals and return home.
 
Point Cloud Plugin by Phoboz – Forum Post
The Point Cloud Plugin by Phoboz is a free plugin for Unreal Engine 4, developed to help with importing, processing and rendering of point clouds. It is currently in beta for Windows.
 

 
Raji: An Ancient Epic by Nodding Head Games – Website
Raji: An Ancient Epic is an action adventure game set in ancient India. Raji is a young girl chosen by the gods to stand against the demonic invasion of the human realm, saving her younger brother in the process.
 
Rocket Jockey by Burn Ward Games – Website
Rocket Jockey is a team-based game that plays like a cross between Rocket League and Super Smash Bros. Fly on top of modern jet engines with classic car chassis at break-neck speeds.
 

 
Scene Fusion by KinematicSoup Technologies – Website
Scene Fusion for Unreal Engine makes real-time editor collaboration possible. Developers can build all sorts of content together in real time, resulting in significant time savings.
 
Second Order
Independent developer Second Order, creator of Claybook, is being recognized and awarded for contributing numerous fantastic rendering features and optimizations to Unreal Engine 4. This is the team’s second Unreal Dev Grant.
 

 
Session by Crea-ture Studios – Website
Inspired by the golden era skate culture of the late 90s and early 2000s, Session is an upcoming skateboarding game that is all about authenticity, creativity and the freedom of expression that skateboarding provides.
 

 
SMALLAND by EMBU Games – Website
With SMALLAND, the survival genre gets a tinier take that lets you appreciate the little things, or flee in terror of the little things. The slightest breeze can sweep your items or even your house away from you. The simplest rainfall can form puddles the size of lakes in a matter of minutes.
 

 
Solar Warden by Polar Zenith – Website
Solar Warden is a six-degrees-of-freedom space shooter combined with an overarching campaign with real time strategy elements. Jump into your fighter and combat the silicoid menace up close, while you command and dispatch the Solar Warden fleet for reinforcements.
 

 
Someday You’ll Return by CBE Software – Website
Someday You’ll Return is a story-driven psychological horror game about a desperate search for a missing daughter deep in the woods where you swore you’d never return.
 
Twenty Studios – Website
Sweden’s Twenty Studios, together with SuperFly.tv and LeViteZer, is crafting intuitive open source software that brings the power of Unreal Engine 4 to live mixed reality production and video compositing pipelines.
 
VStore by VitaeVR – Website
VStore, which recently entered wide clinical trials, is a fully functioning virtual reality supermarket that offers a fast, accurate, and portable method of screening for early indicators of dementia. Diagnosing dementia at the earliest possible stages is critical because that is when treatment is at its most effective.
 
To learn more about Unreal Dev Grants and to apply for a future award, visit: http://unrealengine.com/unrealdevgrants

Industry Leaders in Design Come Together at Build: Stockholm’18

Real-time technology is transforming the creative world in incredible ways, and it’s amazing to see how more and more designers across many different industries are building new innovations with the power of Unreal Engine.

Our real-time platform cont…

Vive Libraries Program launches in California and Nevada

This is a guest post from Chris Chin, Executive Director, Education VR Content at HTC VIVE. When I was a kid, I spent many an hour visiting the local public library.  It offered a very cool tropical aquarium, the occasional puppet show, and of course access to a treasure trove of books – all for free!  The one thing that resonated

The post Vive Libraries Program launches in California and Nevada appeared first on VIVE Blog.

How Real-Time Tech Is Changing The Auto Design Landscape

Designing and manufacturing the next generation of stunning automobiles is a complex, lengthy process that can take many years to complete. Perfecting a new model design requires exhaustive prototyping, testing, review, and iteration across what has traditionally been an extensive timeline. We’re reaching an exciting crossroads, however, where modern innovation is ushering in big changes for those who build and sell automotive vehicles. 

More and more we’re seeing major car manufacturers turn to real-time technology to speed up, simplify, and enhance their automotive design experience. Efficiency is at the heart of the growing trend towards real-time design practices in the automotive world. Many brands are eschewing expensive and time-consuming traditional design approaches that still rely on iterating with resin models and costly physical design prototypes. Instead, they’re turning to new techniques that incorporate remote collaboration, 3D visualization, virtual reality, and on-the-fly customization powered by real-time engines.

Enhancing the design process through real-time

CGI and rendering have been a big part of design visualization in the automotive industry for many years now, but working with real-time engines instead of relying on static visualizations allows for faster iteration and a broader range of experiences throughout the design process. It’s a more recent shift that’s unlocking a whole new world of potential for improving the craft of auto design.

Using Unreal Engine-powered technologies, innovators in the design space are helping to accelerate key creative and decision making workflows while opening up avenues for car manufacturers to explore entirely new design experiences. Nvidia’s Holodeck, for example, makes it possible for teams located anywhere in the world to come together and collaborate virtually in a photorealistic VR design space. This could revolutionize how designers in global companies work together, letting them create faster and more collaboratively.

BMW recently implemented a mixed reality lab into its automotive design pipeline, which pairs a VR headset with a physical vehicle interior model to create an immersive and tactile 360 real-time experience. The system gives designers a feel for the driver’s experience with a prototype model before the car is even built, allowing for more informed decision making throughout the design process. Following suit, we’re seeing other major automotive brands also utilizing real-time design in different ways to enhance and improve production cycles.

Beyond its ability to bring unique interactive experiences into the design process, real-time technology empowers greater efficiency across the board. Design feedback often comes quickly once a model is under review, but with traditional methods it can still take weeks for changes to be implemented. Designers now have the power to make same-day changes, giving them greater creative flexibility and more opportunities to iterate without the added downtime. What once took weeks can now be done in days or less. This also helps teams identify design issues earlier in the creative cycle, significantly reducing the potential for flaws to carry forward into later stages of production.

Synergy is equally important throughout the automotive world: anywhere manufacturers can make good use of existing design work, they save time and money in the long run. Assets created with real-time technology have the benefit of being easily recycled across other areas of the production and marketing pipeline, letting designers do the heavy lifting once and then propagate their work, whether that’s beauty renders, promotional materials, or customer-facing real-time visualizations.

Creating a new automotive customer experience

What’s exciting is how the potential of real-time tech is rapidly expanding to other areas of the industry. It’s not just about forging a more efficient design process; giving designers greater flexibility to customize their work in real time makes it possible to shape each experience to suit the specific needs of clients, customers, and stakeholders. This is vital for internal review sessions and big presentations with key decision makers, and it’s also a game-changer in the automotive retail space.

The ability to create custom photorealistic configurations of different car models in real time to suit an individual customer’s tastes — right down to package options and color choices — is a powerful tool for helping to sell automobiles. Custom car configurators are becoming increasingly commonplace, and companies like Volkswagen are taking the tech even further. VW Sweden recently hired Animech, a visualization tools design firm, to create an immersive VR configurator experience that lets customers don a VR headset and interact with a fully customizable, photorealistic car in real time without leaving their seat. It’s another great example of the innovation we’re seeing as the industry warms up to the possibilities of real-time design.
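Under the hood, the “change the paint color while the customer watches” part of a configurator typically comes down to swapping a material parameter at runtime. A minimal Unreal Engine sketch under that assumption is below; the “PaintColor” parameter name and the material slot are illustrative, not taken from any shipping configurator.

```cpp
// Minimal sketch of a real-time configurator option: repaint the car body by
// driving a material parameter. "PaintColor" and the material slot are assumptions.
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

void SetBodyPaintColor(UStaticMeshComponent* BodyMesh, const FLinearColor& NewColor)
{
    if (!BodyMesh)
    {
        return;
    }

    // Create (or reuse) a dynamic instance of the body material in slot 0.
    UMaterialInstanceDynamic* BodyMaterial =
        BodyMesh->CreateAndSetMaterialInstanceDynamic(/*ElementIndex=*/0);

    if (BodyMaterial)
    {
        // The parameter name must match a vector parameter in the source material.
        BodyMaterial->SetVectorParameterValue(TEXT("PaintColor"), NewColor);
    }
}
```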

We’re in an exciting age of experience, and it’s only going to grow as the automotive industry continues to evolve and embrace the potential of now.

Ready to accelerate your design workflows with Unreal Engine? Get the Unreal Studio free beta and level-up your real-time visualizations today!

 
