How Animech Designed a Custom Volkswagen VR Experience That Sells

When it comes to designing a memorable interaction that resonates with customers and sticks in their minds long after the experience is over, few modern tools are as effective and powerful as virtual reality. For automotive manufacturers, VR offers the perfect mix of total immersion and unparalleled customizability that only real-time technology can deliver. The exciting new potential it unlocks is spurring major automotive brands to adopt a fresh approach to how they engage customers through real-time experiences.

Based in Uppsala, Sweden, Animech specializes in using Unreal Engine to create visualization tools that help clients elevate their brands. The engineering and design visualization firm recently caught the eye of Volkswagen Sweden, which commissioned the team to build a groundbreaking VR car configurator experience to help showcase its new Arteon model.

Using an HTC Vive and Animech’s VR car configurator, customers can virtually explore the interior and exterior of the Arteon with full photorealistic customization.

Designing to sell

Animech first started working in Unreal Engine in 2015, when the studio was tasked with visualizing the tallest skyscraper in Northern Europe for a client’s VR Google Cardboard application. Using Unreal let the team deliver a next-level interactive visual experience, says Animech business director Staffan Hagberg. Showcasing the high-fidelity results from that project during a VR event at the Animech office had a big impact on Volkswagen Sweden representatives who attended, spurring them to tap into the potential of VR as a tool for selling cars in the future.

Volkswagen Sweden needed a way to show the new Arteon to potential customers, but at the time had no models available in its dealerships to entice consumers to pre-order before the car’s official launch. The many possible Arteon configurations added another challenge to the mix. In response, Animech designed a custom-tailored real-time VR configurator that let consumers get up close and personal with the Arteon in ways that simply weren’t possible before physical demo models arrived.

The Unreal Engine-powered VR experience offers unparalleled photorealism and immersion for customers eager to explore the possibilities of customization.

The unique experience let consumers don VR goggles in order to get an immersive, photorealistic look at the Arteon’s interior and exterior from every angle in a fully customizable and interactive way. Swapping features, colors, and other important visual details in real time let users get a feel for how their custom spec options would look. Additionally, the experience tied into the central business systems, which allowed dealers to easily produce a price quote based on a user’s customizations. It proved a powerfully effective tool for driving pre-orders.

“Visual quality is very important for Volkswagen, and that’s why I recommended and pushed for Unreal Engine,” says Aidin Abedi, technical manager at Animech. “The biggest benefit is that you get everything out of the box, giving the artists the ability to create the perfect scenes, the perfect lighting, the perfect materials without having to go through developers and tweak it to perfection.”

Having access to hands-on support when it was needed was also critical for overcoming obstacles and bringing the design vision to life during development. “When we called the Unreal Enterprise team and asked for support, they had a guy in Sweden and they sent him to our office and he came by and helped us and sat with the developers,” says Hagberg. “Feeling that the Enterprise team had our back, that was tremendously important for us.”

“VW Sweden trusted in us and they let us do our thing. When we showed them the first version we really blew their mind, because they did not expect it to look as good as it did,” he adds.

The difference between the virtual model in Unreal Engine and the real thing is nearly imperceptible, giving users a hyper-realistic VR experience even when a physical model isn’t available.

VR the new norm for auto dealers?

Animech’s VR experience built with Unreal Engine has become a standard offering at dealer showrooms for Volkswagen Sweden, significantly changing the landscape of customer experience. While it’s still a recent addition to dealerships, this could soon become the industry-wide norm, given how effective it is.

“Almost everybody starts to laugh if it’s the first time they try a VR experience,” says John Anderson, private car sales associate at Volkswagen. “They don’t really know what’s real and what’s not because it looks like you’re inside a dealership, except they are looking at the car exactly as they do in here. They get really happy.”

Watch the video above to see Animech’s VR auto configurator experience in action! 

Ready to level-up your design visualizations in Unreal Engine? Try the new Unreal Studio beta today!

Technology Sneak Peek: Real-Time Ray Tracing with Unreal Engine

At GDC 2018, we got a glimpse into an exciting new technology for Unreal Engine: ray tracing combined with real-time rendering. In a joint announcement with NVIDIA and ILMxLAB, Epic unveiled this unprecedented step towards ever-better quality in real-time rendering.

In offline renderers, ray tracing has long been used to achieve photoreal results, particularly in scenes with a lot of reflections and with complex translucent materials such as thick glass. Now, leveraging the efforts of several technology companies, support for Microsoft’s DXR API and ray-traced area light shadows will be available in the main branch of Unreal Engine 4 by the end of 2018. Epic expects to release binary support for these features in Unreal Engine 4.22, with more features to follow as they become production-ready.

What’s the big deal?

Renderers tend to use one of two methods for determining pixel color: rasterization or ray tracing. Rasterization starts with a particular pixel and asks, “What color should this pixel be?” Ray tracing works from the viewing angle and light sources and asks, “What is the light doing?”

Ray tracing works by tracing the path of a light ray as it bounces around a scene. Each time a ray bounces, it mimics real-life light by picking up color from the objects it has struck while losing intensity. This accumulation of color produces the sharp reflections and subtle, realistic color variations that, for certain types of materials and effects, can only be achieved with ray tracing.
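
To make the bounce-and-accumulate idea concrete, here is a minimal, self-contained C++ sketch of a recursive trace function. The single hard-coded sphere, point light, and helper types are illustrative assumptions for this sketch only; this is not Unreal Engine or DXR code.

```cpp
// Illustrative only: a single sphere lit by one point light, traced recursively.
#include <cmath>
#include <cstdio>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
    Vec3 operator*(Vec3 o) const { return {x * o.x, y * o.y, z * o.z}; } // tint
};
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 Normalize(Vec3 v) { float l = std::sqrt(Dot(v, v)); return v * (1.0f / l); }

struct Sphere { Vec3 center; float radius; Vec3 color; float reflectivity; };

// Hard-coded scene: one red, semi-reflective sphere and one point light.
static const Sphere kSphere   = {{0, 0, 5}, 1.0f, {0.9f, 0.2f, 0.2f}, 0.5f};
static const Vec3   kLightPos = {5, 5, 0};

// Distance along the ray to the sphere, or a negative value on a miss.
static float Intersect(Vec3 origin, Vec3 dir)
{
    Vec3 oc = origin - kSphere.center;
    float b = Dot(oc, dir);
    float c = Dot(oc, oc) - kSphere.radius * kSphere.radius;
    float disc = b * b - c;
    if (disc < 0) return -1.0f;
    return -b - std::sqrt(disc);
}

// Trace a ray: each bounce deposits the color of the surface it struck
// and loses intensity, exactly as described above.
static Vec3 Trace(Vec3 origin, Vec3 dir, int depth)
{
    if (depth <= 0) return {0, 0, 0};             // stop after a few bounces

    float t = Intersect(origin, dir);
    if (t <= 0) return {0.2f, 0.2f, 0.3f};        // ray escaped: sky color

    Vec3 hitPoint = origin + dir * t;
    Vec3 normal   = Normalize(hitPoint - kSphere.center);

    // Direct lighting: a simple diffuse term toward the point light.
    Vec3 toLight  = Normalize(kLightPos - hitPoint);
    float diffuse = std::fmax(0.0f, Dot(normal, toLight));
    Vec3 color    = kSphere.color * diffuse;

    // Reflected bounce: its contribution is tinted by this surface's color
    // and attenuated by the surface's reflectivity.
    Vec3 reflectedDir = dir - normal * (2.0f * Dot(dir, normal));
    Vec3 bounce = Trace(hitPoint + normal * 0.001f, reflectedDir, depth - 1);
    color = color + bounce * kSphere.color * kSphere.reflectivity;

    return color;
}

int main()
{
    // Shoot a single camera ray through the center of the image.
    Vec3 result = Trace({0, 0, 0}, {0, 0, 1}, /*depth=*/3);
    std::printf("traced color: %.3f %.3f %.3f\n", result.x, result.y, result.z);
    return 0;
}
```

A production ray tracer adds many more ray types, materials, and light sources, but the recursive bounce structure is the same.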
 

 

Because it mimics light’s real behavior, ray tracing also excels at producing area shadows and ambient occlusion. 

Conversely, rasterization is ray-tracing’s faster, cheaper cousin, calculating lighting with a number of approximations. Unreal Engine’s rasterization delivers 4K resolution frames in a matter of milliseconds on a fast GPU. Rasterization can closely approximate photorealism without necessarily being physically precise, and the tradeoff on speed has been well worth it for most Unreal Engine customers.

However, for live-action film and architectural visualization projects where photorealism is valued over performance, ray tracing delivers an extra level of fidelity that is difficult, if not impossible, to achieve with rasterization. Until now, ray tracing was confined to offline rendering due to its intense computational demands: a single ray-traced frame can take many minutes to hours to render, and twenty-four frames are needed to fill just one second of film animation.

“Traditionally, ray tracing has been used in a number of ways in production pipelines,” says Juan Canada, Senior Rendering Programmer at Epic Games. “Even if a production plans to use raster rendering to achieve real time at high resolutions, they might render stills and animations with ray tracing to use as reference, to give artists a goal or comparison for raster, even for final pixels in some cases.

“If ray tracing can be done in real time, these ‘ground-truth’ reference renderings can be made much faster. This means there can be more of them, or several iterations to get things just right, instead of a handful of images that took hours or days to render.”

Canada adds that having access to ray tracing “makes the whole production pipeline easier to control, because you can get the final result more easily.”

The technology

Real-time ray tracing (previously an oxymoron) has been achieved by marshaling massive computational power from the latest GPUs and machine learning techniques, and by marrying them with rasterization wherever ray tracing isn’t needed. This combination of hardware and software simply hasn’t been available until now, and it is currently possible only because of the joint efforts of several committed companies.

Developers for Windows are familiar with Microsoft’s DirectX API, which has long been utilized by media-heavy Windows applications, including game engines, to optimize the display of graphics. Microsoft recently developed the new DXR (DirectX Raytracing) framework, an API working through the current DirectX 12.

Concurrently, NVIDIA announced NVIDIA RTX, a ray-tracing technology that runs on NVIDIA Volta architecture GPUs. NVIDIA partnered with Microsoft to enable full RTX support via the Microsoft DXR API.
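
As a hedged illustration of what “working through the current DirectX 12” means in practice, the sketch below shows how a Windows application can query whether its GPU and driver expose DXR. This is generic Direct3D 12 code (it requires a Windows 10 SDK that includes the DXR headers), not Unreal Engine source.

```cpp
// Minimal sketch: checking for DirectX Raytracing (DXR) support via DirectX 12.
// Build on Windows with a Windows 10 SDK (1809 or later) that ships the DXR types.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a DirectX 12 device on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No DirectX 12 capable device found.\n");
        return 1;
    }

    // DXR support is reported through the D3D12_OPTIONS5 feature data.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        std::printf("DXR is supported (raytracing tier %d).\n",
                    static_cast<int>(options5.RaytracingTier));
    }
    else
    {
        std::printf("DXR is not supported on this device/driver.\n");
    }
    return 0;
}
```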

Working with both Microsoft DXR and NVIDIA RTX technologies, Epic utilized this pipeline to create the Reflections real-time cinematic.

The demo

For the announcement at GDC 2018, Epic set out to create a demo that would showcase the full glory of real-time ray tracing. Epic collaborated with NVIDIA’s ray tracing experts along with ILMxLAB, Lucasfilm’s immersive entertainment division.

To make the demo, the team started with high-quality reference footage from ILMxLAB and combined forces to create the necessary models. Artists carefully recreated every detail, from tiny scratches on the stormtrooper armor to the weave pattern of Phasma’s cloak. Materials were built to respond naturally to light.

Sets were constructed and shots assembled directly in Unreal Engine. Epic used its own virtual production techniques to shoot the performance while the actor/director performed in virtual reality.

Most of this technology already exists within Unreal Engine. But now with ray tracing, new effects can really breathe life into the scene. The dynamic and accurate reflections on Captain Phasma’s armor, a curved, mirror-like surface, were created with ray tracing.

The Reflections demo runs at 1080p resolution at 24 frames per second in Unreal Engine. The calculations for reflections, ambient occlusion, and the shadows from the textured area lights are ray traced in real time. With ray tracing, challenging effects such as soft shadows and glossy reflections come naturally, greatly improving the realism of the scenes.

The team at Epic collaborated with NVIDIA’s engineers to utilize a number of techniques to achieve realism with ray tracing including denoising and reduction of detail in reflections. A deep-dive talk on these techniques, challenges, and trade-offs was given at GDC 2018. 
 

 

The future

For the initial release, Unreal Engine’s renderer will address processes that are the most challenging to simulate with raster rendering but are straightforward with ray tracing. 

“We’re going to start integrating some of the features shown at GDC, like using ray-tracing for accurate area shadows and a new cinematic post-process for depth of field,” says Canada, “and we’ll add new components that allow us to render other effects such as refraction, and go forward from there.” 

What does this mean for the future of filmmaking? Ray tracing technology has been used in movies and architectural visualization for many years, and has become the standard for generating photoreal images. Those images can take several hours to render with an offline renderer, but with ray tracing implemented in Unreal Engine, they can now be rendered in real time.

It means that directors, artists, and designers will be able to iterate faster, and try new lighting scenarios and camera angles. Ultimately, it will bring down the cost of production and make it possible to produce more interactive content.

The Reflections GDC demo ran on a DGX Station equipped with four Volta GPUs using NVIDIA RTX technology. For now, you will need similar horsepower to leverage real-time ray tracing. In the future, we can rely on Moore’s law to bring down the cost and complexity of how this technology is packaged and delivered. DXR performance is resolution-dependent, just like offline rendering; however, because Unreal Engine is still performing its own rendering, performance will also be very scene-dependent.

While the Reflections demo runs on pretty heavy-duty hardware, Epic sees the demo as a first step rather than the end game. Currently, NVIDIA offers the DXR-enabled hardware used for the real-time ray tracing demo, but AMD has announced that they will make DXR drivers available at some point, too. 

“We’re going to keep working on it to make it more lightweight and accessible,” says Sebastien Miglio, Director, Enterprise Engineering at Epic. “It’s early days, but within a short period of time we expect that ray tracing will run on more common types of computers found in any studio.”

“The Reflections effort demonstrates Epic’s commitment to leading the future, not only of games, but wherever high-fidelity immersive experiences are required,” said Miglio. “Unreal Engine set the bar long ago for fidelity and performance. Now, with DXR, we’re raising that bar even higher. Customers working with Unreal Engine have the assurance that the road they travel with us leads to the future in the most direct path possible.”

Get the future now

Ray tracing in Unreal Engine might still be a few months off, but you can take advantage of real-time raster rendering right now. To help you get started, Epic has introduced Unreal Studio, a suite of tools for importing CAD data into Unreal Engine. 

Unreal Studio includes unlimited use of Unreal Engine, plugins for exporting from most major CAD and modeling programs, and extensive learning materials on how to prepare, export, and import your scenes.

Ready to try out Unreal Engine, but need help getting started? Sign up for the free Unreal Studio beta today and get access to all the export plugins, video learning materials, and more!

NVIDIA, Intel Sponsor the Unreal E3 Awards 2018

It’s hard to believe, but the 2018 Electronic Entertainment Expo is nearly upon us.  As the worldwide Unreal Engine development community gets set to showcase its offerings on the gaming industry’s biggest stage June 12-14, we’re extremely excited to partner with NVIDIA and Intel on the Unreal E3 Awards 2018.

Similar to years past, we’re partnering with NVIDIA to ensure that each winning team will receive an NVIDIA GeForce GTX 1080 Ti graphics card. 

“Our partnership with Epic on the Unreal E3 Awards aims to recognize and reward developers from around the world with our very best hardware,” said Fredrik Liljegren, Director, Systems Software, NVIDIA. “Whether it’s through this initiative with Epic, our own Indie Spotlight program or the growing GeForce community, we’re always excited to showcase Unreal Engine developers and their projects.”

In addition, we’ve joined forces with Intel to provide each category winner with a premium gaming processor, the Intel® Core™ i7-8700K Processor, brought to you by Intel’s Game Dev Program.

From Biggest Buzz to Unreal Underdog, this year’s award categories aim to highlight a wide variety of teams doing amazing things at E3. After evaluating the UE-powered games at the show, we’ll be announcing the nominees on Thursday, June 14 and revealing the winners one week later on Thursday, June 21.

Below is a breakdown of the five categories for this year’s Unreal E3 Awards:

Eye Candy
This award is given to the most visually impressive Unreal Engine game at E3 2018 and rewards the use of leading-edge graphics that push the medium of interactive entertainment forward.

Most Addictive
This award is given to the experience that we simply can’t put down. Nominees will make players forget about their surroundings and lead to fun-induced sleep deprivation.

Best Original Game
This award is given to the project with huge potential as an all-new IP. Nominees will spark interest not only through gameplay, but through original characters, worlds and the potential that is put on full display during E3 2018.

Biggest Buzz
This award is given to the project that creates the most talked about moment of E3 2018. From a major game reveal to an undeniably impressive demo or a major twist that flips the industry on its ear, this award goes to the Unreal Engine team or project that produces the most buzz.

Unreal Underdog
This award is given to the team that pushes the limits to achieve an amazing showing for their game or experience at E3. Focusing on not just the product, but the people and process behind it, this award acknowledges a team’s perseverance to make a big splash at the big show.

If your Unreal-powered project will be on display in or around the LA Convention Center during E3 2018, please put yourself on our radar by emailing e32018@unrealengine.com and letting us know about your presence at the show. 

Of course, we understand (and respect) the fact that some of you might be saving your surprises, so we will certainly keep all confidential details confidential.

Be sure to check back during the week of E3 to hear about all of the nominees and see our Unreal Engine developer interviews straight from the showroom floor! 

We look forward to seeing you at E3 2018!

Futuristic Cold War Virtual Reality Comes Alive in Vertical Robot’s Red Matter

It’s easy to see where Vertical Robot drew its inspiration for the upcoming virtual reality title Red Matter. For most people reading this, the Cold War doesn’t hit too close to home; if anything, it was something their grandparents might have talked about, if they ever mentioned it at all. That period of great political tension between the US and its allies and the former Soviet Union serves as the jumping-off point for the game’s futuristic sci-fi setting on Rhea, a distant moon of Saturn.

A medium still in its relative infancy for home use, virtual reality has seen steady improvement over the past few years, with no small part played by Unreal Engine 4. Red Matter, releasing on May 24, 2018, stands out as a highly detailed entry into the VR space, firmly backed by strong puzzle-solving mechanics and a compelling narrative. It is the second VR offering from Madrid-based Vertical Robot, a team of industry veterans with credits on titles such as the Castlevania: Lords of Shadow saga, Spec Ops: The Line, and Deadlight, and it’s clear they are intent on crafting quality experiences for players.

There’s no doubt that VR continues its rise in the consumer space and with that in mind, we can safely assume that we have a lot more to look forward to from enthusiastic developers like Vertical Robot. To learn more about the project and the team’s vision for VR, we took some time to chat with Vertical Robot’s Design Director Tatiana Delgado about the creation of Red Matter and how Unreal Engine 4 aided in creating their best game yet. 
 

Vertical Robot is a relatively new studio made up of a team of very experienced developers. What brought you all together to go on this adventure of creating games as an indie studio? 
 
We’ve all worked together at other studios in the past; in fact, we even overlapped at several of them, so we happen to know each other really well and we know we function well together as a team. We’ve been in the industry for quite some time and we really felt like we wanted to, on the one hand, create something we could call our own and, on the other, face up to new challenges. The arrival of Oculus was the sign we’d all been waiting for and we plunged head-first into the exciting adventure that is VR development, committed to creating quality games that add value to VR while pushing the platform’s artistic, technical and creative boundaries. 
 
We are a very small studio, but our collective years of experience enable us to be rather straightforward in our decision-making and get right to work. Just to give you an idea, Red Matter was made with a team of just eight people. 

Red Matter is the second game under your belt and also the second VR title you’ve created. What drives Vertical Robot’s passion for developing virtual reality experiences? Are you aiming to be a VR only studio? 
 
It’s a fascinating adventure to be able to create content for a new medium. Being able to partake in the initial stages of new tech also grants you a clear advantage moving forward, once the tech has become more established. By then we’ll have fought many battles and learned from our experiences. Our intention, for now at least, is to focus on VR development. 

It’s safe to say that the shift from Samsung VR for your first game, Daedalus, to the Oculus Rift for Red Matter has given you a lot more power to work with. How did your experience creating Daedalus benefit you in taking on a project with a much bigger scope? 
 
It’s funny actually because in reality, Red Matter was our first project. However, because it had a more ambitious scope, financial support was a necessity. As we searched for this support, we didn’t want development to grind to a halt. And so, from what started as a locomotion prototype for Red Matter, Daedalus was born. Ultimately, this would allow us to experience a full production cycle of a VR game and gain valuable market insight for the medium. We created it in Unreal Engine as well, and this allowed us to test the engine’s VR tools and potential. It’s been very educational and we’re extremely satisfied seeing how well-received the game was by players and critics alike. 

Red Matter is focused on a dystopian Cold War and has a distinct sci-fi setting but how many parallels can be drawn to the actual US and Russian Cold War that seems to have inspired it? Did you draw on any events in particular? 
 
There is some basis in the real Cold War, but we really wanted to distance ourselves from actual history and invent a dystopian fantasy with two completely make-believe factions. This would allow us to manipulate and create characters and plots that will surprise players. 
 
Just like in games such as Papers, Please, we leverage historical and visual references to establish a familiar setting for players, but from there on out, we set up an alternative fictional universe. Volgravia is a fabrication completely removed from the Soviet Union, although it does share aesthetic and ideological discourse elements with it. You could venture that Volgravia is to the USSR what Game of Thrones is to Medieval Europe. 
 
In fact, concerning Volgravia, we even created a new language as well. Since one of the game’s core mechanics requires the use of a scanning device to decipher text, we didn’t want to spoil the fun for Russian players. The language might appear reminiscent of Russian or Cyrillic but in reality, it has nothing to do with them. 
 

How did the idea for Red Matter come about? There aren’t many games that take on something like the Cold War. 

We wanted to make a sci-fi game, but were well aware that it’s a genre that’s been done to death, so being an indie studio we wanted something that would set us apart. Inspiration came from Soviet space race posters. There was a special, unique appeal to them, and we’re big fans of stories from the Cold War. We figured we could find our niche there. 
 
In the whole scheme of things, virtual reality is very much in its infancy and Unreal Engine has been around far longer. How intuitive did you find Unreal Engine 4 in adapting to VR over more traditional 3D or 2D type development? 

In general terms, it’s been very intuitive. In the past we’ve had the opportunity and/or necessity to work with proprietary engines on other projects, so we’re used to developing our own tools based on our particular needs. Using Unreal Engine 4 we often found that the way things are implemented coincides exactly with how we would have done so ourselves, or pretty close, and that’s made things very intuitive. The implementation of VR libraries and how the engine exposes their functionality was no exception. 

Is there any particular tool of Unreal Engine 4 that was particularly beneficial to you in development of Red Matter? What was it and why? 

Some of our team members have been using Unreal Engine for 10 years now. Over the years Epic has added more and more excellent tools to Unreal’s arsenal. The new Proxy LOD System present in Unreal 4.19, although still experimental, has proven to be extremely helpful in the optimization of backgrounds and reducing overall draw call counts in order to run Red Matter efficiently on lower spec hardware. Likewise, the polygon reduction tools and the lightmap packer built into the static mesh viewer allowed our artists to save time by reducing the amount of tedious, repetitive work.
 
One of the biggest challenges of developing for virtual reality is traversal whether it be actual movement (which can make many people queasy) or teleportation style. Is this a challenge you encountered in the development of Red Matter and how did you tackle it? 
 
Locomotion has indeed posed a challenge, as expected. We decided to offer players as broad a range of options as possible within the technical capabilities of such a small team. We’re lucky enough to have a variety of preferences within our own team, ranging from people who are highly resilient to motion sickness to others who become dizzy almost immediately, so we understand the importance of offering as many options as possible. In Red Matter there are currently three types of locomotion available: teleportation, for those who easily become dizzy; dashing, in order to integrate the movement into the game world; and finally the jetpack, which allows you to perform predefined jumps in an arc while manually controlling acceleration and braking. We are also working on a smooth locomotion system that will not be available at launch but will be ready in an upcoming update.
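
As a purely hypothetical illustration of the teleport-style option described above (not Vertical Robot’s actual code), a trace-and-move step in Unreal Engine 4 C++ might look something like the fragment below. The character class, its header, and the MaxTeleportDistance property are assumed names invented for this sketch.

```cpp
// Hypothetical sketch of teleport locomotion in an assumed ACharacter subclass.
#include "RedMatterStyleCharacter.h"   // hypothetical header for this sketch
#include "Components/CapsuleComponent.h"
#include "Engine/World.h"

void ARedMatterStyleCharacter::TryTeleport()
{
    // Trace forward along the player's view to find a landing spot.
    const FVector Start = GetActorLocation();
    const FVector End = Start + GetControlRotation().Vector() * MaxTeleportDistance;

    FCollisionQueryParams Params;
    Params.AddIgnoredActor(this);

    FHitResult Hit;
    if (GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
    {
        // Land on top of the hit surface, offset by the capsule half height
        // so the player does not clip into the floor.
        const float HalfHeight = GetCapsuleComponent()->GetScaledCapsuleHalfHeight();
        SetActorLocation(Hit.ImpactPoint + FVector(0.f, 0.f, HalfHeight));
    }
}
```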

Another major challenge for VR is intuitive controls. Where so many games make the controls difficult to work with, you’ve mentioned how simple your control scheme is for users. Tell us a little bit more about that. 
 
Presence, understood as the perception of being physically ‘present’ in a non-physical world, is extremely important in Virtual Reality. For Red Matter, we wanted to improve players’ presence by making the control scheme intuitive, as well as visually representing the Oculus Touch controllers in the game. The player is physically aware of holding the controllers in their hands in the real world, so we decided to turn them into actual tools you utilize in order to interact with the virtual world. You simply use the stick to switch between tools and then pull the trigger to operate it; it’s that simple. Both elements also exist in the virtual world, integrating seamlessly into the game’s fiction. For instance, one of these tools is a claw that allows the player to grab objects, turn dials, and pass objects from one hand to another in a very natural way. We even playtested with older people who had never played a video game before and they took to the controls without any problems. 
 
Thanks for your time! Where can people go to find out more about Vertical Robot and Red Matter? 
 
You can visit our website at www.redmattergame.com, follow our dev blog, or find us on social networks: Twitter: @Vertical_Robot, Facebook: @verticalrobotgames, Instagram: verticalrobot.

Development Branches Now Available on GitHub

We are now mirroring our internal development branches to GitHub! Most of the day-to-day development on the engine happens in these branches, organized by teams or features. We periodically merge these branches into master after testing has been conducted by our QA team, and until now this has been reflected on GitHub as a monolithic commit. You can now see live development work on new features, as it happens, and get full merge history for individual files.

All development streams are named with a “dev-” prefix (dev-editor, dev-sequencer, and so on). A few branches have already gone live, and we’ll be rolling out to more branches over the next few weeks. Bear in mind that these branches reflect work in progress; they may be buggy or not even compile, though hopefully not for long! 

Some work requires that we respect non-disclosure agreements with third parties, so certain branches may take longer to appear as we shore up the security required to respect those agreements. Our goal is to provide as much visibility to what we’re working on as we can, but bear with us as we work through the kinks.

Interested in pulling from these branches? Not set up on GitHub? Get started here!

Creating an Amazing Product Configurator in Unreal Engine Using Blueprints

One of the great things about using Unreal Engine for enterprise is that our Blueprint visual scripting system enables you to craft stunning real-time design visualizations and invaluable tools. It puts creative production power into the hands of artists and designers, which can free up people in more technical team roles to focus on the heavy lifting.

Product configurators are a popular use for real-time technology across many industries, as they allow brands to customize their product experience to suit a client’s needs in the moment. Unreal Engine’s Blueprint visual scripting tools are perfect for building an intuitive product configurator to wow your potential customers and clients.

We’ll show you just how powerful Blueprints can be for designing interactive visual experiences in our upcoming webinar on May 17th, Creating an Amazing Product Configurator Using Unreal Engine, which you can register for right here.

Designing detailed configurators in Unreal Engine gives you the flexibility to let clients and customers tailor their product experience to suit their tastes.

Join design visualization expert Paul Kind as he takes an in-depth look at creating an amazing product configurator in Unreal Engine using Blueprints. In this webinar, he will walk you step-by-step through the intricacies of using Blueprints to control geometry, materials, a HUD (Heads Up Display) that automatically updates with different configurations, and much more. After the webinar presentation, you’ll also be able to download the project and try it for yourself!
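
As a rough taste of the kind of logic the webinar covers (not the actual webinar project), here is a hypothetical Unreal Engine 4 C++ actor that swaps materials on a product mesh. In practice, the same flow can be wired up entirely in Blueprints, which is exactly what the session demonstrates; the class and property names below are assumptions for this sketch.

```cpp
// Hypothetical configurator actor: designers assign material options in the
// editor, and UI (Blueprints or UMG) calls SelectOption to apply a choice.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInterface.h"
#include "ConfigurableProduct.generated.h"

UCLASS()
class AConfigurableProduct : public AActor
{
    GENERATED_BODY()

public:
    // The visible product mesh whose material the user configures.
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* ProductMesh;

    // Material options exposed to designers in the editor (e.g. paint colors).
    UPROPERTY(EditAnywhere, Category = "Configurator")
    TArray<UMaterialInterface*> MaterialOptions;

    AConfigurableProduct()
    {
        ProductMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("ProductMesh"));
        RootComponent = ProductMesh;
    }

    // Callable from Blueprints or UI buttons: apply the chosen option.
    UFUNCTION(BlueprintCallable, Category = "Configurator")
    void SelectOption(int32 OptionIndex)
    {
        if (MaterialOptions.IsValidIndex(OptionIndex))
        {
            ProductMesh->SetMaterial(0, MaterialOptions[OptionIndex]);
        }
    }
};
```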

You will learn:

  • How to use Blueprints to create an engaging product configurator
  • The best design approaches to inform and inspire potential clients
  • How accessible visual scripting with Blueprint can be for non-coders
  • Ways to take your design presentations to the next level

Don’t miss out on this free online webinar! Register today!