NVIDIA, Intel Sponsor the Unreal E3 Awards 2018

It’s hard to believe, but the 2018 Electronic Entertainment Expo is nearly upon us.  As the worldwide Unreal Engine development community gets set to showcase its offerings on the gaming industry’s biggest stage June 12-14, we’re extremely excited to partner with NVIDIA and Intel on the Unreal E3 Awards 2018.

Similar to years past, we’re partnering with NVIDIA to ensure that each winning team will receive an NVIDIA GeForce GTX 1080 Ti graphics card. 

“Our partnership with Epic on the Unreal E3 Awards aims to recognize and reward developers from around the world with our very best hardware,” said Fredrik Liljegren, Director, Systems Software, NVIDIA. “Whether it’s through this initiative with Epic, our own Indie Spotlight program or the growing GeForce community, we’re always excited to showcase Unreal Engine developers and their projects.”

In addition, we’ve joined forces with Intel to provide each category winner with a premium gaming processor, the Intel® Core™ i7-8700K Processor, brought to you by Intel’s Game Dev Program.

From Biggest Buzz to Unreal Underdog, this year’s award categories aim to highlight a wide variety of teams doing amazing things at E3. After evaluating the UE-powered games at the show, we’ll be announcing the nominees on Thursday, June 14 and revealing the winners one week later on Thursday, June 21.

Below is a breakdown of the five categories for this year’s Unreal E3 Awards:

Eye Candy
This award is given to the most visually impressive Unreal Engine game at E3 2018 and rewards the use of leading-edge graphics that push the medium of interactive entertainment forward.

Most Addictive
This award is given to the experience that we simply can’t put down. Nominees will make players forget about their surroundings and lead to fun-induced sleep deprivation.

Best Original Game
This award is given to the project with huge potential as an all-new IP. Nominees will spark interest not only through gameplay, but through original characters, worlds and the potential that is put on full display during E3 2018.

Biggest Buzz
This award is given to the project that creates the most talked about moment of E3 2018. From a major game reveal to an undeniably impressive demo or a major twist that flips the industry on its ear, this award goes to the Unreal Engine team or project that produces the most buzz.

Unreal Underdog
This award is given to the team that pushes the limits to achieve an amazing showing for their game or experience at E3. Focusing on not just the product, but the people and process behind it, this award acknowledges a team’s perseverance to make a big splash at the big show.

If your Unreal-powered project will be on display in or around the LA Convention Center during E3 2018, please put yourself on our radar by emailing e32018@unrealengine.com and letting us know about your presence at the show. 

Of course, we understand (and respect) the fact that some of you might be saving your surprises, so we will certainly keep all confidential details confidential.

Be sure to check back during the week of E3 to hear about all of the nominees and see our Unreal Engine developer interviews straight from the showroom floor! 

We look forward to seeing you at E3 2018!

Futuristic Cold War Virtual Reality Comes Alive in Vertical Robot’s Red Matter

It’s easy to see where Vertical Robot drew the inspiration for their upcoming virtual reality title, Red Matter. For most people reading this, the Cold War doesn’t hit too close to home. If anything, it was something their grandparents might have talked about – if they ever mentioned it at all. That period of great political tension between the US and its allies and the former Soviet Union serves as the jumping-off point for this futuristic sci-fi setting on the distant Saturnian moon of Rhea.

A medium that is in its relative infancy for home use, virtual reality has seen steady improvement over the past few years, with no small part played by Unreal Engine 4. Red Matter, which is releasing on May 24, 2018, stands out as a highly detailed entry into the VR space that’s firmly backed up by strong puzzle-solving mechanics and a compelling narrative. As the second VR offering from Madrid-based Vertical Robot, a team of industry veterans with credits on titles such as the Castlevania: Lords of Shadow saga, Spec Ops: The Line and Deadlight, among many others, it’s clear that they’re intent on crafting quality experiences for players.

There’s no doubt that VR continues its rise in the consumer space and with that in mind, we can safely assume that we have a lot more to look forward to from enthusiastic developers like Vertical Robot. To learn more about the project and the team’s vision for VR, we took some time to chat with Vertical Robot’s Design Director Tatiana Delgado about the creation of Red Matter and how Unreal Engine 4 aided in creating their best game yet. 
 

Vertical Robot is a relatively new studio made up of a team of very experienced developers. What brought you all together to go on this adventure of creating games as an indie studio? 
 
We’ve all worked together at other studios in the past – in fact, we even overlapped at several of them – so we happen to know each other really well and we know we function well together as a team. We’ve been in the industry for quite some time and we really felt like we wanted to, on the one hand, create something we could call our own and, on the other, take on new challenges. The arrival of Oculus was the sign we’d all been waiting for and we plunged head-first into the exciting adventure that is VR development, committed to creating quality games that add value to VR while pushing the platform’s artistic, technical and creative boundaries. 
 
We are a very small studio, but our collective years of experience enable us to be rather straightforward in our decision-making and get right to work. Just to give you an idea, Red Matter was made with a team of just eight people. 

Red Matter is the second game under your belt and also the second VR title you’ve created. What drives Vertical Robot’s passion for developing virtual reality experiences? Are you aiming to be a VR only studio? 
 
It’s a fascinating adventure to be able to create content for a new medium. Being able to partake in the initial stages of new tech also grants you a clear advantage moving forward, once the tech has become more established. By then we’ll have fought many battles and learned from our experiences. Our intention, for now at least, is to focus on VR development. 

It’s safe to say that the shift from the Samsung Gear VR for your first game, Daedalus, to the Oculus Rift for Red Matter has given you a lot more power to work with. How did your experience creating Daedalus benefit you in taking on a project with a much bigger scope? 
 
It’s funny actually because in reality, Red Matter was our first project. However, because it had a more ambitious scope, financial support was a necessity. As we searched for this support, we didn’t want development to grind to a halt. And so, from what started as a locomotion prototype for Red Matter, Daedalus was born. Ultimately, this would allow us to experience a full production cycle of a VR game and gain valuable market insight for the medium. We created it in Unreal Engine as well, and this allowed us to test the engine’s VR tools and potential. It’s been very educational and we’re extremely satisfied seeing how well-received the game was by players and critics alike. 

Red Matter is focused on a dystopian Cold War and has a distinct sci-fi setting but how many parallels can be drawn to the actual US and Russian Cold War that seems to have inspired it? Did you draw on any events in particular? 
 
There is some basis in the real Cold War, but we really wanted to distance ourselves from actual history and invent a dystopian fantasy with two completely make-believe factions. This gave us the freedom to create characters and plots that will surprise players. 
 
Just like in games such as Papers, Please, we leverage historical and visual references to establish a familiar setting for players, but from there on out, we set up an alternative fictional universe. Volgravia is a fabrication completely removed from the Soviet Union, although it does share aesthetic and ideological discourse elements with it. You could venture that Volgravia is to the USSR what Game of Thrones is to Medieval Europe. 
 
In fact, concerning Volgravia, we even created a new language as well. Since one of the game’s core mechanics requires the use of a scanning device to decipher text, we didn’t want to spoil the fun for Russian players. The language might appear reminiscent of Russian or Cyrillic but in reality, it has nothing to do with them. 
 

How did the idea for Red Matter come about? There aren’t many games that take on something like the Cold War. 

We wanted to make a sci-fi game, but were well aware that it’s a genre that’s been done to death, so being an indie studio we wanted something that would set us apart. Inspiration came from posters for the Soviet space race. There was a special, unique appeal to them, and we’re big fans of stories from the Cold War. We figured we could find our niche there. 
 
In the whole scheme of things, virtual reality is very much in its infancy and Unreal Engine has been around far longer. How intuitive did you find Unreal Engine 4 in adapting to VR over more traditional 3D or 2D type development? 

In general terms, it’s been very intuitive. In the past we’ve had the opportunity and/or necessity to work with proprietary engines on other projects, so we’re used to developing our own tools based on our particular needs. Using Unreal Engine 4 we often found that the way things are implemented coincides exactly with how we would have done so ourselves, or pretty close, and that’s made things very intuitive. The implementation of VR libraries and how the engine exposes their functionality was no exception. 

Is there any particular tool of Unreal Engine 4 that was particularly beneficial to you in development of Red Matter? What was it and why? 

Some of our team members have been using Unreal Engine for 10 years now. Over the years Epic has added more and more excellent tools to Unreal’s arsenal. The new Proxy LOD System present in Unreal 4.19, although still experimental, has proven to be extremely helpful in the optimization of backgrounds and reducing overall draw call counts in order to run Red Matter efficiently on lower spec hardware. Likewise, the polygon reduction tools and the lightmap packer built into the static mesh viewer allowed our artists to save time by reducing the amount of tedious, repetitive work.
 
One of the biggest challenges of developing for virtual reality is traversal whether it be actual movement (which can make many people queasy) or teleportation style. Is this a challenge you encountered in the development of Red Matter and how did you tackle it? 
 
Locomotion has indeed posed a challenge, as expected. We decided to offer players as broad a range of options as possible within the technical capabilities of such a small team. We’re lucky enough to have a variety of preferences within our own team, ranging from people who are highly resilient to motion sickness to others who become dizzy almost immediately, so we understand the importance of offering as many options as possible. In Red Matter there are currently three types of locomotion available: teleportation, for those who easily become dizzy; dashing, in order to integrate the movement into the game world; and finally the jetpack, which allows you to perform predefined jumps in an arc while manually controlling acceleration and braking. We are also working on a smooth locomotion system that will not be available for launch but will be ready in an upcoming update.

Another major challenge for VR is intuitive controls. Where so many games make the controls difficult to work with you’ve made mention of how simple your control scheme is for users. Tell us a little bit more about that. 
 
Presence, understood as the perception of being physically ‘present’ in a non-physical world, is extremely important in Virtual Reality. For Red Matter, we wanted to improve players’ presence by making the control scheme intuitive, as well as visually representing the Oculus Touch controllers in the game. The player is physically aware of holding the controllers in their hands in the real world, so we decided to turn them into actual tools you utilize in order to interact with the virtual world. You simply use the stick to switch between tools and then pull the trigger to operate it; it’s that simple. Both elements also exist in the virtual world, integrating seamlessly into the game’s fiction. For instance, one of these tools is a claw that allows the player to grab objects, turn dials, and pass objects from one hand to another in a very natural way. We even playtested with older people who had never played a video game before and they took to the controls without any problems. 
 
Thanks for your time! Where can people go to find out more about Vertical Robot and Red Matter? 
 
You can visit our website at www.redmattergame.com, follow our dev blog, or find us on social networks – Twitter: @Vertical_Robot, Facebook: @verticalrobotgames, Instagram: verticalrobot.

Development Branches Now Available on GitHub

We are now mirroring our internal development branches to GitHub! Most of the day-to-day development on the engine happens in these branches, organized by teams or features. We periodically merge these branches into master after testing has been conducted by our QA team, and until now this has been reflected on GitHub as a monolithic commit. You can now see live development work on new features, as it happens, and get full merge history for individual files.

All development streams are named with a “dev-” prefix (dev-editor, dev-sequencer, and so on). A few branches have already gone live, and we’ll be rolling out to more branches over the next few weeks. Bear in mind that these branches reflect work in progress; they may be buggy or not even compile, though hopefully not for long! 

Some work requires that we respect non-disclosure agreements with third parties, so certain branches may take longer to appear as we shore up the security required to respect those agreements. Our goal is to provide as much visibility to what we’re working on as we can, but bear with us as we work through the kinks.

Interested in pulling from these branches? Not set up on GitHub? Get started here!
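Once your GitHub account is linked and you have access to the repository, working with a mirrored branch is a standard Git workflow. A minimal sketch (the branch name `dev-sequencer` and the file path below are illustrative placeholders, not guaranteed branch or file names):

```shell
# Clone only the development branch you care about to keep the
# download small. (Branch name is an example; any mirrored dev-*
# branch works the same way.)
git clone --single-branch --branch dev-sequencer \
    https://github.com/EpicGames/UnrealEngine.git
cd UnrealEngine

# List every mirrored development branch on the remote.
git ls-remote --heads origin 'dev-*'

# With branches mirrored individually instead of squashed into one
# monolithic commit, full per-file merge history is available;
# --follow tracks the file across renames. (Path is illustrative.)
git log --oneline --follow -- path/to/file.cpp
```

Keep in mind the caveat above: these branches are work in progress and may not compile on any given day, so pin a known-good commit before building.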

Creating an Amazing Product Configurator in Unreal Engine Using Blueprints

One of the great things about using Unreal Engine for enterprise is that our Blueprint visual scripting system enables you to craft stunning real-time design visualizations and invaluable tools. It puts creative production power into the hands of artists and designers, which can free up people in more technical team roles to focus on the heavy lifting.

Product configurators are a popular use for real-time technology across many industries, as they allow brands to customize their product experience to suit a client’s needs in the moment. Unreal Engine’s Blueprint visual scripting tools are perfect for building an intuitive product configurator to wow your potential customers and clients.

We’ll show you just how powerful Blueprints can be for designing interactive visual experiences in our upcoming webinar on May 17th, Creating an Amazing Product Configurator Using Unreal Engine, which you can register for right here.

Designing detailed configurators in Unreal Engine gives you the flexibility to let clients and customers tailor their product experience to suit their tastes.

Join design visualization expert Paul Kind as he takes an in-depth look at creating an amazing product configurator in Unreal Engine using Blueprints. In this webinar, he will walk you step-by-step through the intricacies of using Blueprints to control geometry, materials, a HUD (Heads Up Display) that automatically updates with different configurations, and much more. After the webinar presentation, you’ll also be able to download the project and try it for yourself!

You will learn:

  • How to use Blueprints to create an engaging product configurator
  • The best design approaches to inform and inspire potential clients
  • How accessible visual scripting with Blueprint can be for non-coders
  • Ways to take your design presentations to the next level

Don’t miss out on this free online webinar! Register today!

Theia Interactive’s Harley Davidson AR Experience Showcases the Potential of Real-time

As more creative teams are tapping into the potential of Virtual Reality and Augmented Reality, these emerging mediums are rapidly becoming the go-to choice for companies looking to sell clients and investors on major design projects. 3D real-time visualizations immerse stakeholders in fully interactive virtual designs that make it easier to explore the potential of a project before it exists, which is why more studios like Theia Interactive are carving out a strong niche by delivering thrilling new ways to experience design.

Using Unreal Engine as its primary engine for design, Theia Interactive specializes in crafting real-time VR and AR visualizations to help business customers in architecture, manufacturing, and product design tell compelling stories about their brand. Recently, the team created a stunning photorealistic Harley Davidson app experience for Autodesk University that let attendees walk around and interact with a virtual life-sized motorcycle using AR.

Viewed through an iPad, the Harley Davidson AR demo let participants swap between a vintage, rusted-out “barn find” and a fully restored version of the bike superimposed over the real-world space. Taking the experience a step further, users could use sliders to create an animated exploded view of the bike to zoom in and look at all of its internal parts in full 3D. “The impact the experience had on people attending the demo was powerful,” says Stephen Phillips, CTO of Theia Interactive.
  

The vintage Harley Davidson model for the AR experience in its original “barn find” state shows lots of rust and gritty detail.

  

Fully restored, this retro ride shines bright as the centerpiece for the interactive AR experience.

“That was really exciting, and what we’ve seen again and again with our other augmented reality projects is that people get super stoked about just using a simple mobile device and understanding there’s a virtual object somewhere in the room,” he says. “It works really well.”

Photorealistic AR in Unreal Engine

For detailed real-time AR projects, managing the scale and complexity of photorealistic design assets while maintaining visual fidelity can be challenging when you’re aiming to deliver the experience on a mobile device. In the Harley Davidson AR demo, the bike models provided by Amir Glinik were meticulously crafted and visually amazing, notes Phillips. That the team was able to get everything running in real time at that level of complexity is a testament to the software and tools that made it possible.

“We were blown away by Unreal’s auto LOD system that’s in the static mesh editor,” he adds. “LODs were critical for this project, because the user decides what they want to look at on the bike. If they want to get up close to the interior screws of the engine, they have the freedom to do that, so the quality needs to be there. But for the overall app to run fast at all, you can’t really have that detail there all the time.”

Theia’s Harley Davidson AR experience let viewers explore the minute details of the vintage bike up close and personal from all angles.

By generating LODs in batches and assigning assets to a default LOD group, the team was able to retain an extremely high level of visual fidelity without putting as much strain on the system. “We started forcing everything to be an LOD and not even use the original mesh, and that’s how we got it so that on-screen at any given moment it was always a million polygons or less without a drop in quality,” says Phillips.

Unreal Studio and Datasmith have also become important additions to Theia Interactive’s Unreal Engine pipeline across recent projects. The team used to spend days prepping asset data for initial import, but using Datasmith has accelerated that process substantially, he says.

“The speed boost from Datasmith is great. We no longer have to manually export a bunch of individual meshes and materials, and that has helped a lot,” he says. “We love it, and it definitely speeds up projects. We have one project with tens of thousands of objects, and the entire scene was like 500 million polygons. To process all of those asset locations that were coming originally from Revit and to try to do that manually would have been ridiculous, but Datasmith helped us get it from Revit to Unreal in minutes.”

AR’s potential in the design space

The raw enthusiasm that augmented reality generates makes working with the medium very satisfying, says Phillips, but it’s also having a big, positive impact on business overall, too. Real-time technology is beginning to heavily drive customer demand, he notes, as everyone has been stuck for so long depending on still renderings or computer models on-screen that don’t translate to the same level of immersion as a full 3D real-time experience. This is the same shift to “experiencing design” that has led Unreal Engine to become the number one real-time visualization tool in Architecture.

“It’s a wonderful tool for generating excitement,” he says. “When we have an interesting project to show off, getting a person at a business excited about the technology and getting them super engaged with looking at the object being created or the space being created—that’s really powerful. We’re getting great feedback from people just being more excited to see and experience this stuff rather than the renders they’re so used to.”

One of the other great things about working in Unreal Engine, says Phillips, is that it’s easy to produce an interactive and exciting AR project while also generating tons of free collateral at the same time, including 4K renders, 60 fps flythroughs, and virtual reality experiences. This makes life a lot easier and provides tremendous flexibility for creating a range of high-quality deliverables for clients without creating a ton of extra work in the process.

Getting your design data into Unreal Engine is now faster and easier than ever with Unreal Studio (which includes Datasmith). Try the free Unreal Studio beta today!