How Evasion Pushes VR Shooters Forward with Innovative Combat and High Production Values

While there are many VR shooters, Evasion, by developer Archiact, incorporates numerous elements that make it stand out. The sci-fi space shooter offers an action-packed campaign with four playable character classes, innovative combat, full-body inverse kinematics, and high production values coupled with online co-op gameplay. Powered by Unreal Engine 4, Evasion does all of this with a level of polish that is rare in a VR game made by an indie studio. We got a chance to interview several members of Archiact to learn more about how they created one of VR’s most compelling shooters.
 

In approaching the design of a VR title, the company thought hard about what makes the medium unique and how to leverage its strengths into something gripping. Lead Game Designer Ian Rooke asserted, “The biggest difference is that in VR, you get to use your body to physically move in your space. You can dodge, duck and use all your reflexes instead of just your thumb dexterity. So developing a shooter in VR means you want to design for this gameplay. The more players get to move, the more immersive it becomes.”

While VR introduces a heightened sense of immersion coupled with new mechanics, Rooke notes that it poses new developmental hurdles: “There are also many challenges to overcome. You are always mindful of frame-rate and camera motion to ensure players don’t get sick, and you want to try to make sure that players’ movements in game match one-to-one with their body movements. If they swing their arm, they expect that to match perfectly in game,” Rooke explains. Failing to do so can make combat feel clunky and break immersion. The lead game designer continues, “This can be tricky in situations where players are dual-wielding two controllers, but in-game, they’re holding a two-handed weapon, or in melee games, when a player slashes a solid object, nothing stops their real arm’s motion, but in-game you’d expect the blade to meet some resistance on impact.” Rooke adds, “So there’s lots of prototyping and trial and error. This is not that different from traditional console development, but it can be a longer process before you’re happy with your mechanics, and you might have to go back to the drawing board more often than you’d prefer.”

Infusing Influence

Combining time-tested gameplay with modern tech, Evasion draws inspiration from arcade classics like Galaga and Space Invaders. “It was the concept of dodging and blocking projectiles in VR that we liked. We didn’t want to simply soak up damage from instant-hit weapons. It’s really fun to navigate a hail of lasers flying your way. So we looked at old-school shooters as well as more modern bullet-hell games for inspiration,” Rooke stated, adding, “This gameplay marries well with high-intensity, fast-paced shooter combat featured in games like Doom and Destiny. The idea is to throw overwhelming odds at you while providing you with over-the-top weapons to fend off the swarms of enemies.”

Players will be able to wield these over-the-top sci-fi weapons as one of four “Vanguard” classes, which are basically elite super soldiers. As Rooke notes, “You’re almost unstoppable as most enemies on their own do not provide a big challenge,” but the adage “strength lies in numbers” certainly applies here with Rooke adding, “there are so many of them and they’re relentless.” 

Block Party

In prototyping the insect-like alien enemies, known as the Optera, Archiact borrowed a page from VR shooter Space Pirate Trainer by having a few flying drones shoot projectiles at players. Rooke adds, “Then we thought it would be fun to not only dodge them, but also block them with a shield.” Thus, the inclusion of a shield became a core defensive mechanic of the game. Rooke continues, “It seemed like a natural thing to try. The loop of dodging, ducking, blocking, and shooting was simple and fun.” Rooke expands on how the gunplay and weaponry evolved from here, “As we polished the mechanic, it became more and more fun. We decided to give the player a few weapon pickups as temporary power-ups. Players could grab weapon cores out of the air, similar to our [current] power cores and health cores, each one providing a more powerful weapon with limited ammo. Once the ammo is expended, your default weapon returns. The weapon power-ups included a spread shot, burst fire, auto fire, laser, chain lightning orbs, and a slow moving nuke. This was our demo — one class with multiple weapon power-ups.” 

Stay Classy

While this prototype started with a single character, after demoing an early build, Archiact found that testers wanted different classes that would fit varying playstyles and archetypes. Rooke explains, “Some people said they wanted to be more of a support or healer class, while others still wanted to destroy everything in front of them. So we took what we liked best about the various weapons and used them as a starting point for the four classes. The spread shot turned into the Warden’s primary fire, while the nuke was nerfed down and used as his grenade launcher. The laser and default blaster inspired the Striker, while the burst fire inspired the Surgeon. And, of course, the chain lightning orbs gave birth to the Engineer class. Each class has a unique way to finish off the enemy with a Tether Lash mechanic, and each also has a unique support buff that’s applied while they’re healing their teammate [online].”

With four distinct character classes to choose from, Archiact had to ensure each of the Vanguard were fun and balanced. Rooke notes, “There’s the DPS (damage-per-second) output of each class to watch, while giving various shield sizes and health values to each class. The Warden has the most health and largest shield, and deals a ton of damage up close, but is less effective at long range. The Striker has fast and precise shots, and can strafe faster than the other classes, but her shield is the smallest, and she has the smallest health pool.”

Regardless of which class players pick, they’ll be confronted with several campaign missions chock-full of enemies to overcome. Developer Archiact homed in on VR’s ability to provide player movement agency as a focal point for gameplay and challenge. “The way to succeed is to fight really hard like you would in a game of paintball. Once you get used to moving and dodging and being mindful of every projectile flying your way, it will click,” Rooke stated, adding, “We made mission one exciting, but not overly challenging. Players can take their time to get used to their weapons and become accustomed to taking advantage of their charge shots and tether-lash mechanics to finish enemies off. Mastering the loop of destroying enemies and pulling in power cores to level up your weapon is key. By mission two, the action starts to get more intense. This mission is like graduation from training. If you can survive this mission, you should be ready for the rest of the campaign.” Rooke adds, “The enemies get progressively harder as the ‘elites’ are introduced in the later missions, and some boss battles add some tough spikes. With only one difficulty mode (at launch) the key is to get good at the game in the first couple of missions. Retrying them a few times is acceptable and expected until you get the hang of it.”

Adding to the immersion of the missions are the game’s destructible environments. Archiact built these on UE4’s integrated Apex destruction system. Archiact Software Engineer Thomas Edmunds noted the benefits of this approach: “[It] not only allowed us to heavily customize how destructibles look, but also to optimize them for different platforms and LODs (levels of detail).” Edmunds added, “This was important because destructibles can be very expensive and we did not want to sacrifice the ‘cool factor’ for performance.”
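
For readers who want a concrete picture, fracturing an Apex destructible in UE4 comes down to a single component call. Below is a minimal sketch, not Archiact’s actual code; the function and tuning values are ours:

```cpp
// Minimal sketch: applying damage to a UE4 Apex destructible at a hit location.
// The damage and impulse values are hypothetical tuning numbers, not Evasion's.
#include "DestructibleComponent.h"

void DamageDestructibleAtHit(UDestructibleComponent* Destructible, const FHitResult& Hit)
{
    if (Destructible == nullptr)
    {
        return;
    }

    const float DamageAmount = 50.f;     // illustrative damage per projectile hit
    const float ImpulseStrength = 500.f; // pushes fractured chunks away from the shot

    // Fractures the mesh around the impact point and applies an outward impulse.
    Destructible->ApplyDamage(DamageAmount, Hit.ImpactPoint,
                              Hit.ImpactNormal * -1.f, ImpulseStrength);
}
```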

Prime Performance

While Evasion features high production values with great animations and detailed backgrounds, the road getting there wasn’t easy considering the indie developer only had five artists. This issue is compounded by the fact that the studio needed to optimize the game to meet VR’s steep performance requirements. Not only do VR games need to be rendered at a high resolution, but they need to run silky smooth, or judder can occur. This can cause motion sickness for certain players. Archiact Senior Modeler Austin Huntley elaborates, “We had to be very diligent about staying on [performance] budget. Running on the PS4 in VR at 60 FPS constant requires you to look closely at every aspect of your game in detail to cut down and minimize performance costs. You have to make trade-offs and find a lot of creative solutions to problems. Transparency is a good example. We created shields with thin faded grids to give the illusion of a transparent energy shield instead of a large plane.”

To meet VR’s steep performance demands, Archiact had to think outside the box. For instance, Evasion features a level with an open outdoor environment full of bullets and enemies, which can create a draw-call nightmare. To overcome this, Huntley explains, “We used a lot of mesh instancing as well as shared atlas materials to reduce the number of both material and mesh draw calls.”
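
Mesh instancing is the key idea in that quote: many copies of one mesh are submitted through a single component, so they cost roughly one draw call per material rather than one per copy. Here is a minimal sketch of the technique in UE4 C++; the rock-field actor and instance counts are illustrative, not from Evasion:

```cpp
// Minimal sketch of UE4 mesh instancing: hundreds of rocks rendered by one
// component, costing roughly one draw call per material instead of one per rock.
#include "GameFramework/Actor.h"
#include "Components/InstancedStaticMeshComponent.h"
#include "RockFieldActor.generated.h"

UCLASS()
class ARockFieldActor : public AActor
{
    GENERATED_BODY()

public:
    ARockFieldActor()
    {
        Rocks = CreateDefaultSubobject<UInstancedStaticMeshComponent>(TEXT("Rocks"));
        RootComponent = Rocks;
        // A static mesh asset would be assigned to Rocks in the editor or here.
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Scatter 500 instances; each AddInstance call adds a transform,
        // not a new draw call.
        for (int32 i = 0; i < 500; ++i)
        {
            const FVector Location(FMath::FRandRange(-5000.f, 5000.f),
                                   FMath::FRandRange(-5000.f, 5000.f), 0.f);
            Rocks->AddInstance(FTransform(Location));
        }
    }

private:
    UPROPERTY(VisibleAnywhere)
    UInstancedStaticMeshComponent* Rocks;
};
```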

Archiact also tied optimization directly into game design. Huntley elaborates, “Early on, we made targets for enemy performance and the cost of any combination of enemies on screen.” By planning ahead in this regard, the senior modeler remarks, “This helped our enemy performance stay consistent and more predictable in any combat situation by limiting how many could be spawned based on this budget.”
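
A budget like the one Huntley describes could be expressed as a simple reservation system. The sketch below is hypothetical; the class and the cost values are ours, not Archiact’s:

```cpp
// Hypothetical sketch of an enemy spawn budget: each enemy type has a measured
// performance cost, and the spawner refuses to exceed the level's budget.
#include "CoreMinimal.h"

class FEnemySpawnBudget
{
public:
    explicit FEnemySpawnBudget(float InMaxBudget) : MaxBudget(InMaxBudget) {}

    // Register the measured cost of each enemy type (illustrative numbers).
    void SetCost(FName EnemyType, float Cost) { Costs.Add(EnemyType, Cost); }

    // Returns true (and reserves budget) only if the enemy fits in the budget.
    bool TryReserve(FName EnemyType)
    {
        const float* Cost = Costs.Find(EnemyType);
        if (Cost == nullptr || UsedBudget + *Cost > MaxBudget)
        {
            return false; // over budget: wait, or pick a cheaper enemy
        }
        UsedBudget += *Cost;
        return true;
    }

    // Called when an enemy dies, freeing its share of the budget.
    void Release(FName EnemyType)
    {
        if (const float* Cost = Costs.Find(EnemyType))
        {
            UsedBudget = FMath::Max(0.f, UsedBudget - *Cost);
        }
    }

private:
    TMap<FName, float> Costs;
    float MaxBudget = 0.f;
    float UsedBudget = 0.f;
};
```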

The game’s visuals and immersion are enhanced due to Evasion’s use of full-body avatars. This is noteworthy considering that, with only three points of contact, many other VR games simply opt to render a virtual head and floating hands. To achieve a believable full body, Archiact leaned on inverse kinematics (IK) by IKINEMA, but Edmunds added that “UE4’s versatile animation Blueprints allowed us to layer and blend locomotion and detail animation, such as trigger pulls with the IK model.” Considering Evasion supports traditional VR motion controllers and dedicated peripherals like PlayStation VR’s Aim Controller, this implementation was particularly helpful, with Edmunds adding, “It also allowed us to support one-handed and two-handed animation sets for our different platforms.”

While maintaining a high, consistent framerate is paramount to mitigating simulation sickness, some players may still feel nauseous when using free movement, an undesirable effect of joystick locomotion putting the eyes out of sync with the inner ear. Thankfully, Evasion offers numerous movement methods, both for those who want standard run-and-gun action and for those who have yet to get their “VR legs.” As Rooke notes, “Everybody is different and there’s no getting around that when it comes to VR. Some people have iron stomachs and some don’t. Instead of declaring that we’re catering to a specific crowd, we thought it would be best to provide robust accessibility options so everyone can feel comfortable and ‘at home’ in our game. More and more people want the authentic experience of running around in VR like they would in a traditional game, so of course we delivered a free movement option.” To ensure that this method was as friendly to stomachs as possible, Archiact employed a few tricks: “The key to making this option comfortable is to keep the camera motion constant and smooth. Strafing and reversing is slower, which is what your brain naturally expects. Most important, this helps prevent nausea,” Rooke stated.
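
The slower-strafing trick Rooke mentions reduces to scaling movement speed by how far the input direction deviates from the view direction. Here is a hypothetical sketch of that idea in UE4 C++; the speed range is an illustrative guess, not Archiact’s tuning:

```cpp
// Hypothetical sketch: scale locomotion speed by how far the move direction
// deviates from where the player is looking, so strafing and reversing are slower.
#include "CoreMinimal.h"

float GetComfortSpeedScale(const FVector& MoveDirection, const FVector& ViewForward)
{
    // 1.0 when moving straight ahead, -1.0 when reversing.
    const float ForwardDot = FVector::DotProduct(
        MoveDirection.GetSafeNormal2D(), ViewForward.GetSafeNormal2D());

    // Map [-1, 1] to an illustrative [0.5, 1.0] speed range: full speed forward,
    // half speed in reverse, with strafing in between.
    return FMath::GetMappedRangeValueClamped(
        FVector2D(-1.f, 1.f), FVector2D(0.5f, 1.f), ForwardDot);
}
```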

For those who can’t handle free motion at all, Archiact implemented an innovative dash-step option. “It works really well as an alternative,” Rooke says, adding, “It’s like little mini jumps forward instead of a gliding camera motion. Between these two options, most people can play the game comfortably.” As a more inventive, immersive option, the developer also incorporated a mode that allows players to jog in place. “It’s similar to free move, but requires an up and down motion from the player’s head as if they’re jogging on the spot.” This mechanic allows the inner ear to more closely align with what the eyes see and Rooke asserts, “This makes it feel like you’re actually running around in the world and further helps to reduce discomfort.” Rooke exclaims, “It’s also a fun way to get exercise.”
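
Jog-in-place locomotion of the kind Rooke describes can be approximated by watching the HMD’s vertical oscillation. The following is a hypothetical sketch of the idea, not Archiact’s implementation; the thresholds are illustrative:

```cpp
// Hypothetical sketch of jog-in-place detection: treat rhythmic up/down HMD
// motion as a "run" input. Thresholds are illustrative tuning values.
#include "CoreMinimal.h"

class FJogInPlaceDetector
{
public:
    // Call once per frame with the HMD's height (Z) in centimeters.
    // Returns a 0..1 locomotion input strength.
    float Update(float HeadHeight, float DeltaTime)
    {
        const float VerticalSpeed =
            FMath::Abs(HeadHeight - LastHeight) / FMath::Max(DeltaTime, KINDA_SMALL_NUMBER);
        LastHeight = HeadHeight;

        // Head bobbing faster than ~20 cm/s counts as jogging; smooth the
        // result so movement doesn't stutter between bob peaks.
        const float Target = (VerticalSpeed > 20.f) ? 1.f : 0.f;
        Smoothed = FMath::FInterpTo(Smoothed, Target, DeltaTime, 4.f);
        return Smoothed;
    }

private:
    float LastHeight = 0.f;
    float Smoothed = 0.f;
};
```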

Making It Unreal

As an engine for virtual-reality production, Edmunds praised UE4, stating, “Unreal Engine 4 is a great choice for VR development, since it provides you with a complete VR framework to work within, while allowing you the freedom to change things to suit your project’s needs.” The software engineer continues, “Each VR platform’s subsystem is nicely contained, and totally open for changes once you hit the inevitable weird ‘edge case’ as your project progresses.”

Edmunds also highlighted how Blueprints, together with the engine’s consistency and extensibility, eased development: “Having all sorts of tools integrated right in the engine makes workflows so much faster. Even the destruction assets and cloth assets have tools in the editor, which was incredibly helpful.”

The studio used Blueprints “extensively,” exclaimed Software Engineer Jake Moffatt. “Many of our systems are highly customizable within a Blueprint’s default values, using UPROPERTIES to surface complex data structures that are easy for designers to use.” The software engineer added, “We also made great use of Blueprints for scripting our missions. We have many custom nodes for stringing together mission-specific events, including many that use the Blueprint Async Action pattern, which we found kept our mission scripts very intuitive to read.”
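
For context, the Blueprint Async Action pattern Moffatt mentions wraps a latent operation in a single graph node whose delegate properties become output exec pins. Below is a minimal sketch of such a node; the objective-wait use case and all names are ours, not Evasion’s:

```cpp
// Minimal sketch of the Blueprint Async Action pattern: a "Wait For Objective"
// node with a latent Completed pin. The use case is illustrative, not Evasion's.
#include "Kismet/BlueprintAsyncActionBase.h"
#include "TimerManager.h"
#include "Engine/World.h"
#include "WaitForObjective.generated.h"

DECLARE_DYNAMIC_MULTICAST_DELEGATE(FObjectiveCompleted);

UCLASS()
class UWaitForObjective : public UBlueprintAsyncActionBase
{
    GENERATED_BODY()

public:
    // Fires when the (stand-in) objective timer elapses; appears as an
    // output exec pin on the Blueprint node.
    UPROPERTY(BlueprintAssignable)
    FObjectiveCompleted Completed;

    // The static factory becomes the async node in the Blueprint graph.
    UFUNCTION(BlueprintCallable,
              meta = (BlueprintInternalUseOnly = "true", WorldContext = "WorldContextObject"))
    static UWaitForObjective* WaitForObjective(UObject* WorldContextObject, float Seconds)
    {
        UWaitForObjective* Action = NewObject<UWaitForObjective>();
        // A shipping version would resolve this via GEngine->GetWorldFromContextObject.
        Action->World = WorldContextObject ? WorldContextObject->GetWorld() : nullptr;
        Action->Delay = Seconds;
        return Action;
    }

    virtual void Activate() override
    {
        if (World == nullptr)
        {
            return;
        }
        // Stand-in for real mission logic: broadcast Completed after a delay.
        FTimerHandle Handle;
        World->GetTimerManager().SetTimer(
            Handle, [this]() { Completed.Broadcast(); }, Delay, false);
    }

private:
    UWorld* World = nullptr;
    float Delay = 0.f;
};
```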

With online co-op being a major feature of the game, Archiact leaned heavily on Unreal Engine 4’s networking features: “Our team made great use of the UE4 Network Profiler tool during development to ensure that we weren’t using excessive amounts of bandwidth,” Moffatt stated.
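
Bandwidth tuning of the kind the Network Profiler drives often comes down to replicating properties conditionally. An illustrative fragment follows; it assumes a character class whose AimPitch and Loadout members are marked UPROPERTY(Replicated), and all the names are hypothetical:

```cpp
// Illustrative example of bandwidth tuning in UE4: replicate a property only
// to the clients that actually need it. The class and properties are hypothetical.
#include "Net/UnrealNetwork.h"

void AVanguardCharacter::GetLifetimeReplicatedProps(
    TArray<FLifetimeProperty>& OutLifetimeProps) const
{
    Super::GetLifetimeReplicatedProps(OutLifetimeProps);

    // The owning client already knows its own aim; skip sending it back.
    DOREPLIFETIME_CONDITION(AVanguardCharacter, AimPitch, COND_SkipOwner);

    // Loadout only changes at spawn, so replicate it just once initially.
    DOREPLIFETIME_CONDITION(AVanguardCharacter, Loadout, COND_InitialOnly);
}
```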

Considering Evasion is available across PlayStation VR, Oculus, and Steam, Edmunds noted how UE4 made the game easier to port, “Unreal Engine 4 nicely abstracts away many of the platform differences. In VR development, however, some of these differences require different gameplay systems that translate to a need for ‘un-abstracting’ certain things. Handling all the different input systems, and each platform’s own requirements for VR, was a significant challenge that was made manageable by Unreal’s subsystem framework.” 

Interested in experiencing Evasion for yourself? The game is currently on sale in celebration of this week’s Steam sale event. It’s also available on the Oculus and PlayStation stores. For more information on the game, check out www.evasionvrgame.com and follow the title on Twitter and Facebook @evasionVR.

If you would like to experiment building your own VR game, download Unreal Engine for free today.

How Survios’ Creed: Rise to Glory Revolutionizes VR Melee Combat

Survios has been one of the most successful pioneers in the VR space. The Los Angeles-based developer has an impressive resume of critically acclaimed VR games such as Raw Data and Sprint Vector. With its most recent release, Creed: Rise to Glory, garnering a ton of praise, the studio has not only created one of the best VR boxing games, but one of the best boxing games period. We recently had the chance to interview several members from the team, and in this post, Survios explains how they were able to solve many hard VR problems while producing a knockout title.

Feeling the Friction

Even though VR can offer unparalleled levels of immersion, clipping through opponents when you punch can be an immersion breaker. It’s a difficult problem to solve; after all, your opponents aren’t really in front of you to provide friction and resistance. This is why many other VR games avoid melee mechanics and instead rely on gunplay and archery for combat.

To overcome this issue, Survios needed to revolutionize melee for VR. Setting the stage, Lead Survios Engineer Eugene Elkin stated, “In our initial prototype, we set out two goals for ourselves: punching had to feel great and getting punched had to be impactful. We decided right away that the game would not be a straight boxing simulator, but a cinematic-inspired boxing experience. Despite a relatively compressed prototyping timeline, we were still able to create multiple gameplay iterations. The result of that investigation stage was the set of technological rules and techniques we dubbed ‘Phantom Melee Technology’.” 

Explaining how the system overcomes melee clipping issues, Elkin elaborated, “At all times, there are essentially two separate player avatars that are contextually synced/desynced. One avatar is the representation of the player’s character and is bound by in-game physics like collision, hit reactions, and knockdowns. The second avatar—codenamed ‘Phantom’—always represents the player’s true position.” This separation is quite ingenious, as it allows players to punch through opponents without ever feeling like they’re awkwardly clipping through them.
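
In code terms, the idea is to keep two poses and decide each frame whether the visible avatar is allowed to follow the tracked one. The following is a hypothetical sketch of that concept, not Survios’ Phantom Melee implementation:

```cpp
// Hypothetical sketch of the two-avatar idea behind Phantom Melee Technology:
// the "phantom" always mirrors tracking, while the visible avatar is allowed
// to lag behind when game physics (a block, a stagger) says it must.
#include "CoreMinimal.h"

struct FPhantomAvatarState
{
    FVector PhantomHand = FVector::ZeroVector; // true tracked hand position
    FVector AvatarHand = FVector::ZeroVector;  // hand position the opponent sees
    bool    bDesynced = false;                 // true while physics overrides tracking

    void Update(const FVector& TrackedHand, float DeltaTime)
    {
        PhantomHand = TrackedHand;

        if (bDesynced)
        {
            // While desynced (e.g. the avatar is staggered), drift the visible
            // hand back toward the real one instead of snapping.
            AvatarHand = FMath::VInterpTo(AvatarHand, PhantomHand, DeltaTime, 3.f);
            if (FVector::Dist(AvatarHand, PhantomHand) < 1.f)
            {
                bDesynced = false; // close enough: re-sync fully
            }
        }
        else
        {
            AvatarHand = PhantomHand; // synced: one-to-one tracking
        }
    }
};
```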

Lead Designer Valerie Allen was inspired to develop this system after reading a sci-fi manga. As Allen explains, “There was a scene in one of the Battle Angel Alita volumes that involved her brain getting overclocked. In that scene, she zipped forward to deliver a punch, only to find herself crashing to the floor because her mental projection of what she was doing was so far ahead of what her body could handle. This is largely how Phantom Melee Technology works.” 

Despite having separate avatars, combat never feels disjointed. Allen explains, “After playing around enough, players quickly start to acclimate, and rather than wasting their real-world effort punching through things, they start to treat the avatar’s arms more like their own, and thus react to the position of their virtual opponent like a real-life one.”

While Phantom Melee Tech solves one major VR issue, Survios still needed to deal with players who might try to “break” the game by constantly flailing their arms about, which is neither fun nor realistic to the sport, but may be effective. To solve this problem, Survios incorporates limited stamina. Allen elaborates on this design philosophy, “Throwing a lot of rapid punches leads to the avatar getting tired, so the player must focus on defending until the avatar’s stamina recovers.” The lead designer added, “The more we tested and tweaked stamina tuning, the more our gameplay started to look and feel like an actual boxing match.” Those with outstanding real-life endurance may balk at the inclusion of a virtual stamina system, but Allen explains, “While the avatar may tire out more quickly than the player does, the player isn’t the one experiencing the debilitating effects of being punched in the face and gut.”
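
A stamina model of that shape is straightforward to sketch. The numbers below are illustrative guesses, not Survios’ tuning:

```cpp
// Hypothetical sketch of a punch-stamina model: each punch drains stamina,
// stamina regenerates while defending, and a tired avatar swings slower.
#include "CoreMinimal.h"

class FBoxerStamina
{
public:
    // Called when the player throws a punch; returns a 0..1 speed multiplier
    // so a tired avatar's punches visibly slow down.
    float OnPunch()
    {
        Stamina = FMath::Max(0.f, Stamina - 0.15f); // illustrative cost per punch
        return FMath::Lerp(0.4f, 1.f, Stamina);     // exhausted punches at 40% speed
    }

    // Called every frame; regeneration is faster when guarding.
    void Tick(float DeltaTime, bool bIsGuarding)
    {
        const float RegenRate = bIsGuarding ? 0.25f : 0.1f; // per second
        Stamina = FMath::Min(1.f, Stamina + RegenRate * DeltaTime);
    }

private:
    float Stamina = 1.f; // 0 = exhausted, 1 = fresh
};
```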

Meaningful Matchups 

This inclusion of limited endurance also made online PvP more enjoyable. Elkin notes, “The stamina system became a very important tool to encourage players to block and defend, strategically deploy their punches wisely, and treat it like an actual boxing match.” Even though Rise to Glory is not the first VR boxing game, it is the first to feature online play. Early on in development, Creed wasn’t going to feature multiplayer, but Survios knew the package wouldn’t feel complete without it. Adding online PvP to a melee-focused VR game while making it feel immersive and fun is extremely difficult. Elkin elaborates, “Unlike traditional fighting games where moves and abilities are predetermined, it’s extremely hard to predict how real-life players will behave in a PvP setting.” This issue is heightened when you consider that Rise to Glory features full-body avatars. On the networking front, Multiplayer Engineer Eva Hopps added, “The biggest challenge we immediately knew we had to deal with was network lag. Since we couldn’t rely on the usual fighting game tricks to mask or compensate for it, we tweaked Phantom Melee’s fatigue-triggered slowing effect as our way of concealing lag from players.” Even though incorporating online play while creating a revolutionary new VR combat system was no small task, Hopps mentioned that, “for the most part, Unreal made this pretty easy for us.”

To ensure that the boxing felt realistic, Survios enlisted the help of professional boxers early on in development. Not only did the team heed their advice, but they signed up for boxing lessons. “To this day, we have boxing coaches come twice a week to our office for lessons, and that experience was invaluable for our designers and engineers in crafting a realistic boxing experience,” Elkin explained, elaborating, “Our marketing team also worked with Buzzfeed to have an Olympic-level boxer, Mikaela Mayer, play the Career mode on the hardest difficulty setting, and she was blown away at how similar the mechanics were to the real sport.”

Hitting Hard

To take the game’s realism to the next level and to reward players who really get into the action, Rise to Glory leverages VR’s accelerometers and motion sensors to track how hard players hit. Allen adds, “We check both the distance and speed of the player’s hand movements, and in some cases the angles as well. We tuned the values to require a reasonable amount of force to throw a punch, but not so much that players feel like they always have to punch as hard and fast as they possibly can for maximum impact.” As a result, Creed ends up being a good workout that takes “shadow boxing” to the next level. Even recovering from a knockdown requires players to exert physical energy to get back into the fight. While many past boxing games would often force players to quickly tap a button to recover, Rise to Glory does something wholly unique that gives players more agency than ever before. Allen explains, “I really liked the concept of the player getting hit so hard that they experience a sort of barely-conscious tunnel vision, and that it would require physical effort to run back to their body.” Taking a page from the studio’s past VR racing game Sprint Vector, once players are knocked down for the count, they get shot out into a dark tunnel and must swing their arms to run back to their bodies. Considering you can hear the ref audibly count to 10 in the background, it provides a tremendous sense of urgency and heightens the experience in a fun way that only VR can deliver.
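
Scoring a punch from motion-controller data reduces to distance and speed checks, roughly as follows. This is a hypothetical sketch; the gates and scaling are illustrative, not Survios’ values:

```cpp
// Hypothetical sketch of motion-based punch scoring: a hit only counts as a
// punch if the hand traveled far and fast enough, and its damage scales with
// speed up to a cap so players aren't forced to swing at maximum effort.
#include "CoreMinimal.h"

float ScorePunch(const FVector& PunchStart,   // hand position at wind-up
                 const FVector& ImpactPoint,  // hand position at contact
                 float ElapsedTime)           // seconds from wind-up to contact
{
    const float Distance = FVector::Dist(PunchStart, ImpactPoint); // cm
    const float Speed = Distance / FMath::Max(ElapsedTime, KINDA_SMALL_NUMBER); // cm/s

    // Illustrative gates: too short or too slow reads as a shove, not a punch.
    if (Distance < 15.f || Speed < 150.f)
    {
        return 0.f;
    }

    // Scale damage with speed, capped so flailing harder stops paying off.
    return FMath::GetMappedRangeValueClamped(
        FVector2D(150.f, 600.f), FVector2D(0.25f, 1.f), Speed);
}
```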

Most VR games opt to render just virtual hands, because it is very hard to figure out where a player’s elbows, torso, and feet are with only three points of contact. To overcome this and render players’ full bodies, Survios wrote a lot of code to create believable inverse kinematics (IK). As Elkin notes, “The body IK system has gone through several iterations here at Survios. The current system is in active development and is shared among most of our projects, but is currently a custom animation solver that determines optimal joint location from three tracked inputs.” The lead engineer added, “Luckily, Unreal makes it pretty simple to create new IK Solvers, which is extremely powerful for us.”

Considering the core Rise to Glory development team consisted of roughly nine developers, Survios has impressively solved many difficult challenges with relatively few resources. Elkin attributes much of the success to the passion of his associates: “I’ve never worked on a more self-motivated team: everybody loved the project from the initial prototype and only wished we could have more time to continue working on it. While it may sound cliché, we truly were making a game that we wanted to play.”

With this being Survios’ third Unreal Engine 4-powered VR game, Elkin also attributes much of the studio’s success to improving upon an already strong UE4 foundation, “It’s extremely important for us to reuse technology between projects. Throughout the years and over the course of several projects, we have continued building out our custom toolset in UE4, and that has significantly sped up our dev cycles on new projects. Having access and the ability to change engine source code is also invaluable.” Hopps added, “UE4 also makes it easy to move from project to product with a consistent set of tools, so nothing on our end feels like it’s changing drastically as the project evolves from prototype to retail-ready.” 

In terms of specific tools, Elkin had high praise for UE4’s debug stomp allocator: “Unreal has great tools for tracking nasty memory stomps and analyzing performance.” As a multiplayer engineer, Hopps praised UE4’s Profiler and noted how “amazingly easy it is to network things in Unreal.”

Continuing the Fight

While Rise to Glory marks Survios’ fourth VR title, Elkin asserts that we’re just at the frontier of VR gaming. “I think that we’re experiencing a time similar to the pioneering days of the ’80s when game developers were exploring, experimenting, and trying basically everything for the first time,” he says, adding, “Right now, developers just don’t know for sure what will work or not in VR; all of the traditional knowledge of game development that has been acquired over decades is just not valid most of the time in VR development. VR is a unique beast and we’re just beginning to scratch its surface, but every day we discover something new, and it’s definitely a very exciting time to be developing in VR.”

Survios recently announced Creed: Rise to Glory’s first content update. Releasing November 27, the free update will feature two new free-play and PvP opponents: Danny “Stuntman” Wheeler and Viktor Drago. Both characters are the two primary antagonists from the upcoming Creed II film. For more information on Creed: Rise to Glory, make sure to visit survios.com/creed.

If you’re interested in creating your own VR game, you can download Unreal Engine for free today.

Floods and Fires: How The Weather Channel Uses Unreal Engine to Keep You Safe

After grabbing headlines with the debut of its immersive mixed reality (MR) content, The Weather Channel (TWC) continues to break new ground with segments on storm surges and wildfires. By plugging behavioral weather data into Unreal Engine, TWC gives viewers an unprecedented true-to-life visual experience of these dangerous weather phenomena. 

In these mixed-reality segments, a presenter stands amid extreme weather situations and explains the changes in the environment as they happen. The result is a visceral sense of what it would be like to actually be there. 
 

“Immersive mixed reality allows us to take any weather story we want and put the on-camera meteorologist into those environments,” says Michael Potts, VP of Design at TWC.

Potts cites Unreal Engine as the lynchpin for these segments. “It allows us to create experiences that are hyper-realistic. Frankly, it’s going to change the way we present the weather.”

Remaking Weather Presentation

The immersive pieces on storm surge and wildfire are the latest in a series that began in June of this year with a mixed-reality experience that had viewers watching a tornado approach TWC’s headquarters. As meteorologist Jim Cantore described the changing dangers of the escalating tornado, a fizzing light pole fell at his feet, followed by a mangled car. Each of these visual events was accompanied by textual graphics to drive home important points about the tornado’s effects.

“We wanted to launch in a very memorable and bold way,” says Mike Chesterfield, Director of Weather Presentation at TWC. “We wanted to make a statement that we were transforming weather presentation.” The reaction was immediate, with TWC’s Twitter feed lighting up with viewer engagement, and thousands of views of the segment on YouTube within a matter of days.
 

TWC creates such content with the idea that the more informed viewers are, the better they can prepare for, and respond to, any danger coming their way. “One of the main goals for us using this technology and storytelling in this way is to keep people safe,” says Chesterfield. “This way allows them to immediately put themselves in that situation.”

Next, the team at TWC turned their attention to showing the effects of storm surge, the flooding that results from storms. The team utilized data on the behavior of water and wind during such an event, and incorporated physics calculations to animate water, trees, and other objects in the scene. “I’m able to actually take mathematical formulas and inform a graphic,” says Potts. “That’s what the Unreal Engine allows us to do.”
 

The next project was a stunning piece on wildfire, showing the speed at which a wildfire can spread and the devastation that follows. While meteorologist Stephanie Abrams explained how wildfires start and spread, flames shot up around her and animals fled from the growing fire. Seeing it happen in real time illustrates the dangers in a way that a two-dimensional map never could.
 

The segments are recorded live-to-tape a few hours before airtime. While the meteorologist stands in a large green-screen room, the physical camera’s position and rotation are synchronized with the viewing angle of the CG background and pop-up textual elements in Unreal Engine. With this workflow, each take is a finished segment with no post-production required.
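
The heart of that workflow is driving the virtual camera from the physical camera’s tracked pose. A hypothetical sketch of that one step in UE4 C++ follows; the tracking source is a stand-in, as real rigs feed this data from hardware:

```cpp
// Hypothetical sketch of the core of a camera-tracked mixed-reality shot:
// each frame, the CG camera is driven by the studio camera's tracked pose so
// the Unreal background stays locked to the green-screen plate.
#include "CineCameraActor.h"

void SyncVirtualCamera(ACineCameraActor* VirtualCamera,
                       const FVector& TrackedLocation,
                       const FRotator& TrackedRotation)
{
    if (VirtualCamera == nullptr)
    {
        return;
    }

    // Apply the physical camera's pose to the virtual one; with matched lens
    // settings, the CG render now shares the presenter's viewing angle.
    VirtualCamera->SetActorLocationAndRotation(TrackedLocation,
                                               TrackedRotation.Quaternion());
}
```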

The Future of Weather Reporting

The success of immersive mixed reality with Unreal Engine has TWC thinking into the future, where real forecast data will drive the environments. 

“I love working with this technology,” says Nick Weinmiller, Creative Director at The Weather Channel. “It’s so awesome to see everything so realistic and to be able to inform people in this way.”

“We’re at the very beginning,” says Chesterfield. “Using this technology will fundamentally change and transform weather presentation at TWC. And our goal is by 2020 to use this technology in 80% of our broadcasts.”

Want to create your own mixed reality experiences to inform and inspire your viewing public? Download the free Unreal Studio beta today, and get Unreal Engine along with import tools and a whole lot more!

Ghost Slinks From the Shadows to the Spotlight Thanks to Unreal Dev Grant

Sky Machine Studios is in the beginning chapters of developing its very first game, Ghost. Jumping into the indie-development scene fueled by their love of the industry they grew up with, this team of six from Sydney, Australia is one of the very grateful recipients of Epic’s Unreal Dev Grants.

In the demanding world of video game development, it can be hard to carve out your piece of the pie, and Sky Machine isn’t taking its good fortune for granted. Extra funding from the Unreal Dev Grant is opening new avenues not only in the game’s development but also in its marketing and promotion.

While the stealth genre has seen some hits and misses over the past few years, there’s no denying that a hungry fan base still exists for what could arguably be considered a grossly underserved audience. Specifically taking on the isometric stealth genre, Lead Project Director Robert Wahby and the rest of Sky Machine Studios aim to deliver an engrossing experience that benefits as much as possible from the one thing that every indie studio needs a little bit of — faith.

We caught up with the team to learn more about the project and their approach to a reimagined stealth genre.
 

Sky Machine Studios is a small team comprised of six people and Ghost is your first title. Tell us what brought the team together and what’s driving you to jump into the often daunting world of indie development.

One of the driving forces in developing Ghost is the opportunity to break into the indie scene. We are aware of the challenges that come with indie development, but nothing is worth pursuing unless there is a bit of trial. After all, that is how you learn. But most of all, we are all avid gamers and want to be part of the culture and industry that we fondly grew up in.

Ghost is a pretty ambitious game for a small team like ours. We are a close-knit team, and proud that we are able to bring Ghost to life without needing an army of programmers and artists.

Ghost is in very early stages of development and not on a lot of people’s radar just yet so please tell us a little bit about the game and its premise.

Ghost is an immersive isometric stealth game, set in the city of Anargal. You’re cast in the role of Arthur Artorias, a man stripped of his past, tortured and forced to escape into isolation. Thought to be dead, you return eight years later, a changed man, seeking answers and pursuing revenge.

In Ghost, you’ll explore a world full of mystery, eccentric characters, and compelling missions. Hide in the shadows, ready your blade and seek your revenge. You must hide, explore, and survive if you wish to last the first night of winter.

Does anyone on the team have prior experience with Unreal Engine 4? If so, how is that existing experience benefitting the team now? If not, how has the team found the learning process of such a robust engine? 

Yes, Lucas, our programmer, has had extensive experience with Unreal Engine. As for the rest of the team, we’ve become accustomed to the engine, and while there’s a bit of a learning curve at first (as expected with any piece of complicated software), it didn’t take too long to get a grasp of the engine.

One thing I must say is the level building and lighting portion of the engine is fantastic and very easy to use. Being able to quickly prototype a level has assisted us in fully fleshing out environments and script events.

The main protagonist in Ghost, Arthur, loses his entire family in an attempt on his own life and comes back eight years later to exact his revenge. What can you tell us of Arthur’s motivation? Is it more than just revenge?  

The ideas and concepts seen here revolve around falling into hell and ascending out of the muck. Arthur’s story is one of great demise and the fighting spirit that some individuals have to rise above their dilemma. It’s a narrative of growth, mystery and yes, it’s also a story of revenge. 

From an archetypal point of view, Arthur is no hero. He’s a custodian of his family’s wealth, accustomed to living an extravagant life. However, in Ghost, Arthur is cast out of his familiar world, everything he deems valuable has been stripped from him, forcing Arthur into a life of destitution, to return with a new sense of courage and conviction. Telling a tale of rebirth.

As Arthur continues his story, he will begin to notice how the world has changed in his absence. A religious militant group called the Greater Heaven has taken over the city with their tyrannical ideology. Arthur will soon discover that things are not what they appear to be.

A few months ago, Sky Machine Studios was one of the recipients of an Unreal Dev Grant. Congratulations on this prestigious honor! When you submitted your application, did you ever expect to win? Did you have any fears about submitting your work in such a manner?

Thank you! It’s pretty insane actually. At the time, we were developing a prototype build, fundamentally teaching ourselves how to work as a team, developing a workflow, figuring out how the various systems should function, etc. We submitted the prototype build in the hopes that it was good enough. You know, in the back of your mind, you’re always wondering if the project stands out. After we submitted Ghost to Epic, we, of course, continued developing and eventually turned the prototype into a much more functional game. We really revamped everything.

All in all, it was a pleasant surprise. We had no clue whether or not we would be selected, and as the months went by, our doubts increased. Then, one day, we received an email informing us that we were one of the recipients. I had to read the email a few times just to comprehend what just happened! It’s not every day you are recognized for something as special as this. 

Now that you have the grant, how much does this mean to the studio? How do you think this award will benefit not only your team but the game itself?

Winning the grant was a clear indication to me and the team that we are heading in the right direction with Ghost. The grant essentially places a spotlight on the project, not to mention providing a healthy boost to motivation.

The grant gives us some breathing room and allows us to implement more elaborate ideas and concepts, such as new 3D assets. For example, we have a pretty cool sequence on a moving train that may not have come to life if it wasn’t for the Dev Grant. The grant has also allowed us to push some development pipelines forward. For instance, we are currently working on character designs and investing in more advanced animations to bring these characters to life.

Along with improving the current state of the game, we plan on using a portion of the grant in getting our name out there via a marketing push. We are aware that the project is not on a lot of people’s radar at the moment, but hopefully, it will be in the coming months.        

If you were to offer words of encouragement or advice to someone thinking about submitting their own project to the Unreal Dev Grant program, what would you say?

Make sure the project has potential. It doesn’t need to be perfect, but the Unreal Engine team and the public need to see that there is something there. If the game looks too rough or it doesn’t stand out, you’re most likely not going to turn heads, especially considering the caliber of projects that are submitted. Just keep going at it and don’t be fearful of delaying the project until that potential is there.

Isometric stealth games have seen some success in recent years with games like Shadow Tactics: Blades of the Shogun and Seven: The Days Long Gone. How do you feel Ghost stands out against its peers in the genre? Were there any other games that gave you particular inspiration?

With Ghost, we are trying to take the classic staples of the genre, such as hiding in the shadows and keeping your feet light so as not to make a sound, and ground the experience in an isometric perspective: a true stealth experience. I believe this perspective has become somewhat popular in recent years because the stealth genre has primarily been played from a first-person or third-person viewpoint. It’s a reimagining of the genre.

Ghost takes plenty of inspiration from the titans of the stealth genre, primarily from the Thief and Splinter Cell series as they are the bedrock of stealth gaming. Besides the interplay between exposing yourself and not being seen, there is a great sense of open exploration these games offer.

Ghost, in much the same way, is a sandbox experience with the spirit of exploration at its core. Ghost is all about options and is a stealth experience built from the ground up to take advantage of the isometric perspective. This open-ended design shows in our emphasis on seamless verticality and the systems we have implemented to achieve it. This is particularly visible in multistory buildings. As the player ascends each floor, the entire level is animated up into existence (floor, props, walls, etc.) in the blink of an eye. This lets the player explore alternative avenues and grants access to paths such as a second-story balcony or windows, no matter the elevation and with nothing blocking the camera. The entire system, in my opinion, works quite well.
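
A floor-reveal system like the one described can be sketched as toggling floor sections against the player’s current floor; Ghost animates the transition, but the core selection logic might look like this hypothetical UE4 C++ sketch (the floor height and structure are our assumptions, not Sky Machine’s code):

```cpp
// Hypothetical sketch of Ghost-style floor revealing: every floor section above
// the player's current floor is hidden so the isometric camera stays clear.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"

static const float FloorHeight = 400.f; // illustrative floor height in cm

void UpdateVisibleFloors(const TArray<AActor*>& FloorSections, float PlayerZ)
{
    const int32 PlayerFloor = FMath::FloorToInt(PlayerZ / FloorHeight);

    for (AActor* Section : FloorSections)
    {
        if (Section == nullptr)
        {
            continue;
        }

        const int32 SectionFloor =
            FMath::FloorToInt(Section->GetActorLocation().Z / FloorHeight);

        // Hide everything above the player's floor; Ghost animates this
        // transition, but a visibility toggle shows the core idea.
        Section->SetActorHiddenInGame(SectionFloor > PlayerFloor);
    }
}
```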

From a graphical perspective, the team spent a lot of time working on lighting and on lending a sense of claustrophobia to interior locations such as buildings. For example, whenever a player enters a building, the entire outside world is blacked out, keeping a heavy focus on what is in the player’s immediate environment.

However, one of the stand-out features in Ghost is our arrow-crafting system. From water arrows that extinguish torches, to poison arrows, and even electric-powered ones, it’s a fairly robust system. Where the system really shines is in how you can combine these elements, essentially creating more complicated arrow types. For example: if you take the poison arrow and add smoke to the mix, you’ll get an area-of-effect version, engulfing the environment with poison smoke. This is one of many examples out of the 30+ different arrow combinations seen in Ghost. The arrow-crafting system, while used for offensive and defensive playstyles, will be an integral part of solving puzzles as well.
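
Combinable effects like these map naturally onto bit flags. The sketch below is a hypothetical illustration of the idea; Ghost’s actual crafting data may be structured quite differently:

```cpp
// Hypothetical sketch of combinable arrow effects as bit flags: poison + smoke
// yields an arrow carrying both, which gameplay code can read as an
// area-of-effect poison cloud.
#include "CoreMinimal.h"

enum class EArrowEffect : uint8
{
    None     = 0,
    Water    = 1 << 0, // extinguishes torches
    Poison   = 1 << 1,
    Smoke    = 1 << 2, // spreads the payload over an area
    Electric = 1 << 3,
};
ENUM_CLASS_FLAGS(EArrowEffect);

EArrowEffect CraftArrow(EArrowEffect A, EArrowEffect B)
{
    return A | B; // combining ingredients just unions their effects
}

bool IsAreaPoisonArrow(EArrowEffect Arrow)
{
    // Smoke turns the poison payload into an area-of-effect cloud.
    return EnumHasAllFlags(Arrow, EArrowEffect::Poison | EArrowEffect::Smoke);
}
```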

Despite being early in your development, how has Unreal Engine 4 helped you to create your environments of contrasting dark and light (which is very important in stealth games!)? Has there been a specific tool you’ve found especially helpful?

I’d have to say the lighting system has been one of the most useful systems found inside Unreal. The flexibility to tweak every scenario, with ease, from mood to directional lighting, has allowed Ghost to have that stylized look. There is something satisfying when placing assets and 3D objects inside the engine. Things just pop out. It makes you appreciate the cumulative efforts and constant refinement the engine has gone through over the years. I fundamentally believe Ghost would have been a much different looking game if we used an alternative engine.   

When it comes to specific tools, I would have to say that Unreal’s profiling tool assisted us in tweaking and optimizing performance. Instead of manually having to check each actor in a scene, the tool allowed us to locate what was causing any performance drops in any given scenario. 

You’ve still got a healthy amount of development in front of you for Ghost. Are there any other tools in Unreal Engine 4 you’re looking forward to using as you work toward release and how do you feel they’ll aid you in the game’s development?

We’re definitely looking to explore a bit more of Sequencer for creating great in-game cinematics, since Unreal provides one of the best cinematic tools on the market. Also, we want to make sure we get the best LOD practices from the tools provided, not only for polygon reduction but for instanced meshes, to give us that little extra performance juice.

Where can people go to find out more about Sky Machine Studios and Ghost?

You can find information on Ghost via our website and social pages across Facebook, Twitter and Instagram.

Dive Head First into the Wacky World of Iguanabee’s Headsnatchers

We’ve all seen them: those videos of crazy Japanese game shows making their players do the wackiest of tasks. Taking inspiration from shows such as Takeshi’s Castle, MXC, and AKBingo!, Chilean indie developer IguanaBee decided to make a video game based on the quirky Japanese variety show concept.

In Headsnatchers you certainly won’t find anything as crazy (or as gross!) as the AKBingo! ‘Blowing Cockroach’ game, but you’ll find there are 25 unique arenas that let you do everything from using your opponents’ heads as bowling balls to flushing their noggins down a toilet. It’s an absolute riot to play with your friends, vividly brought to life with Unreal Engine 4.

Headsnatchers was released into Early Access on July 24, and you can hit up Steam to take a look for yourself. In the meantime, we had a chance to interview Daniel Winkler, Co-founder of IguanaBee. The Headsnatchers Lead Programmer discusses everything from inspiration to his most effective and favorite tools within the Unreal Engine 4 suite.

IguanaBee is a small but talented indie studio based out of Chile. Tell us what brought the team together and what you hope to achieve as you develop your games.

We’re hungry to make unique games. That’s the formula that brought us together. In spite of the inherent difficulties of being an indie dev team coming from a Latin country like Chile, we have been working hard and have a huge passion to deliver amazing experiences to our players. We seek to push the limits of our talents and skills with every game we make.
 
Headsnatchers appears to be strongly inspired by Japanese game shows. What can you tell us about this inspiration? Were there any shows particularly inspiring to you?

In recent years we have been traveling to Japan and we love the country. That inspired us to mix Japanese culture into our games. Indeed, Japanese game and variety shows have been a source of great inspiration for Headsnatchers, especially in terms of the ridiculous tasks they need to perform.

Keeping with this Japanese game-show inspiration, how well do you feel this translated into video game form?

Our main goal was to create a game that would be fun for four players. Japanese game shows are the epitome of fun, and taking inspiration from them opened up a lot of possibilities to create all kinds of crazy situations.  
 
When creating so many varied arenas and games, which Unreal Engine 4 tool was the most useful to you?

Well, for the levels themselves, Sequencer was a great help, letting our animators produce interesting in-game intros in a comfortable way. Also, for creating the 100+ unique heads the way we desired (with physical animations), PhAT (the Physics Asset Tool) was an extreme help.

What was the creation process like when coming up with so many different games and arenas?

We follow Chef Gusteau’s (of Pixar’s Ratatouille!) philosophy that “anyone can cook.” We have brainstorming sessions where anyone can come up with their own ideas on how to make the game funnier. In those sessions, we receive suggestions for new levels, and then we work on shaping them into the form you end up playing. Even during development, if someone comes up with an interesting and fun idea on how to improve a level, we evaluate and potentially implement it.
 
With each game having its own set of rules and logic, did you have to start from scratch on each one or were you able to use some Unreal Engine 4 tricks?

Thanks to the Blueprints tool, we were able to reuse a lot of the actors and other classes we created. So, when making a new level, we always consider the already-created code and Blueprints, and reuse them in a smart way whenever possible.

Did you have a favorite tool from Unreal Engine 4? What was it and why?

The Animation Blueprint is a very complete tool that helped us focus on what is really important, while allowing us to improve the game by adding cool stuff using its capabilities. The Animation Blueprint is far better than the animation tools of other engines. The other very useful tool was Data Tables. Data Tables are a great way to maintain structures in an ordered way, while making it very easy to tweak values without needing to recompile.
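
For readers new to the feature: a Data Table row is just a struct derived from FTableRowBase, which is what makes values tweakable in the editor (or re-importable from CSV) without a recompile. A minimal sketch follows; the head-stats row and its fields are illustrative, not Headsnatchers’ actual data:

```cpp
// Minimal sketch of a UE4 Data Table row: designers edit rows of this struct
// in the editor without recompiling. The stats themselves are illustrative.
#include "Engine/DataTable.h"
#include "HeadStatsRow.generated.h"

USTRUCT(BlueprintType)
struct FHeadStatsRow : public FTableRowBase
{
    GENERATED_BODY()

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    float Mass = 1.f;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    float Bounciness = 0.5f;
};

// Looking a row up at runtime (HeadTable would be a UDataTable* asset reference).
FHeadStatsRow* FindHeadStats(UDataTable* HeadTable, FName HeadName)
{
    return HeadTable
        ? HeadTable->FindRow<FHeadStatsRow>(HeadName, TEXT("HeadStats lookup"))
        : nullptr;
}
```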
 
In the trailer, I noticed mention of winning prizes on the Headsnatchers Show! Is this a component of online play and what can you tell us about it?

The Headsnatchers Show is a local multiplayer game mode where you are part of a TV show with a host. There, the players compete to win a “car” or “what is inside the mystery box”. Of course, the mystery box allows you to unlock really fun in-game content.

Headsnatchers has released into Early Access, meaning that there’s more on the way before you hit that 1.0 mark. What else do you have in store for players who jump into the game?

We’re currently working on adding online-mode support to more and more levels, and improving the game based on feedback from the people playing it.

If you could offer any piece of advice to someone jumping into Unreal Engine 4 for the first time, what would it be?

Learn about the tools that Unreal Engine 4 provides. They are very complete and strong, so don’t try to reinvent the wheel!

Where can people go to stay on top of everything Headsnatchers and IguanaBee?

We can be found on Steam, Reddit, Twitter, Facebook and of course our official website, but the most direct communication can be through our Discord.

Holospark’s Earthfall Brings Innovation to the Co-op Shooter Genre

After being kicked into high gear by the release of Left 4 Dead and its rabid fan base, the four-player co-op shooter genre has seen little in the way of new games over the past few years. Bursting onto the scene in Early Access in April of 2017, Earthfall hopes to recreate and innovate on the magic Valve delivered way back in 2008.

Taking place in the not-too-distant future of 2031, players will be tasked with defending the lush environments of the Pacific Northwest against a violent alien invasion. Perhaps not as mindless as they seem at first, the alien invaders won’t go down too easily, but Earthfall will provide players with the firepower they need to mount their offense. Using the power of Unreal Engine 4’s development suite, Holospark has created enemies that smartly adapt to not only each player but the team dynamic as a whole, creating an experience like no other.

Now, coming out of Early Access and launching on PC, PlayStation 4 and Xbox One, Earthfall is poised to fill the void created by Left 4 Dead’s long absence from the gaming scene. We recently took some time to chat with Holospark CEO Russell Williams, as the growing developer filled us in on their thoughts about working with Unreal Engine 4 and protecting the human race from alien devastation.
 

Tell us a little bit about Holospark and how this highly experienced team came together?

Holospark is an independent video game developer in the Seattle area. We have two teams, one focusing on Earthfall, our four-player co-op shooter, and a smaller team working on VR projects.

Built from a core team of experienced developers who had previously worked together, we broke off looking to create something new and exciting on our own. After setting the studio up and working on some ideas, we decided we all loved co-op shooters and started building Earthfall in 2016.

Over the last two years, we have expanded to our current staff of 37. Many of these developers bring extensive backgrounds from working on dozens of projects, including multiple award-winning titles. Holospark also has a great relationship with some of the local schools in the area, allowing us to recruit an entirely new group of developers who are immensely talented and hungry to make an impact.

Aside from it being your own backyard, what is it about the Pacific Northwest that made it the ideal setting for your alien invasion story?

The Pacific Northwest is a gorgeous, moody environment perfect for spooky woods where aliens can come out at you from every turn. It feels both open and isolated at the same time, with small towns up in the Cascade Mountain Range that are perfect for desperate holdouts, alongside industrial mining operations and wood mills for varied locales.

For us, the Pacific Northwest is iconic and the visuals immediately root you in a distinct, recognizable, mysterious world.

Speaking of the Pacific Northwest, it is a very beautiful and lush environment, to say the least. Were there any specific tools in Unreal Engine 4 that really benefited the team in creating this stunning backdrop?

The landscape and foliage systems were both used extensively in the creation of our levels. The landscape system has many features that allowed us a lot of flexibility in the creation and editing of our terrain mesh. In some cases, we sculpted terrain by hand, while in other cases, we used a third party software to create height maps. In either case, they were easily modified with the sculpting tools provided if a revision was necessary. This flexibility also extended to the painting of materials on the landscape.  
 
The foliage system was another tool with immense flexibility. It allowed us to quickly place large amounts of foliage with ease but also provided functionality that allowed us to tweak individual foliage assets when needed. Again, as revisions were needed in the development process, the tool allowed us to replace assets that are used across a map with a few easy steps.
 
In addition to providing a great workflow, both of these systems provided us with many avenues for optimizing our performance. The landscape tools offer ways to adjust LODs for the entire landscape or portions of it. The foliage tool provides a variety of tools to aid in optimization, including distance culling by foliage type.
 
These systems were invaluable to our process, and definitely made our lives easier!

Inevitably, Earthfall is going to see some comparisons to the Left 4 Dead series. How did you use that inspiration and twist it around to make Earthfall truly unique?

First, we started this project because we were huge fans of Left 4 Dead, so we had a very strong vision in mind when we started designing the game. But when you go back and play Left 4 Dead, it’s missing 10 years of innovation in the shooter genre! So we were more guided by our memories of Left 4 Dead than the actual game. The result of that is something that is completely new and yet instantly recognizable.

Beyond the basic gameplay, we also changed the setting, moving from a present-day zombie outbreak to an alien invasion in 2031. We did this because we’re hoping to be evolving Earthfall into the future, and for that, we needed an enemy to evolve with it. While the aliens start as ravenous, mindless creatures, you’ll find out there’s more to them as you play, and we’re looking forward to telling that story as we introduce new aliens for you to fight, and new weapons to fight them with. 

Earthfall is a high-intensity action game, but if you keep your eyes open, you’ll see clues in the environments as to what’s coming in the game, and as you unlock lore items in the game, you’ll uncover the backstory of the world and the aliens.

Finally, there are lots of moments when the players are just trying to hold out and survive, and we wanted to give them some interesting tools to define and control the battlefield. Being set in the future, we have auto-turrets that can watch your six, mounted guns you can man to mow down the enemy, and deployable fences to barricade off areas and channel the aliens into kill zones. You can even upgrade the fences with propane tanks to make flaming death traps, or arc grenades to electrify them.

The alien enemies in the game are dynamically generated, but how much of their behavior dynamically adapts to players’ play patterns?

We have a number of ways that the AIs modify their behavior to encourage teamwork and keep players engaged. For example, some of the AIs will intentionally focus on a player who’s straying from the group, so if you’re a lone wolf, you’ll want to keep alert! Others will focus on players that haven’t had much action in a while. Some AIs will attack a target with singular determination while others can be drawn away by a teammate. The AIs will change their aggression depending on the overall group’s progress, so if the group is tearing through a level with guns blazing, they’ll quickly draw the attention of nearby enemies. On the other hand, if a group is moving very slowly, some of the AIs might be dispatched to hunt the group and prod them along. This all ties back into the AI Director, which is constantly striving to create a steady ebb and flow of intensity for the group. 
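
The “lone wolf” behavior described above boils down to scoring players by their distance from the group. Here is a hypothetical sketch of that selection logic, not Holospark’s AI Director code:

```cpp
// Hypothetical sketch of "punish the lone wolf" target selection: score each
// player by distance from the squad's centroid and chase the biggest straggler.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"

AActor* PickLoneWolfTarget(const TArray<AActor*>& Players)
{
    if (Players.Num() == 0)
    {
        return nullptr;
    }

    // Compute the squad centroid.
    FVector Centroid = FVector::ZeroVector;
    for (const AActor* P : Players)
    {
        Centroid += P->GetActorLocation();
    }
    Centroid /= Players.Num();

    // The player furthest from the group is the most tempting target.
    AActor* Target = nullptr;
    float BestDistance = 0.f;
    for (AActor* P : Players)
    {
        const float Distance = FVector::Dist(P->GetActorLocation(), Centroid);
        if (Distance > BestDistance)
        {
            BestDistance = Distance;
            Target = P;
        }
    }
    return Target;
}
```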

Continuing with the enemies, not only are they terrifying to look at, they come at you in absolutely insane numbers. How did Unreal Engine 4 help you bring these aliens to life exactly how you wanted them to be?

Unreal Engine 4 comes with a number of built-in systems that we were able to leverage to get things running at a high level very quickly. This allowed us to focus on the actual AI and gameplay very early on. Blueprints, in particular, were invaluable for prototyping. 

We make extensive use of the built-in navigation system. This includes dynamic navigation mesh modification, path-finding, support for multiple agents, path filtering, and even AI movement. For the actual AI logic, we make use of Unreal’s perception, behavior tree, and Environment Query systems. These systems tie into very powerful debugging tools such as the visual debugger and gameplay debugger. This was a huge help in refining AI behavior and identifying and fixing issues that arose. We were then able to build on these systems and tools to deliver game-specific functionality.  

For animation, we use a combination of animation Blueprints and montages. These tools help us bridge the gap between raw animation and the AI system to deliver a compelling performance.
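As a hedged example of the gameplay side of that bridge, the snippet below kicks off an attack montage from C++ so the animation Blueprint can blend it over the base locomotion. PlayAttack and the montage asset are illustrative names, not Earthfall’s.

```cpp
// Triggering a montage from gameplay code (illustrative example).
#include "GameFramework/Character.h"
#include "Animation/AnimMontage.h"

void PlayAttack(ACharacter* Alien, UAnimMontage* AttackMontage)
{
    if (Alien && AttackMontage)
    {
        // PlayAnimMontage returns the montage's play length, or 0.f on failure.
        const float Duration = Alien->PlayAnimMontage(AttackMontage, /*InPlayRate=*/1.0f);
        if (Duration <= 0.f)
        {
            UE_LOG(LogTemp, Warning, TEXT("Attack montage failed to play"));
        }
    }
}
```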

Earthfall has a diverse cast of characters. What brought these four together and how important was it to Holospark to bring that diversity to the table?

When we started thinking about our characters, we began with Seattle archetypes and built our characters from there. We didn’t set out to make a statement so much as we were focused on telling a great story with memorable characters. It’ll be nice when the day comes that having different races and ethnicities isn’t exceptional and people view the story on its own merits.

3D printed weapons! While that alone is a pretty fun mechanic, how do you develop that in-game and are there any surprises awaiting players as they advance through the campaign?

From the beginning, we knew that we wanted to set the game in the future so we could give new capabilities to the players, and the 3D printer was a natural fit. It gives us some natural objectives in the game world (“get the power back on to get the printer working to print those sweet weapons!”) and good checkpoints to resupply your weapons.

In the game, you’re exposed to the printers as just an expected part of the world, but you’ll find some insight into why they work the way they do. We’re looking forward to expanding their operation in the future!

What advice would you give, as experienced developers, to someone who is in the beginning stages of learning Unreal Engine?

Unreal is amazingly accessible. First of all, it’s free, so there’s no cost barrier to jumping in and getting started! Second, it comes with great tutorials that take you through the basics, and sample games that really show you how everything works in a functional and practical manner. Beyond what comes with the engine, there’s a staggering amount of information on the web to help you learn Unreal Engine. With so many developers using the engine, there are countless how-to videos on YouTube covering almost every aspect of the engine, and tons of in-depth articles to read on sites like 80.lv. It has never been easier to jump into game development.

Where are all the places people can go to learn more about Earthfall?

You can go to www.earthfall.com, or follow us on Twitter and Facebook to stay up to date!

EDITOR’S NOTE: Unreal Engine caught up with Holospark during E3 2018 to learn more about Earthfall. You can watch the video interview below.

 

Unreal Engine Drives Monster Puppet for The Mill and Monster.com

When employment website Monster.com needed a new commercial spot, they hired The Mill, an international VFX and creative house with countless commercials to their credit. The spot features a giant, purple, hairy monster who rescues an unhappy employee and carries her, King Kong-style, to a new employment situation. The 1:30 spot, called Opportunity Roars, garnered numerous accolades for The Mill, including a Cannes Lion Award for Visual Effects.
 

Award-winning commercial for Monster.com, “Opportunity Roars”

 
But there was little time for The Mill to rest on their laurels. After the success of that spot, Monster.com came back to the production house with a request for more than two dozen 15-second animated spots featuring the purple monster. 

The only problem was the turnaround time—a mere three weeks.

While The Mill used traditional techniques to produce the original commercial, this workflow wasn’t an option for such a short time frame. “What do you do with that?” says Boo Wong, Group Director of Emerging Technology at The Mill. “We realized there was no way to go down the traditional route.”
 

A faster way to animate: real-time motion capture and rendering 

That’s when The Mill came up with a clever solution: they would use a Leap Motion system to control the monster with finger motions, a kind of virtual puppeteering. The motions would drive the rig and generate real-time finished output of the monster, purple fur and all, with Unreal Engine.

Using the Leap Motion system to drive the rig  
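The Mill has not published this code, but the core of the virtual-puppeteering idea can be sketched in a few lines: each frame, read the tracked finger transforms and push them straight onto the rig’s bones. Everything below (the FTrackedFinger struct and the bone mapping) is a hypothetical stand-in for the actual Leap Motion integration.

```cpp
// Hypothetical sketch: drive a rig's bones directly from tracked fingers.
#include "Components/PoseableMeshComponent.h"

struct FTrackedFinger
{
    FName TargetBone;   // hypothetical rig bone, e.g. "jaw" or "brow_L"
    FRotator Rotation;  // rotation read from the tracked finger this frame
};

void ApplyPuppeteering(UPoseableMeshComponent* MonsterMesh,
                       const TArray<FTrackedFinger>& Fingers)
{
    for (const FTrackedFinger& Finger : Fingers)
    {
        // Map each performer finger onto its assigned bone in component space.
        MonsterMesh->SetBoneRotationByName(Finger.TargetBone,
                                           Finger.Rotation,
                                           EBoneSpaces::ComponentSpace);
    }
}
```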
 

“We pitched it to Monster.com, and they were blown away by the capabilities,” says Jeffrey Dates, Creative Director at The Mill. 

The team quickly put together a system and brought in the agency and clients for a live puppeteering and recording session. “They would give us notes on animation in real time,” says Dates. “As fast as they could say it, we then would make these adjustments and re-perform the next take.”

Performing live takes with finger motions and Unreal Engine
 
For the output, The Mill set up real-time post effects in Unreal Engine that encompassed everything they would typically do to finish a shot. As a result, the animation recorded in real time was ready for use without further processing. “The entire animation pipeline was happening in that span of just a few minutes,” says Dates.
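The article doesn’t detail The Mill’s exact finishing stack, but the general mechanism in Unreal is straightforward: enable the override flags on a camera’s FPostProcessSettings so the polish is rendered as part of every live frame. Here’s a small sketch with placeholder values, not The Mill’s settings.

```cpp
// Baking "finishing" into the real-time image via post-process overrides.
#include "Camera/CameraComponent.h"

void SetUpFinishingLook(UCameraComponent* Camera)
{
    FPostProcessSettings& PP = Camera->PostProcessSettings;

    // Each override flag must be set for its value to take effect.
    PP.bOverride_BloomIntensity = true;
    PP.BloomIntensity = 0.6f;

    PP.bOverride_VignetteIntensity = true;
    PP.VignetteIntensity = 0.35f;

    PP.bOverride_DepthOfFieldFocalDistance = true;
    PP.DepthOfFieldFocalDistance = 450.f; // centimeters
}
```

Because these effects run in the same frame as the performance capture, every recorded take already includes them, which is what made the output usable as final pixels.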

“The client walked away with hours of finished quality work, basically final pixels,” says Joji Tsuruga, Real-time Supervisor at The Mill. The monster animation was used in videos for social media such as Touchdown Dance and Meet Your Purple Fuzzy Career Coach on Monster.com’s Facebook page.

The process set a new bar for character animation output. “It’s really unheard of in animation to get multiple takes of a performance,” says Wong. “For an editor to basically walk away with selects was really groundbreaking.”

Exploring the future with real-time rendering

Inspired by the success of the Monster.com project, The Mill sees real-time animation as an important new paradigm. “Integrating game engines into your production workflow is critical,” says Wong. “It’s essential in storytelling today.”

A few of the many monster motions generated in real time with Unreal Engine
 
They also recognize the practical aspect of real-time rendering with Unreal Engine for short turnaround times. “This project answers the question: How can we generate a lot of animation cost-effectively for social media?” says Dates. “I’m not rendering, I’m not watching it render.

“Now, it’s more like experimenting. I want to do it and do it fast and have fun doing it.”

Making your own animation magic

Want to try out real-time rendering for your own projects? Join the Unreal Studio beta today and start creating!
 

Drive Studio Uses Unreal to Score Big for FOX Sports’ 2018 FIFA World Cup Broadcast

If you’ve been tuning in to the 2018 FIFA World Cup on FOX Sports, you’ve undoubtedly noticed the network’s stunning on-air graphics package featuring Moscow’s iconic Red Square and St. Basil’s Cathedral. Southern California-based Drive Studio leverage…