Ncam helps deliver seamless integration of live action and real-time 3D graphics in UE4

Although combining live action with real-time game engine elements in Unreal Engine can be easy, complex virtual production often demands more data. For high-end projects, it is important not only to layer the live action with keying, but also to have access to detailed information about the exact camera position, the lens, and a depth map of the scene. 
Ncam offers a complete and customizable platform that enables virtual sets and graphics in real time. At its core is the company’s unique camera tracking solution, which delivers virtual and augmented graphics technology using a special camera hardware add-on and sophisticated software. The system uses a lightweight sensor bar attached to a camera to track natural features in the environment, allowing the camera to move freely in any location while generating a continuous stream of extremely precise positional, rotational, and lens information. This feeds into a UE4 plugin via Ncam’s powerful SDK.
The system can be used on any type of production, indoors or outdoors, on mounted wire rigs or handheld camera configurations. Ncam’s products are used worldwide, with credits including Aquaman (Warner Bros.), Solo: A Star Wars Story (Walt Disney Studios), Deadpool 2 (Marvel), Game of Thrones Season 8 (HBO), Super Bowl LIII (CBS), UEFA Champions League (BT Sport), the NFC Championship Game (Fox Sports), and Monday Night Football (ESPN). 

Specialized hardware for accurate tracking

At its core, Ncam relies on a specialized piece of camera-mounted hardware. This small, lightweight sensor bar combines a suite of sensors. Most visible are the two stereo computer vision cameras; less obvious are the 12 additional sensors inside the Ncam hardware bar. These include accelerometers and gyroscopes, which together with the stereo camera pair enable Ncam to see the set in full spatial depth, generating a real-time 3D point cloud. 
The same hardware unit also interfaces with the various controls on the lens, such as a Preston follow-focus control, which means Ncam knows where the lens is, what it is looking at, what the focus and field of view are, and, importantly, where everything in front of the lens is located. The props, set, actors, camera, and lensing are all mapped and understood in real time. It is an extraordinary way to make the UE4 engine aware of the real world and integrate live graphical elements, characters, and sets into one seamless production, while you watch.

Gathering data for predictive movement
    
From the outset, Ncam has relied on a fusion of techniques, including visual tracking, odometry, and inertial navigation, to solve the problem of camera tracking. But beyond gathering data, Ncam also provides insightful information: the software uses this data for predictive movement and robust redundancy. It knows where the camera was and where it thinks it is going, and it handles any loss of useful signal from the cameras. If the actor blocks one of the stereo lenses, or even both, the system will continue uninterrupted based on the remaining aggregate sensor data. 
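Ncam has not published its fusion algorithms, but the redundancy idea can be pictured as a simple predict-and-correct loop: dead-reckon from inertial data every frame, and blend back toward the optical solve whenever one is available. The sketch below is a toy illustration under that assumption; the class, the fixed blend gain, and all names are invented for illustration, and a production system would use something like a Kalman filter.

```python
import numpy as np

class PoseTracker:
    """Toy predict-and-correct tracker: dead-reckon from inertial data,
    then blend back toward the optical solve whenever one arrives."""

    def __init__(self, position, velocity):
        self.position = np.asarray(position, dtype=float)
        self.velocity = np.asarray(velocity, dtype=float)

    def update(self, dt, optical_position=None, imu_acceleration=None):
        # Predict: constant-velocity step, refined by accelerometer data.
        if imu_acceleration is not None:
            self.velocity += np.asarray(imu_acceleration, dtype=float) * dt
        self.position += self.velocity * dt

        # Correct: if the vision cameras are blocked, optical_position is
        # None and tracking continues on inertial prediction alone.
        if optical_position is not None:
            error = np.asarray(optical_position, dtype=float) - self.position
            self.position += 0.5 * error          # crude fixed gain; a real
            self.velocity += (0.5 * error) / dt   # system would use a Kalman filter
        return self.position
```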

The software integrates all this data into one useful input to UE4. For example, while the computer vision cameras can run at up to 120 fps, the other sensors run at 250 fps, so the various data streams are retimed and normalized into one coherent, stable output that is clocked to the timecode of the primary production camera. 
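As a rough illustration of that retiming step, high-rate sensor channels can be resampled onto the camera's frame times by interpolation. This sketch assumes simple linear interpolation of one scalar channel; Ncam's actual pipeline is certainly more sophisticated.

```python
import numpy as np

def retime_to_camera(sensor_times, sensor_values, camera_frame_times):
    """Resample a high-rate sensor stream (e.g. 250 Hz IMU data) onto the
    production camera's frame times (e.g. 24 fps) by linear interpolation,
    so every output sample is aligned to a camera frame."""
    return np.interp(camera_frame_times, sensor_times, sensor_values)

# Example: a 250 Hz channel resampled to 24 fps camera timecode.
t_sensor = np.arange(0.0, 1.0, 1.0 / 250.0)
pan_angle = np.sin(2 * np.pi * t_sensor)          # stand-in sensor channel
t_camera = np.arange(0.0, 1.0, 1.0 / 24.0)
pan_per_frame = retime_to_camera(t_sensor, pan_angle, t_camera)
```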

Some sets have very challenging lighting and Ncam has an option to run the cameras in infrared mode, for strobing or flashing-light scenes. The system is also designed to have low latency, so a camera operator can watch the composited output of the live action and the UE4 graphics as a combined shot, for much more accurate framing and blocking. It is much easier to line up the shot of the knight and the dragon, if you can see the whole scene and not just a guy in armor alone on a green soundstage.

Precise lens calibration and matching

The camera tracking resolves to six degrees of freedom: XYZ position plus three degrees of rotation. Added to this is the production camera’s lens data. In addition to focus, iris, and zoom, Ncam has to know the correct lens curvature, or distortion, across all possible zoom, focus, and iris adjustments for the UE4 graphics to match the live action perfectly. Any wide lens clearly bends the image, producing curved lines that would otherwise be straight in the real world. All the real-time graphics have to match this frame by frame, so the lens properties are mapped on a lens serial number basis. Every lens is different, so while a production may start with a template of, say, a Cooke 32mm S4/i lens, Ncam provides a system for lens calibration that compensates for individual variations. 
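Ncam's own lens model isn't public, but radial distortion of this kind is commonly described with a Brown-Conrady-style polynomial. In the sketch below, the coefficients k1 and k2 stand in for per-serial-number calibration data that would be looked up for the lens's current zoom, focus, and iris settings; the function and values are illustrative only.

```python
def distort(x, y, k1, k2):
    """Apply simple Brown-Conrady radial distortion to a normalized image
    point (x, y); k1 and k2 come from the per-lens calibration."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A wide lens with barrel distortion (negative k1) pulls points toward the
# center, which is why straight lines near the frame edge render as curves.
print(distort(0.8, 0.0, k1=-0.2, k2=0.02))   # -> (~0.70, 0.0)
```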

Ncam is compatible with systems such as Arri’s Lens Data System (LDS), but those systems typically don’t give image distortion over the entire optical range of the lens. At the start of any project, productions can calibrate their own lenses with Ncam’s proprietary system of charts and tools to map the distortion and pincushioning of their lens, and then just reference them by serial number.  

In the end, the system produces stable, smooth, accurate information that can perfectly align real-time graphics with live-action material. Ncam founder Nic Hatch explains, “We spent a lot of time working to fuse the various technologies of all those different sensors, I guess that’s sort of our secret sauce and why it works so well.” 

Integrating CG elements with Real Depth

The other huge benefit of Ncam is depth understanding. When elements are combined in UE4, the engine knows where the live action is relative to the UE4 camera, thanks to Ncam’s “Real Depth”. This allows someone to be filmed walking in front of or behind UE4 graphical elements or virtual sets. Without the depth information, any video can only sit like a flat card in UE4. With Ncam, as the actor walks forward on set, they walk forward in UE4, passing objects at the correct distance. This adds enormous production value and integrates the live action and real-time graphics in a dramatically more believable way. This one feature completely changes Ncam’s usefulness in motion graphics, explanatory news sequences, and narrative sequences.
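Conceptually, this kind of depth-aware integration comes down to a per-pixel depth test between the live plate and the CG render. Here is a minimal sketch, assuming both depth maps are expressed in the same camera space (the function and array names are illustrative, not Ncam's API):

```python
import numpy as np

def composite(cg_rgb, cg_depth, live_rgb, live_depth):
    """Per-pixel depth compositing: wherever the live-action depth is closer
    to the camera than the CG depth, show the live plate; otherwise the CG
    element occludes it. Depths are distances from the tracked camera."""
    live_in_front = (live_depth < cg_depth)[..., np.newaxis]   # (H, W, 1) mask
    return np.where(live_in_front, live_rgb, cg_rgb)

# A real pipeline would also key the green screen (alpha) before this test;
# without a depth map at all, the whole plate sits at one fixed distance,
# which is the "flat card" limitation described above.
```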
“Game engine integration has always been very important to us,” says Hatch. “At a tradeshow in 2016 we showed, I think, the first prototype of this kind of live action integrated with the Unreal Engine, so we have a pretty close relationship.” The company has doubled its staff in the last year, and the biggest proportion of Ncam’s staff is involved in R&D. A key part of their development effort is building APIs and links into software such as UE4 for the most efficient virtual production pipeline. “The complexity of what we are doing requires a large amount of R&D,” he adds.

Advanced real-world lighting matching with Real Light

While the primary focus has been on Ncam understanding the space in front of the camera and what the camera is doing, the company also has an advanced tool to understand the lighting of the scene. Their “Real Light” project allows for a live light probe to be in the scene and inform the UE4 engine of the changing light levels and directions. 

Real Light is designed to solve the challenge of making virtual production assets look like they are part of the real-world scene. It captures real-world lighting in terms of direction, color, intensity, and HDR maps, allowing Unreal Engine to adapt to each and every lighting change. Importantly, it also understands the depth and position of the light sources in the scene, so the two worlds interact correctly. This means that the digital assets can fit technically and look correctly lit, which is a major advance in live-action game asset integration. 
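Ncam hasn't detailed Real Light's internals, but one standard way to turn a captured HDR probe into engine-ready lighting is to estimate a dominant light direction and color from the environment map. A sketch under that assumption, for a latitude-longitude map in linear RGB (everything here is illustrative, not Ncam's method):

```python
import numpy as np

def dominant_light(hdr_map):
    """Estimate a dominant light direction and color from a lat-long HDR
    environment map (H x W x 3) by luminance-weighting the direction
    vector of every texel (Y-up convention)."""
    h, w, _ = hdr_map.shape
    theta = (np.arange(h) + 0.5) / h * np.pi          # polar angle per row
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi      # azimuth per column
    sin_t = np.sin(theta)[:, None]
    dirs = np.stack([sin_t * np.cos(phi)[None, :],
                     np.cos(theta)[:, None].repeat(w, axis=1),
                     sin_t * np.sin(phi)[None, :]], axis=-1)
    lum = hdr_map @ np.array([0.2126, 0.7152, 0.0722])  # texel luminance
    weight = (lum * sin_t)[..., None]                   # solid-angle weighted
    direction = (dirs * weight).sum(axis=(0, 1))
    direction /= np.linalg.norm(direction) + 1e-9
    color = (hdr_map * weight).sum(axis=(0, 1)) / (weight.sum() + 1e-9)
    return direction, color
```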

Interested in finding out about more new technology, techniques, and best practices that are changing the game for on-set production? Head on over to our Virtual Production hub, or check out our other posts relating to broadcast.
 

Spellbreak is a unique battle royale game that combines magic, roguelike, and RPG elements

It’s impossible to deny the explosive growth of battle royale games over the past few years. Capturing the hearts of gamers worldwide, the frantic action of last-man-standing mayhem has dominated the landscape for several months. While we’ve seen many games join the excitement over the past year or so, not many have sought to truly break the mold, but Spellbreak is aiming to do just that… with magic.

Currently in development by an experienced team at Proletariat Inc., Spellbreak is a visually stunning escape into a world filled with battlemages, spells, and RPG-style gameplay. Brought to life with Unreal Engine 4, its cel-shaded aesthetic is already gorgeous even in its early pre-alpha state. True to the studio’s mission to put “Players First,” Proletariat shakes things up by flying in the face of the usual battle royale conventions and adding a twist that no other battle royale game has attempted. Blending RPG and roguelike mechanics, Proletariat is working toward an experience that balances skill, strategy, and depth while staying fun for all comers.

We took the time to chat with three of the fine folks at this motivated studio as we discussed their desire to do something different, their mission to put players at the forefront of everything they do, and how they’ve leveraged Unreal Engine 4 to create a unique battle royale experience they can be truly proud of.
 

Proletariat started with just five people and has grown into a team of 30. Tell us a little bit about the studio’s history and its motto to always put “Players First.”

Design Director/Co-founder Jesse Kurlancheek: We started Proletariat a bit more than six years ago after our previous company unceremoniously shut down their Boston office. The founders had often talked about starting a company over the 15 years that we’ve known each other and worked together, and there wasn’t going to be a better time! 

CTO/Co-founder Dan Ogles: We’d all previously worked on games that were meant to be played with your friends as social experiences (including MMOs, Rock Band/Guitar Hero, and local co-op games) and we saw how much value there was in building communities around the game, not just for the players, but for the game and the company as a whole, and we wanted to create around that core idea.

Art Director/Co-founder Damon Iannuzzelli: Putting players first isn’t just a motto, either; we put it into practice daily in how we interact with them. We’re often just hanging out in Discord, answering questions on Twitch streams, or posting (very rough) works in progress. We’re trying to be as transparent with our community as possible in ways that previous companies would never allow. I think it fosters a sense of trust from the players and at the same time keeps us accountable to them.

 
Spellbreak isn’t the team’s first venture into the PC gaming sphere, but it is undoubtedly its most ambitious! Tell us what Spellbreak is all about!

Kurlancheek: It definitely is! Spellbreak is a fast-paced game where players assume the role of battlemages who wield a wide variety of spells, sorceries, and magic items. They face off against other players in a ruined fantasy landscape, level up, learn new skills, and gather artifacts to become even more powerful over the course of a match. Players can use runes to fly, blink, become invisible, and more to travel quickly and fluidly to gain the upper hand in combat. Spells and sorceries can combine with one another in both helpful and harmful ways, such as setting a poison cloud alight with a fireball or shattering an incoming fireball with a boulder of your own. 

Ogles: We’ve been hard at work nailing the visceral feel of magic combat while keeping the skill cap high yet staying accessible to all sorts of gamers, all while leaving players with plenty of meaty decisions to chew on both in and out of the game. 

Considering the history of the team, there must be a lot of people with Unreal Engine 4 experience. How did having that background help with the development of Spellbreak? 

Ogles: For most of the team, our first experience with Unreal Engine 4 was with our prior game, Streamline. When starting Spellbreak, we already had in-depth experience with Unreal Engine’s multiplayer systems, art pipeline, and rendering. This allowed us to get the game to a playable state very, very quickly and iterate on the design and mechanics. With most of the initial team having prior Unreal experience, combined with the great documentation and training programs out there for Unreal, we have been able to ramp up new artists and engineers very quickly.

The battle royale genre feels like it’s gotten very crowded in a relatively short amount of time. How do you think Spellbreak stands out from the crowd? Can you tell us more about how it crosses into other genres like RPGs and roguelikes?

Kurlancheek: Before I get to the bulk of your question, I think it’s worth mentioning that we don’t really think of battle royale as a “genre” per se, but rather as a game mode. Even within the confines of that mode, game developers are really only starting to scratch the surface, and it’s very exciting to see what’s being done right now.

That said, I think the biggest way Spellbreak differentiates itself from other games is the theme and moment-to-moment gameplay. Being a battlemage who can quickly and effortlessly traverse the large landscape and ruined castles while casting powerful spells (that aren’t just “magic guns”) through a dynamic and fast-paced combat system is a completely different experience than what the market has available right now. When you combine this with our striking art style, Spellbreak makes a great initial impression to set itself apart. 

On the RPG side, the team here has always loved RPGs and MOBAs and how they let you take a character from a weakling to a powerhouse over the course of the game. We wanted to see what happened if we condensed that (often long) experience down into a 15-minute bite-sized match. By leaning into the classic RPG idea of classes, players can come in, read a one-line description or look at a set of class skills and immediately have an idea of how that class might play. On top of that, we allow players to choose two classes to combine both of their skills and spells for crazy combinations and emergent gameplay unlike most RPGs.

As for the roguelike elements, we decided early on that it was more fun to be dealt a hand and have to adapt to the changing situations in each match. While you might enjoy playing a Pyromancer, if you find a Legendary Toxic Gauntlet at the start of a match, maybe it’s worth mixing up your plans this time. By forcing players to explore the variety of skills, they often find things that they wouldn’t have otherwise, and this leads to some great in-game and post-game discussion about the highs and lows. In some games you end up with the perfect build and equipment and become a god, and it just feels so good when everything falls into place like that. That said, we’re definitely looking at some limited-time and competitive modes, which tamp down the random element and let players be in more (or complete) control of how their characters develop. 

Have there been any specific instances where Unreal Engine has made a traditionally difficult process a little bit easier to handle? 

Ogles: Unreal Engine’s multiplayer and replication technology is top-notch. It is more robust and easier to use than any other engine we’ve worked with in the past. Newer features like the Replication Graph make it even easier to support large worlds with many players on a single server while maintaining high server framerates and quick response times. Epic’s long history of running popular multiplayer games is clear, and it’s great to be able to leverage that.
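The Replication Graph is a real UE4 feature, though its API lives in C++ and is beyond the scope of this piece. The core idea of spatial relevancy can still be sketched abstractly: rather than testing every actor against every connection each frame, the server buckets actors into cells and replicates only nearby buckets. The toy below illustrates that pattern only; it is not Epic's implementation, and the cell size is an invented parameter.

```python
from collections import defaultdict

class SpatialGrid:
    """Toy spatial-relevancy grid in the spirit of UE4's Replication Graph."""

    def __init__(self, cell_size=5000.0):
        self.cell_size = cell_size
        self.cells = defaultdict(list)

    def add(self, actor, x, y):
        # Bucket each actor by the grid cell containing its position.
        key = (int(x // self.cell_size), int(y // self.cell_size))
        self.cells[key].append(actor)

    def relevant_to(self, x, y, radius_cells=1):
        # Yield only actors in cells near this connection's view position.
        cx, cy = int(x // self.cell_size), int(y // self.cell_size)
        for dx in range(-radius_cells, radius_cells + 1):
            for dy in range(-radius_cells, radius_cells + 1):
                yield from self.cells.get((cx + dx, cy + dy), [])
```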

Another engine feature we rely on heavily is the Hierarchical Level of Detail (HLOD) system. Being able to group objects into single meshes, and automatically generate progressively lower-detail meshes for far view distances, is essential to making a large open-world game run at high framerates. Without this system, the large and intricate architecture in Spellbreak wouldn’t be possible.

With a range of tools in its suite, Unreal Engine 4 has a lot to offer developers. If you had to pick your single favorite tool, what would it be?

Iannuzzelli: Hard to pick just one tool, since Unreal Engine is a giant package of integrated tools that are all connected to one another. Can I pick two? The Material Editor is so powerful and flexible for creating materials as simple and as complex as an artist can imagine. The node-based approach is intuitive for visual thinkers and gives artists very accurate “what-you-see-is-what-you-get” results. Spellbreak has a ton of very custom materials for the visual style, and I’m always impressed at how easy it is to keep pushing them further with a little trial/error in the Material Editor. 

I’d also like to mention the LOD tools. It seems simple, but I can recall creating lots of manual mesh LODs in the past and having a simple interface for reduction targets and screen size LOD switching makes me very happy.
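For readers curious what "screen size" means in those LOD settings, it is essentially the projected size of a mesh's bounds on screen. The sketch below shows the general idea; the formula and thresholds are illustrative, not the engine's exact computation or default values.

```python
import math

def screen_size(radius, distance, fov_degrees):
    """Approximate fraction of the screen a bounding sphere covers;
    this is the quantity LOD thresholds are compared against."""
    half_fov = math.radians(fov_degrees) / 2.0
    return radius / (distance * math.tan(half_fov))

def pick_lod(radius, distance, fov_degrees, thresholds=(0.5, 0.25, 0.1)):
    """Return the first LOD whose screen-size threshold is still met."""
    size = screen_size(radius, distance, fov_degrees)
    for lod, threshold in enumerate(thresholds):
        if size >= threshold:
            return lod
    return len(thresholds)  # fall through to the coarsest LOD

print(pick_lod(radius=200.0, distance=5000.0, fov_degrees=90.0))  # -> 2
```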

Spellbreak’s animation and art aesthetic is absolutely stunning! Did the team know the look they were going for right from the start? How, if at all, has it changed over the course of development?

Iannuzzelli: We did not start this way! Early in development, the game was actually themed differently and we intended to create something more realistic looking. About eight weeks prior to GDC 2018, we completely rebooted the gameplay and decided that the supporting aesthetic needed to be much more unique. I’ve always wanted to make a game that felt a little bit like Japanese animation from the 80s/90s, especially as it pertains to the punchiness of the hand-drawn visual FX. The art team built a visual target demo that really sold the look of an “anime-ish” (or, “faux-nime?”) adventure set in an ominous world. We were very passionate about the direction and felt that it would also resonate strongly with a cross-section of game/animation fans.

Drawing from the team’s experience, what advice would you give someone looking to jump into indie development in 2019? What advice would you give to someone learning Unreal Engine 4 for the first time?

Iannuzzelli: Creatively, the biggest challenge is to craft something unique and differentiated while finding an audience. Communicate with your audience early and often. Abide by your convictions, but also humbly listen to your players. They have valuable feedback and insight that will help you make your game better.

Ogles: Building your game has never been easier! The engine/tools are free, the prototyping content is cheap or free, and there are so many people using Unreal Engine now that a quick Google search will answer most questions and roadblocks. It really is amazing how democratized game development has become. 

Once you achieve 1.0 and full release, what plans do you have to support Spellbreak post-release? 

Kurlancheek: A large portion of the studio’s DNA comes from running live MMOs, so a 1.0/full release is only the beginning for Spellbreak! Speaking for myself, the most exciting time working on a game is after it’s live and we get to see what players love and what they’re less fond of, and to work with them to improve the game and unveil new challenges for them. 

Ogles: More concretely, we’ve set up a very extensible system so adding content like new spells and sorceries, more classes, crazy items, special events, and so on is very easy, and we’re hoping to hit the ground running. As for larger features, we’re big fans of social systems like guilds to really let friends play together in a way that they currently can only do in MMOs. Furthermore, we’ve talked about and prototyped a variety of different game modes beyond the current battle royale. Spellbreak’s theme and gameplay naturally lend themselves to all sorts of expansions on gameplay like different types of PvP conflicts or more co-operative modes and it’d be a shame not to explore things like that.

Development on Spellbreak is still in its early phases with several playtests coming up. Tell us how players can get on board.

Kurlancheek: We’ve got a long way to go before the game’s completely ready for release and to get there, we need help from a lot of excited players testing out the game, giving us feedback, and helping shape it into the best it can be! We’re always looking for more testers to join our (pre-)alpha! If someone’s interested, they can apply here: https://playspellbreak.com/ or if they want to hop in right away, they can also become a Founder and start playing now by purchasing a Founder’s Pack: https://www.epicgames.com/store/product/spellbreak

Also, for any content creators out there, they can sign up here: https://playspellbreak.com/creators to be kept up to date with our plans for partner programs.

Where are all the places people can go to keep up with Proletariat and Spellbreak?

Chaos Group unveils V-Ray for Unreal

As one of the world’s most popular physically-based renderers, V-Ray is used daily by top design studios, architectural firms, advertising agencies, and visual effects companies around the globe. Chaos Group also noticed the rapid adoption of Unreal Engine by many of these firms to create interactive or immersive experiences, so producing V-Ray for Unreal was a logical step. After a beta period that began in March, the full product launched one week before Autodesk University 2018. 

“We’re always listening to our customers, and we’re fortunate in that we get to work with creatives in multiple industries,” says Chaos Group’s Communications Director David Tracy, who unveiled the software publicly at the show. “Whether it’s architectural visualization, automotive, or visual effects, every industry has its own challenges. The one common need for any artist or designer, though, is a smooth workflow and reliable results across their entire toolset. That, and excellent results.”

In this case, that smooth workflow means being able to easily repurpose V-Ray scenes created in 3ds Max, Maya, Rhino, or SketchUp in Unreal Engine, without needing to learn new rendering paradigms. With V-Ray for Unreal, lights and materials are automatically converted into their real-time equivalents for UE workflows, but they maintain a smart connection to the originals—so you can continue to create full-quality ray-traced renders directly from the Unreal Editor with the same content. “With V-Ray for Unreal we wanted to create the fastest, simplest way to bring V-Ray scenes into a real-time environment, and give artists the ability to render V-Ray ray-traced images directly from Unreal,” says Tracy. “Now, artists can achieve great-looking real time and great-looking physically-based renders with a workflow that they already know.” 

Importantly, the software also introduces V-Ray Light Baking, enabling artists and designers to bake V-Ray lights (including IES) directly into Unreal with full GPU acceleration, for the highest-quality real-time illumination. This ensures that the lighting in the V-Ray rendering is well matched to the real-time experience in Unreal Engine.

“We’re excited to bring our Academy Award-winning ray-tracing technology to Unreal Engine and see what amazing content artists come up with, and make their lives a little easier in the process,” says Tracy. “Working with Epic has been great, and from a development standpoint, it helps that UE4 is an open platform. I think the combination of V-Ray and Unreal Engine is a natural fit for any studio that has V-Ray in their pipeline and is interested in using Unreal.” V-Ray for Unreal is available now. For pricing and availability, or to download a trial version, visit the Chaos Group website.

“If fidelity to V-Ray rendering and the V-Ray workflow is most important to customers, then this is a great solution for our joint customers. No one is going to match a V-Ray scene to Unreal Engine better than the creators of V-Ray,” says Pierre-Felix Breton, Technical Product Manager for Epic Games. “It’s also the only solution if you want to bring V-Ray scenes from Maya, SketchUp, and Rhino into Unreal Engine, since Unreal Studio doesn’t support reading V-Ray scene data from those tools. Unreal Engine is the only real-time engine Chaos Group supports, so this is a great endorsement.”

Now, customers can retain their investment in V-Ray knowledge as they transition to real time, while they explore what’s possible with Unreal Studio. Where V-Ray for Unreal is all about fidelity to V-Ray and V-Ray rendering, Unreal Studio is more focused on scene structure, metadata, and the ability to optimize assets for interactive experiences. 

With support for 3ds Max, Revit, and SketchUp Pro (not to mention a wide range of CAD formats), Unreal Studio is an ideal partner to V-Ray for Unreal. Its Datasmith feature set not only provides import capabilities but also data optimization tools, which can be used in parallel with V-Ray for Unreal. Along with Datasmith, it offers one-to-one ticketed support, targeted learning, and industry-relevant templates and assets. Why not download the free beta today?


Music Meets Mayhem in the Rhythm Action of Soundfall

When you’re securely employed by one of the most established companies in gaming, you might raise some eyebrows by suggesting you jump out of that safety net and into indie-development freefall. That was the first question I put to Julian Trutmann and Nick Cooper, who left their positions at Epic Games to develop Soundfall as Drastic Games.
 
Soundfall is a vibrant and stunning game built on the backbone of Unreal Engine 4. Leaning on their experience with the engine, Drastic is creating a fast-moving action game that takes the player’s own music and sets it as the soundtrack and tempo to their adventure. Syncing bass beats with gratifying gunplay isn’t a feat easily achieved, however, so Drastic had to go to considerable lengths to make it all work.
 
When the game debuted on August 7, 2018, the reaction was swift and supportive. Soundfall had people intrigued, and even Drastic themselves were not prepared for how well the game would be received. Now, with a number of appearances under their belts (PAX West, EGLX, and more), they’ve launched the game into crowdfunding on Fig and successfully smashed their goal within 24 hours. With plenty more time to go, the Soundfall team has set its sights on its many stretch goals.
 
We took a moment to chat with one half of the Drastic Games team, Julian Trutmann, about the perils of going indie, the passion of creating something you love, and the power of Unreal Engine.
 

 
Drastic Games is a small studio made up of two people who both came from the fold of Epic Games itself. What motivated you to pursue indie development?
 
Over the course of our years at Epic, both of us were lucky enough to be a part of the small initial teams on several projects, such as Fortnite and Paragon. We were repeatedly blown away by what a small, talented, coordinated, and focused team could accomplish in a short amount of time. This got us wondering what we could do with a small team of our own and wanting to explore pushing the limits of small team game development.
  
Soundfall is a fast-paced blending of action/adventure with a rhythm game, unlike anything we’ve seen before. How did you come up with the idea for the game?
 
Our initial plan was to brainstorm and game jam a simple concept that we could execute in six months. Obviously, our plans changed!
 
A few ideas we had floating around included a simple rhythm game and an Ikaruga/Gradius-style space shooter. At the time, we had recently played Audioshield, so the idea of a procedural rhythm game was also fresh in our minds. The music element stuck, and the shooter element evolved to be twin-stick since the versatility would allow us to use the systems we developed in a variety of possible projects. We moved forward with these elements and did a game jam over the course of a long weekend to see if they could mesh together in an interesting way. The result turned out way better than we expected!
 
We knew we had something with incredible potential on our hands, and we didn’t want to waste it on a small quick project. From that game jam, we had the beginnings of what would eventually become Soundfall.

  
Obviously, coming from Epic you have a strong grasp of the Unreal Engine, so what can you say has been your greatest advantage coming into Soundfall with so much experience?
 
Having a lot of experience in Unreal Engine gave us the courage to take on Soundfall’s riskier elements. Audio analysis is a good example. I’m not sure we would have even considered going down that road if we didn’t already know the tools inside and out. Knowing the engine also gave us the confidence to take on other features that we don’t see as often in similar indie games, such as online co-op.
 
We haven’t seen too much of the game just yet, but what we have seen is gorgeous. What’s been your most vital Unreal Engine 4 tool bringing this vibrant world to life?
 
There’s no one tool that takes the cake here. What’s made Unreal Engine 4 so powerful for us is how multifaceted its systems are. If we absolutely had to call out one tool, it’d probably be Blueprints. Basically, anything that reacts to the music in Soundfall is a Blueprint that’s responsible for coordinating some combination of other systems, like particles, materials, and animations. Ultimately, it’s using all these tools in concert that’s responsible for the vitality in our world.

  
What have been the biggest challenges aligning rhythm alongside the fighting mechanics of an action game?
 
Since Soundfall was designed to work with any song, the biggest initial technical challenge was getting the audio analysis up and running – in particular, beat detection. Initially, we spent a while developing some audio-analysis tools ourselves. We then discovered an audio-analysis library called Essentia, which we integrated to get a vast improvement on our beat detection, as well as a lot of other data about each song that we now use for our procedural dungeons and loot.
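Essentia is an open-source audio-analysis library with Python bindings alongside the C++ API that Drastic integrated. For the curious, here is a minimal offline beat-detection sketch using those Python bindings (the filename is a placeholder; this shows what the extractor yields, not Soundfall's integration):

```python
import essentia.standard as es

# Load a song as mono samples and run Essentia's multifeature beat tracker.
audio = es.MonoLoader(filename="song.ogg", sampleRate=44100)()
extractor = es.RhythmExtractor2013(method="multifeature")
bpm, beat_times, confidence, _, bpm_intervals = extractor(audio)

print(f"{bpm:.1f} BPM, {len(beat_times)} beats, confidence {confidence:.2f}")
# beat_times (in seconds) is the kind of data gameplay needs in order to
# schedule "on-beat" events, procedural dungeons, and loot.
```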
 
Switching gears from thinking about all game actions in terms of “seconds” to thinking about them in terms of “beats” was another major technical and design hurdle. Since we typically want actions to begin and/or end “on-beat,” there is never a simple, consistent conversion between seconds and beats. For instance, the number of milliseconds in a “one-beat” delay differs based on whether we’re asking right on the previous beat or halfway to the next beat! This gets even more complicated when we consider tempo changes.
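To make that distinction concrete, beat-relative timing can be computed against the list of detected beat timestamps rather than a fixed period, which also handles tempo changes, since the beats simply get closer together or farther apart. A small illustrative sketch (names invented; this is not Soundfall's code):

```python
import bisect

def seconds_until_nth_beat(now, beat_times, n=1):
    """Seconds from `now` until the nth upcoming beat, given the sorted
    list of beat timestamps produced by audio analysis."""
    i = bisect.bisect_right(beat_times, now)   # first beat strictly after now
    return beat_times[i + n - 1] - now

# At a steady 120 BPM the beat period is 0.5 s, but a "one-beat" delay
# only lasts until the next beat lands:
beats = [i * 0.5 for i in range(20)]
print(seconds_until_nth_beat(2.0, beats))    # asked right on a beat -> 0.5
print(seconds_until_nth_beat(2.25, beats))   # asked halfway there  -> 0.25
```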
 
As far as gameplay goes, just about all of our animations, abilities, and behavior trees needed to be authored in such a way that they are “beat-aware” — any portion that should be punchy or gameplay-relevant needed to occur exactly on-beat, and robust enough to work regardless of BPM and tempo changes.

 
Obviously, a massive component of any rhythm game is the music! Have you guys composed the tracks yourself? Tell us about the creative process involved with bringing the sound alive in Soundfall!
 
We’ve worked very closely with our audio engineer, Jens Kiilstofte, to shape the tone of Soundfall. In addition to all of the game’s foley work, Jens is responsible for the killer track on our trailer.
 
On the music side, we all wanted Soundfall to work with lots of different kinds of music. Even within the team, we all have very divergent tastes in music and we think that half the fun of Soundfall will be seeing how the game reacts to different songs.
 
When it comes to sound effects, striking a balance between musical and impactful has been challenging. If weapons (like Melody’s sword and beat blaster) sound too melodic, they’re often unsatisfying to use. On the flip side, more traditional video game sword and gun sounds don’t really synergize well with the music, or add to the world’s ambiance.         
 
 
From what we’ve seen in Soundfall’s reveal trailer, the world isn’t only stunning but is brimming with life and movement! What are the hurdles that present themselves when adding so many moving pieces to your levels?
 
On top of all the rhythm-based gameplay challenges we talked about before — performance! This has been particularly important to us, since traditionally in both rhythm and twin-stick games, players want the action to feel fast and smooth at all times. In typical games, most objects in the world are static, but in Soundfall, just about every actor in the world is animating or moving to the music. One of our saving graces is that our top-down camera helps give us a reasonable limit to how many of these moving actors are going to be visible at a time, so we can be smart about which actors we need to be ticking, animating, and sending “beat” events to.
 
When a lot of slow operations occur in a single frame of a game, that frame will take longer, causing players to experience a hitch. In normal game development, we often try to distribute expensive tasks over several frames to avoid this as much as possible. Unfortunately for us in Soundfall, having most of our big actions occur on-beat means we end up forced to have a LOT of instances of many expensive operations happening at the same time! The game would be essentially unplayable if it was hitching on every beat when we expect players to perform their most important actions. We’ve had to be very smart about how much we are doing on-beat, and what operations can be moved to occur off-beat, in order to prevent hitching.  
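One way to picture that load-balancing: keep gameplay-critical callbacks pinned to the beat, and drain everything else from a queue only on frames comfortably far from the next beat. The sketch below is a toy illustration of the pattern, not Soundfall's code; the guard window and per-frame budget are invented parameters.

```python
import heapq
import itertools

class BeatScheduler:
    """Toy scheduler: on-beat tasks fire exactly on the beat; expensive,
    non-critical tasks are deferred and drained only off-beat."""

    GUARD = 0.1  # seconds around a beat reserved for on-beat work

    def __init__(self):
        self.on_beat_tasks = []
        self.deferred = []                 # heap of (priority, seq, task)
        self._seq = itertools.count()      # tie-breaker for equal priorities

    def defer(self, priority, task):
        heapq.heappush(self.deferred, (priority, next(self._seq), task))

    def tick(self, now, next_beat, frame_budget=2):
        if abs(next_beat - now) < self.GUARD:
            # On-beat frame: run only the gameplay-critical callbacks.
            for task in self.on_beat_tasks:
                task(now)
        else:
            # Off-beat frame: drain a few deferred tasks within budget.
            for _ in range(min(frame_budget, len(self.deferred))):
                _, _, task = heapq.heappop(self.deferred)
                task(now)
```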

Soundfall still has a long way to go before it’s released, so it’s safe to say you have a lot of development time in front of you. How does Unreal Engine 4 help you streamline and save time on complicated processes?
 
First off, Unreal Engine 4 made getting the initial game prototype up and running very quick. Being able to rapidly answer our question “will a mix of rhythm and top-down action actually be compelling?” was crucial to deciding to go down this path. So many complex systems we needed just work out of the box with Unreal — physics, networking, and navmesh, just to name a few. Blueprints and behavior trees continue to make gameplay iteration very quick and allow us to easily make new music-reactive actors.

 
Based on your experience, what advice would you give to aspiring developers just starting to learn Unreal Engine 4?
 
Start very small, learning one system at a time and by modifying existing examples. Re-creating an existing simple game, an 80s arcade game perhaps, is a great way to learn and will help anyone gain an understanding of how every system and discipline work together. Definitely don’t dive straight into trying to make a 100-player shooter or MMO!
 
Where are all the places people can go to stay up-to-date on Drastic Games and Soundfall?
 
People can check out more info about Soundfall or sign up for our newsletter at www.soundfallgame.com.

We’re also currently running a crowdfunding campaign on Fig, where people can pledge or invest to become more involved with development and share in our future success!

We also post a lot on social media:

Buddy VR Pioneers A New Genre of Interactive Animation

When it comes to using animation for marketing and brand engagement, many VR film projects currently on the market focus on providing an immersive one-off experience to captivate viewers in the moment. For global VFX and animation studio Redrover, replayability is an essential ingredient rather than a mere afterthought, and the studio is exploring fresh ways to engage viewers on a deeper level by combining story, gameplay, and greater interactivity.
  
Buddy VR – the team’s VR film spinoff of its Hollywood animated blockbuster, The Nut Job – took home the Best VR Experience Award at the Venice International Film Festival this fall. The project is part of Redrover’s vision to create a new genre of interactive animation, and what makes Buddy VR especially unique is the way it bridges the gap between animated short films and video game experiences.
 

A virtual interactive friendship

Starring “Buddy,” the loveable blue rat from The Nut Job, this vibrant interactive animation short has you meeting and befriending the little critter in a whimsical story that balances plot and gameplay elements. “We wanted to lead the story through intimacy between the player and character,” explains Chuck Chae, Director for Buddy VR. 

Players get to know Buddy through a series of non-verbal interactions like exchanging names, petting, playing musical instruments, and more. It’s a humorous, heartwarming 16-minute interactive experience, and the response from those who have played it is overwhelmingly positive, he adds.

“Simply experiencing VR offers the player an extraordinary experience, and provides deep immersion while wearing VR equipment. However, many VR titles on the market are difficult to enjoy again once they have been played through the first time,” says Chae. “Our goal is to break away from this approach and produce titles that can maintain their replayability throughout lengthy and multiple playthroughs by combining Redrover’s IP and VR technology with interactive elements.”

Optimizing creative potential with Unreal Engine

For this project, overcoming the challenge of creating cohesive story interaction through speechless communication required the team to weave extra layers of detail and nuance into Buddy’s facial expressions, physical actions, and eye movements. Using Unreal Engine gave the team the tools and additional programming flexibility to craft a level of real-time interactivity and realism that could foster a believable relationship-building experience between players and the furry protagonist, says Chae.

“High-quality graphics and animations are essential for creating speechless interaction, which is the highlight of our product. It was amazing enough that Unreal Engine easily fulfilled our graphical standards, but it also had unbelievable real-time functionalities, allowing us to apply desired animations, edit unnatural or incorrect aspects, and then reapply to view the results all in one sitting,” says Chae, adding that the team was able to minimize production time using real-time rendering.

Optimizing their production workflows using real-time rendering also helped free up more of the team’s time and energy for creativity. “The greatest strengths of Unreal Engine are the ability to quickly make a prototype using codeless Blueprints and the ability to create high-quality graphic productions through real-time rendering,” he says. “By minimizing the workflow of realizing the designs and animations in your head to an actual render, there can be more time to concentrate on the creative aspects.” 

Ready to get started with Unreal Engine and Unreal Studio to enhance your creativity today? Download them for free right here.

Save Big with the Unreal Engine Marketplace Fall Sale!

We are excited to announce the Unreal Engine Marketplace Fall Sale, with over 3,100 items discounted by up to 90%. 

You can start shopping at 12:00pm EST today by heading right here. The Fall Sale runs for just over one week and will end on November …