Ncam helps deliver seamless integration of live action and real-time 3D graphics in UE4

Although combining live action with real-time game engine elements in Unreal Engine can be straightforward, complex virtual production often demands more data. For high-end projects, it is important not only to layer the live action with keying, but also to have access to detailed information about the exact camera position, the lens, and a depth map of the scene.
Ncam offers a complete and customizable platform that enables virtual sets and graphics in real time. At its core is the company's camera tracking solution, which drives its virtual and augmented graphics technology using a specialized camera hardware add-on and sophisticated software. The system uses a lightweight sensor bar attached to the camera to track natural features in the environment, allowing the camera to move freely anywhere while generating a continuous stream of extremely precise positional, rotational, and lens information. This feeds into a UE4 plugin via Ncam's powerful SDK.
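Ncam's SDK interface itself isn't reproduced here, but a minimal sketch helps picture the kind of per-frame packet such a stream delivers to an engine plugin. Everything in the snippet below, including the struct name, fields, and units, is an assumption for illustration rather than the real API:

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical per-frame tracking sample; field names and units are
// assumptions for illustration, not the actual Ncam SDK or UE4 plugin API.
struct TrackingSample
{
    uint32_t frame;          // frame counter locked to production-camera timecode
    double   positionM[3];   // camera position in meters (X, Y, Z)
    double   rotationDeg[3]; // camera orientation in degrees (pan, tilt, roll)
    double   focalLengthMm;  // effective focal length reported by the lens encoders
    double   focusDistanceM; // focus distance from the follow-focus unit
    double   irisTStop;      // aperture
};

// A plugin would typically apply one such sample to the virtual camera
// just before each engine frame is rendered.
void ApplyToVirtualCamera(const TrackingSample& s)
{
    std::printf("frame %u: pos (%.2f, %.2f, %.2f) m, focal %.1f mm\n",
                s.frame, s.positionM[0], s.positionM[1], s.positionM[2],
                s.focalLengthMm);
}

int main()
{
    TrackingSample s{1001u, {1.2, 1.5, 0.3}, {15.0, -3.0, 0.0}, 32.0, 4.5, 2.8};
    ApplyToVirtualCamera(s);
    return 0;
}
```

In a real setup, the sample rate, coordinate conventions, and lens fields would all come from the vendor's documentation.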
The system suits any type of production, indoors or outdoors, on wire-mounted rigs or in hand-held camera configurations. Ncam's products are used worldwide and have featured in the production of Aquaman (Warner Bros.), Solo: A Star Wars Story (Walt Disney Studios), Deadpool 2 (Marvel), Game of Thrones Season 8 (HBO), Super Bowl LIII (CBS), the UEFA Champions League (BT Sport), the NFC Championship Game (Fox Sports), and Monday Night Football (ESPN).

Specialized hardware for accurate tracking

At its core, Ncam relies on a specialized piece of camera-mounted hardware. This small, lightweight sensor bar combines a suite of sensors. Most visible are the two stereo computer vision cameras; less obvious are the 12 additional sensors inside the Ncam hardware bar, including accelerometers and gyroscopes. Together with the stereo camera pair, these let Ncam see the set in full spatial depth, producing a real-time 3D point cloud.
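That point cloud ultimately rests on textbook stereo geometry: once the two vision cameras are calibrated and their images rectified, the depth of a matched feature follows from its disparity. The sketch below shows only that generic relationship, with invented numbers, and is not Ncam's implementation:

```cpp
#include <cstdio>

// Standard rectified-stereo relationship: depth = focal length * baseline / disparity.
// Generic computer vision math, not Ncam's implementation; the numbers are made up.
double DepthFromDisparity(double focalLengthPx, double baselineM, double disparityPx)
{
    if (disparityPx <= 0.0)
        return -1.0; // feature unmatched or effectively at infinity
    return (focalLengthPx * baselineM) / disparityPx;
}

int main()
{
    // e.g. an 800 px focal length, a 20 cm baseline, and a 16 px disparity
    // place the tracked feature 10 meters from the sensor bar.
    std::printf("depth = %.2f m\n", DepthFromDisparity(800.0, 0.20, 16.0));
    return 0;
}
```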
The same hardware unit also interfaces with the various controls on the lens, such as a Preston follow-focus control, which means Ncam knows where the lens is, what it is looking at, what the focus and field of view are, and, importantly, where everything in front of the lens is located. The props, set, actors, camera, and lensing are all mapped and understood in real time. It is an extraordinary way to make the UE4 engine aware of the real world and integrate live graphical elements, characters, and sets into one seamless production as you watch.

Gathering data for predictive movement
    
From the outset, Ncam has relied on a fusion of techniques, including visual tracking, odometry, and inertial navigation, to solve the problem of camera tracking. The system does more than gather data, however: the software uses that data for predictive movement and robust redundancy. It knows where the camera was and where it thinks it is going, and it handles any loss of useful signal from the cameras. If the actor blocks one of the stereo lenses, or even both, the system continues uninterrupted based on the remaining aggregate sensor data.
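As a rough illustration of that fallback idea, and not Ncam's actual filter, a tracker can carry the camera forward on its last known motion when optical data briefly drops out. The constant-velocity stand-in below uses hypothetical types:

```cpp
#include <cstdio>

// Hypothetical constant-velocity fallback, illustrating the idea of carrying
// the camera forward when optical tracking briefly drops out. This is a
// stand-in for whatever filter Ncam actually uses, which is not public.
struct CameraState
{
    double positionM[3];   // meters
    double velocityMps[3]; // meters per second, e.g. estimated from inertial data
};

// Predict where the camera will be dtSeconds after the last good sample.
CameraState Predict(const CameraState& last, double dtSeconds)
{
    CameraState next = last;
    for (int i = 0; i < 3; ++i)
        next.positionM[i] += last.velocityMps[i] * dtSeconds;
    return next;
}

int main()
{
    CameraState last{{0.0, 1.6, 0.0}, {0.5, 0.0, 0.0}}; // dollying right at 0.5 m/s
    CameraState guess = Predict(last, 1.0 / 24.0);      // one film frame later
    std::printf("predicted X: %.4f m\n", guess.positionM[0]);
    return 0;
}
```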

The software integrates all this data into a single useful input to UE4. For example, while the computer vision cameras can run at up to 120 fps, the other sensors run at 250 fps, so the various data streams are retimed and normalized into one coherent, stable output clocked to the timecode of the primary production camera.
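A minimal sketch of the retiming step, under the assumption of simple linear interpolation between whichever fast-rate samples bracket the production camera's frame time (the real pipeline is certainly more sophisticated):

```cpp
#include <cstdio>

// Illustrative retiming: two samples from a fast sensor stream bracket the
// production camera's frame time, and the value at that time is interpolated.
// The types and the use of plain linear interpolation are assumptions.
struct TimedValue
{
    double timeS; // sample timestamp in seconds
    double value; // one tracked channel, e.g. camera X position in meters
};

double SampleAt(const TimedValue& before, const TimedValue& after, double frameTimeS)
{
    const double span = after.timeS - before.timeS;
    if (span <= 0.0)
        return after.value;
    const double t = (frameTimeS - before.timeS) / span;
    return before.value + (after.value - before.value) * t;
}

int main()
{
    TimedValue a{0.000, 1.00};                    // one 250 fps sensor sample
    TimedValue b{0.004, 1.02};                    // the next sample, 4 ms later
    std::printf("%.3f\n", SampleAt(a, b, 0.002)); // camera frame time between them
    return 0;
}
```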

Some sets have very challenging lighting, so Ncam offers an option to run the cameras in infrared mode for strobing or flashing-light scenes. The system is also designed for low latency, so a camera operator can watch the composited output of the live action and the UE4 graphics as a combined shot, for much more accurate framing and blocking. It is much easier to line up the shot of the knight and the dragon if you can see the whole scene and not just a guy in armor alone on a green soundstage.

Precise lens calibration and matching

The camera tracking resolves to six degrees of freedom: XYZ position plus three axes of rotation. Added to this is the production camera's lens data. In addition to focus, iris, and zoom, Ncam has to know the correct lens curvature, or distortion, across all possible zoom, focus, and iris adjustments for the UE4 graphics to match the live action perfectly. Any wide lens clearly bends the image, producing curved lines that would otherwise be straight in the real world. The real-time graphics have to match this frame by frame, so lens properties are mapped on a per-serial-number basis. Every lens is different, so while a production may start with a template of, say, a Cooke 32mm S4/i lens, Ncam provides a lens calibration system that compensates for individual variations.
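To make distortion matching concrete, here is a generic radial-distortion sketch in the widely used Brown-Conrady style; the coefficients are invented and the model is not claimed to be the one Ncam ships:

```cpp
#include <cstdio>

// Generic radial (Brown-Conrady style) distortion applied to a normalized
// image coordinate. The k1/k2 values below are made up for illustration.
void Distort(double xNorm, double yNorm, double k1, double k2,
             double& xOut, double& yOut)
{
    const double r2 = xNorm * xNorm + yNorm * yNorm;
    const double scale = 1.0 + k1 * r2 + k2 * r2 * r2;
    xOut = xNorm * scale;
    yOut = yNorm * scale;
}

int main()
{
    double xd, yd;
    Distort(0.5, 0.3, -0.12, 0.01, xd, yd); // negative k1: barrel distortion pulls points inward
    std::printf("(0.50, 0.30) -> (%.3f, %.3f)\n", xd, yd);
    return 0;
}
```

Because coefficients like these shift with zoom and focus, and differ between two physically identical lens models, they have to be measured per lens rather than assumed from a datasheet.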

Ncam is compatible with systems such as ARRI's Lens Data System (LDS), but those systems typically don't provide image distortion over the entire optical range of the lens. At the start of a project, productions can calibrate their own lenses with Ncam's proprietary system of charts and tools to map each lens's distortion and pincushioning, and then simply reference them by serial number.
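A hypothetical sketch of what referencing a calibration by serial number could look like: measured values stored per physical lens and interpolated across focus marks. The data layout, field names, and the example serial number are all made up:

```cpp
#include <cstddef>
#include <cstdio>
#include <map>
#include <string>
#include <vector>

// One measured point in a per-lens calibration table (illustrative only).
struct FocusMark
{
    double focusM; // focus distance in meters at which distortion was measured
    double k1;     // first radial distortion coefficient at that mark
};

// Per-lens tables keyed by serial number; each table sorted by focusM and non-empty.
std::map<std::string, std::vector<FocusMark>> lensTables;

double LookUpK1(const std::string& serial, double focusM)
{
    const std::vector<FocusMark>& marks = lensTables.at(serial);
    if (focusM <= marks.front().focusM) return marks.front().k1;
    for (std::size_t i = 1; i < marks.size(); ++i)
    {
        if (focusM <= marks[i].focusM)
        {
            const double t = (focusM - marks[i - 1].focusM) /
                             (marks[i].focusM - marks[i - 1].focusM);
            return marks[i - 1].k1 + t * (marks[i].k1 - marks[i - 1].k1);
        }
    }
    return marks.back().k1;
}

int main()
{
    // Example table for one hypothetical lens serial number.
    lensTables["COOKE-S4-32-0417"] = {{0.5, -0.140}, {2.0, -0.120}, {10.0, -0.110}};
    std::printf("k1 at 1.25 m focus: %.4f\n", LookUpK1("COOKE-S4-32-0417", 1.25));
    return 0;
}
```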

In the end, the system produces stable, smooth, accurate information that can perfectly align real-time graphics with live-action material. Ncam founder Nic Hatch explains, “We spent a lot of time working to fuse the various technologies of all those different sensors, I guess that’s sort of our secret sauce and why it works so well.” 

Integrating CG elements with Real Depth

The other huge benefit of Ncam is depth understanding. When elements are combined in UE4, the engine knows where the live action sits relative to the UE4 camera, thanks to Ncam's "Real Depth". This allows someone to be filmed walking in front of or behind UE4 graphical elements or virtual sets. Without the depth information, any video can only sit like a flat card in UE4. With Ncam, as the actor walks forward on set, they walk forward in UE4, passing objects at the correct distance. This adds enormous production value and integrates the live action and real-time graphics in a dramatically more believable way. This one feature transforms how Ncam can be used in motion graphics, explanatory news sequences, and narrative sequences.
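The occlusion logic this enables boils down to a per-pixel depth comparison. The sketch below shows only that generic principle with made-up types; a production composite also deals with alpha, soft edges, and depth noise:

```cpp
#include <cstdio>

// Per-pixel occlusion, the core idea behind depth-aware compositing: whichever
// source is closer to the camera wins. This is the generic principle with
// invented types, not Ncam's Real Depth pipeline.
struct Rgba { float r, g, b, a; };

Rgba CompositePixel(const Rgba& liveAction, float liveDepthM,
                    const Rgba& cg,         float cgDepthM)
{
    // The nearer surface occludes the farther one.
    return (liveDepthM <= cgDepthM) ? liveAction : cg;
}

int main()
{
    Rgba actor{0.8f, 0.6f, 0.5f, 1.0f};  // live-action pixel 3 m from camera
    Rgba dragon{0.1f, 0.7f, 0.2f, 1.0f}; // CG pixel 5 m from camera
    Rgba out = CompositePixel(actor, 3.0f, dragon, 5.0f);
    std::printf("winning red channel: %.1f\n", out.r); // the actor is in front
    return 0;
}
```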
“Game engine integration has always been very important to us,” says Hatch. “At a tradeshow in 2016 we showed I think the first prototype of this kind of live action integrated with the Unreal Engine, so we have a pretty close relationship.” The company has doubled its staff in the last year, and the biggest proportion of Ncam’s staff is involved in R&D. A key part of their development effort is building APIs and links into software such as UE4 for the most efficient virtual production pipeline. “The complexity of what we are doing requires a large amount of R&D,” he adds.

Advanced real-world lighting matching with Real Light

While the primary focus has been on Ncam understanding the space in front of the camera and what the camera is doing, the company also has an advanced tool for understanding the lighting of the scene. Its “Real Light” project allows a live light probe in the scene to inform the UE4 engine of changing light levels and directions.

Real Light is designed to solve the challenge of making virtual production assets look like they are part of the real-world scene. It captures real-world lighting in terms of direction, color, intensity, and HDR maps, allowing the Unreal Engine to adapt to each and every lighting change. Importantly, it also understands the depth and position of the light sources in the scene, so the two worlds interact correctly. Digital assets can both fit technically and look correctly lit, which is a major advance in live-action game asset integration.
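One simple, hypothetical way a captured HDR probe could drive an engine light is to estimate a dominant direction and color by luminance-weighted averaging; Real Light's actual analysis, including its positional understanding of sources, is not shown here:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Illustrative probe analysis: luminance-weighted average of environment
// samples yields a rough dominant light direction and an average color.
// Types, weights, and the approach itself are assumptions for the example.
struct ProbeSample
{
    double direction[3]; // unit vector toward this part of the environment
    double rgb[3];       // linear HDR radiance
};

void EstimateDominantLight(const std::vector<ProbeSample>& samples,
                           double outDirection[3], double outColor[3])
{
    double dir[3] = {0, 0, 0}, color[3] = {0, 0, 0}, totalLuma = 0.0;
    for (const ProbeSample& s : samples)
    {
        const double luma = 0.2126 * s.rgb[0] + 0.7152 * s.rgb[1] + 0.0722 * s.rgb[2];
        for (int i = 0; i < 3; ++i)
        {
            dir[i]   += s.direction[i] * luma;
            color[i] += s.rgb[i];
        }
        totalLuma += luma;
    }
    const double len = std::sqrt(dir[0] * dir[0] + dir[1] * dir[1] + dir[2] * dir[2]);
    for (int i = 0; i < 3; ++i)
    {
        outDirection[i] = (len > 0.0) ? dir[i] / len : 0.0;
        outColor[i]     = (totalLuma > 0.0) ? color[i] / samples.size() : 0.0;
    }
}

int main()
{
    std::vector<ProbeSample> probe = {
        {{0.0, 1.0, 0.0}, {5.0, 5.0, 5.0}}, // bright overhead source
        {{1.0, 0.0, 0.0}, {0.5, 0.5, 0.5}}  // dim side fill
    };
    double dirOut[3], colorOut[3];
    EstimateDominantLight(probe, dirOut, colorOut);
    std::printf("dominant direction: (%.2f, %.2f, %.2f)\n", dirOut[0], dirOut[1], dirOut[2]);
    return 0;
}
```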

Interested in finding out about more new technology, techniques, and best practices that are changing the game for on-set production? Head on over to our Virtual Production hub, or check out our other posts relating to broadcast.
 

Virtual Production: How game engines are taking previs to a new level

Shooting live-action movies is costly. Having to reshoot them because the story doesn’t quite work the first time can be catastrophic to budgets and schedules. Not reshooting them and putting up with a final product that doesn’t live up to expectations…

A Look at VR in Medical and Nursing Student Training

AUTHOR: Steven Anbro

Virtual reality has enormous potential for both current and future research applications in science and medicine. The social sciences in particular stand to benefit from using virtual reality technology to conduct more accurate research, better measure specific phenomena, and collect valid, reportable data—all common challenges in various disciplines in the field. We in the Behavior Analysis Program


Breaking With Tradition: How adidas Improved Their Workflow With VR

It begins in a place like this, but not like this. [Image: a virtual simulation of the auditorium, generated by The Wild in VIVE VR.] It starts off in an auditorium, much like the above. Teams trickle into forward-facing seats, homed in on a soon-to-be-occupied podium and stage with a projection looming overhead. On the periphery: a barrage of posters


Technology Sneak Peek: Advances in Real-Time Ray Tracing

At SIGGRAPH 2018, we saw another advance toward real-time ray tracing with Unreal Engine. A joint presentation by Epic Games, NVIDIA, and Porsche resulted in “Speed of Light”, a real-time cinematic showcasing the current state of this developing techno…

Buddy VR Pioneers A New Genre of Interactive Animation

When it comes to using animations for marketing and brand engagement, many VR film projects currently on the market focus on providing an immersive one-off experience to captivate viewers in the moment. Rather than a mere afterthought, replayability is an essential ingredient for global VFX and animation studio Redrover, which is exploring fresh ways to engage viewers on a deeper level by combining story, gameplay, and greater interactivity.
  
Buddy VR – the team’s VR film spinoff of its Hollywood animated blockbuster, The Nut Job – recently took home the Best VR Experience Award at the Venice International Film Festival this fall. The project is part of Redrover’s vision to create a new genre of interactive animation, and what makes Buddy VR especially unique is the way it bridges the gap between animated short films and video game experiences.
 

A virtual interactive friendship

Starring “Buddy,” the loveable blue rat from The Nut Job, this vibrant interactive animation short has you meeting and befriending the little critter in a whimsical story that balances plot and gameplay elements. “We wanted to lead the story through intimacy between the player and character,” explains Chuck Chae, Director for Buddy VR. 

Players get to know Buddy through a series of non-verbal interactions like exchanging names, petting, playing musical instruments, and more. It’s a humorous, heartwarming 16-minute interactive experience, and the response from those who have played it is overwhelmingly positive, he adds.

“Simply experiencing VR offers the player an extraordinary experience, and provides deep immersion while wearing VR equipment. However, many VR titles on the market are difficult to enjoy again once they have been played through the first time,” says Chae. “Our goal is to break away from this approach and produce titles that can maintain their replayability throughout lengthy and multiple playthroughs by combining Redrover’s IP and VR technology with interactive elements.”

Optimizing creative potential with Unreal Engine

For this project, overcoming the challenge of creating cohesive story interaction through speechless communication required the team to weave extra layers of detail and nuance into Buddy’s facial expressions, physical actions, and eye movements. Using Unreal Engine gave the team the tools and additional programming flexibility to craft a level of real-time interactivity and realism that could foster a believable relationship-building experience between players and the furry protagonist, says Chae.

“High-quality graphics and animations are essential for creating speechless interaction, which is the highlight of our product. It was amazing enough that Unreal Engine easily fulfilled our graphical standards, but it also had unbelievable real-time functionalities, allowing us to apply desired animations, edit unnatural or incorrect aspects, and then reapply to view the results all in one sitting,” says Chae, adding that the team was able to minimize production time using real-time rendering.

Optimizing their production workflows using real-time rendering also helped free up more of the team’s time and energy for creativity. “The greatest strengths of Unreal Engine are the ability to quickly make a prototype using codeless Blueprints and the ability to create high-quality graphic productions through real-time rendering,” he says. “By minimizing the workflow of realizing the designs and animations in your head to an actual render, there can be more time to concentrate on the creative aspects.” 

Ready to get started with Unreal Engine and Unreal Studio to enhance your creativity today? Download them for free right here.

Upgrading Quality within a Fast Turnaround: “ICI Laflaque” Gets an Unreal Facelift

Vox Populi Productions, the team behind award-winning political satire show ICI Laflaque, is no stranger to breakneck speeds in production. The half-hour animated show, much of it newly produced each week based on recent news, has had a 7-day turnaroun…

Unreal Engine Wins Technology & Engineering Emmy® for Animation Production

The National Academy of Television Arts and Sciences has awarded Epic Games with the first Technology and Engineering Emmy for Unreal Engine in the 2017-2018 category, “3D Engine Software for the Production of Animation.” We couldn’t be more thrilled w…

Architectural Design Data: The Journey from Revit to Unreal Studio Webinar

Today’s architectural designers are turning to bold new approaches for crafting compelling design visualizations. With clients wanting more interactivity and flexible options for design review, it makes sense that real-time solutions are becoming the f…

HTC VIVE Announces Fourth Batch of Companies Selected for Vive X Accelerator Program

VIVE X, HTC VIVE’s global AR/VR accelerator, today announced the next group of companies selected to join the prominent program. Eighteen new start-ups from across the world will join Vive X programs in the San Francisco, London, Taipei, Shenzhen, Beijing, and Tel Aviv offices. Vive X has proven to be one of the most consistent and active investors in
