AI in the Fast Lane: Automotive Tech Accelerates at GTC 2019

From start to finish, this week’s GPU Technology Conference was buzzing with automotive innovation. Sleek vehicles greeted attendees at the front door of the San Jose Convention Center. Autonomous vehicles lined up for potential future passengers inside the exhibit hall. And self-driving cars seamlessly maneuvered around pedestrians and obstacles on the roads behind the conference…

Shift Happens: Virtual Ride and DRIVE Gears Up GTC Attendees

The AI car of tomorrow opened its doors for a virtual spin this week at the NVIDIA GPU Technology Conference. The DRIVE AutoPilot simulator experience on the GTC show floor combines the latest NVIDIA DRIVE technologies — DRIVE AP2X, DRIVE Constellation and DRIVE Sim — to show how AI is transforming the way we move.

New Ability to Update HD Maps Lets DRIVE Mapping Chart Safer Course for Autonomous Driving

Sensors and onboard computers help self-driving cars see, plan and act. With NVIDIA DRIVE Mapping software, they can also make sure vehicles stay on the right path. DRIVE Mapping allows vehicles to navigate anywhere in the world. It uses maps from partners such as Baidu, HERE, TomTom, NavInfo and Zenrin to localize vehicles to high-definition…
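
The excerpt does not detail how DRIVE Mapping localizes a vehicle, but the general idea behind localizing against a high-definition map can be pictured as matching landmarks the car’s sensors detect (lane markings, signs, poles) to the same features stored in the map, then nudging the vehicle’s rough pose estimate until they line up. The Python sketch below is a hypothetical, heavily simplified illustration of that idea only; every name in it is invented, and it does not reflect NVIDIA’s implementation.

```python
# Hypothetical sketch: correcting a rough pose by matching observed
# landmarks against an HD map. Illustration only; not NVIDIA code.
from dataclasses import dataclass
import math

@dataclass
class Pose:
    x: float        # metres east of the map origin
    y: float        # metres north of the map origin
    heading: float  # radians, 0 = east, counter-clockwise

def to_map_frame(pose: Pose, dx: float, dy: float) -> tuple[float, float]:
    """Transform an observation made in the vehicle frame into map coordinates."""
    c, s = math.cos(pose.heading), math.sin(pose.heading)
    return pose.x + c * dx - s * dy, pose.y + s * dx + c * dy

def refine_pose(rough: Pose, observations, map_landmarks) -> Pose:
    """Shift the rough pose so observed landmarks line up with the mapped ones."""
    ex = ey = 0.0
    for dx, dy in observations:
        ox, oy = to_map_frame(rough, dx, dy)
        # Nearest-neighbour data association against the HD map.
        mx, my = min(map_landmarks, key=lambda m: (m[0] - ox) ** 2 + (m[1] - oy) ** 2)
        ex += mx - ox
        ey += my - oy
    n = max(len(observations), 1)
    return Pose(rough.x + ex / n, rough.y + ey / n, rough.heading)

# Toy usage: GPS says (10.0, 5.0); two lane-marker detections pull the estimate 0.4 m east.
hd_map = [(12.4, 5.0), (15.4, 5.0)]
detections = [(2.0, 0.0), (5.0, 0.0)]   # offsets in the vehicle frame
print(refine_pose(Pose(10.0, 5.0, 0.0), detections, hd_map))
```

A production system would match many feature types and solve for rotation as well as translation; the sketch keeps only a translation correction to stay short.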

The Test Fleet of the Future Is Virtual: DRIVE Constellation Now Available

NVIDIA DRIVE Constellation is bringing autonomous vehicle test fleets to the cloud. At the GPU Technology Conference, NVIDIA founder and CEO Jensen Huang announced that the NVIDIA DRIVE Constellation simulation platform is now available. Toyota’s research and development arm, the Toyota Research Institute-Advanced Development (TRI-AD), will leverage DRIVE Constellation as part of an expanded partnership…
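
The announcement describes a cloud-based simulation platform rather than an algorithm, but the underlying pattern, a simulator feeding sensor data to the driving software and consuming its commands in a closed loop, is easy to sketch. The toy Python loop below is a conceptual illustration only; the functions and thresholds are invented and have nothing to do with the actual DRIVE Constellation architecture or API.

```python
# Hypothetical sketch of a closed-loop ("X-in-the-loop") test: a simulator feeds
# sensor data to the driving stack, whose commands drive the next simulation step.
from dataclasses import dataclass

@dataclass
class WorldState:
    ego_pos: float = 0.0      # metres along a straight road
    ego_speed: float = 20.0   # m/s
    lead_pos: float = 80.0    # stopped vehicle ahead

def simulate_sensors(world: WorldState) -> dict:
    """Stand-in for the sensor-simulation side: report the gap to the lead car."""
    return {"gap_m": world.lead_pos - world.ego_pos, "speed_mps": world.ego_speed}

def driving_stack(sensors: dict) -> float:
    """Stand-in for the AV software under test: brake when the time gap gets short."""
    time_gap = sensors["gap_m"] / max(sensors["speed_mps"], 0.1)
    return -6.0 if time_gap < 3.0 else 0.0   # commanded acceleration, m/s^2

def step(world: WorldState, accel: float, dt: float = 0.05) -> WorldState:
    speed = max(world.ego_speed + accel * dt, 0.0)
    return WorldState(world.ego_pos + speed * dt, speed, world.lead_pos)

world = WorldState()
for _ in range(400):   # 20 simulated seconds
    world = step(world, driving_stack(simulate_sensors(world)))
assert world.ego_pos < world.lead_pos, "collision in simulation"
print(f"stopped {world.lead_pos - world.ego_pos:.1f} m behind the lead vehicle")
```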

NVIDIA Announces DRIVE AP2X – World’s Most Complete Level 2+ Autonomous Vehicle Platform

Today at the GPU Technology Conference, NVIDIA founder and CEO Jensen Huang announced NVIDIA DRIVE AP2X — a complete Level 2+ automated driving solution encompassing DRIVE AutoPilot software, DRIVE AGX and DRIVE validation tools. DRIVE AP2X incorporates DRIVE AV autonomous driving software and DRIVE IX intelligent cockpit experience. Each runs on the high-performance, energy-efficient NVIDIA…

Real-time manufacturing and the future of automotive sales

Here at Epic Games, we’ve seen first-hand how real-time technology is revolutionizing the field of manufacturing through intuitive design and workflows. The enthusiastic participants at events like Build: Munich ’18 for Automotive have told us how they use real-time processes now (and how they plan to use them in the future), but we wanted to dig deeper to get a pulse on what’s happening in the industry as a whole.

To that end, we commissioned leading global analyst Forrester Consulting to conduct an independent study of real-time engine technology decision makers, 35% of whom identified themselves as coming from the manufacturing, automotive, and aerospace sectors. We’ve gathered their feedback into a new report focused on these sectors, which reveals some interesting trends for these fields.

According to Forrester’s survey results, the overwhelming majority of respondents (94%) feel that real-time rendering is vital to reducing design errors, while 92% feel that immersive technology is important to the design process.

For many companies, the benefits go beyond better design: real-time technology also makes fiscal sense. For 81% of firms, a key point is that real-time rendering drives productivity and improves their bottom line.

In the automotive space, the report highlights some of the ways that Audi, ever mindful of advancement through technology, is using real-time rendering with Unreal Engine to give their customers a smooth, enjoyable shopping experience. To that end, they’ve developed a product visualization system for dealerships to show customers their desired car with all the chosen options, even if the vehicle isn’t physically there. The system is currently deployed at more than 1,000 Audi showrooms worldwide, with more to come.

“We want to raise the bar,” says Thomas Zuchtriegel of Audi Business Innovation GmbH. “Beyond the digital retail use case, we intend to use Unreal Engine within our website for the car configurator—to enable the user to see each car from any perspective, and to delight them with an amazing interactive experience.”
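
The post does not explain how Audi’s system is built, but the heart of any configurator of this kind is a data model of options and compatibility rules that tells the real-time visualization exactly which variant to display. The sketch below is a hypothetical, minimal illustration of that idea in Python; the option names and rules are invented and are not Audi’s or Epic’s.

```python
# Hypothetical sketch of a configurator data model: options plus compatibility
# rules that decide what the real-time visualization should display.
from dataclasses import dataclass, field

@dataclass
class Configuration:
    model: str
    options: set[str] = field(default_factory=set)

# Each entry maps an option to the options it cannot be combined with.
CONFLICTS = {
    "panoramic_roof": {"roof_rails"},
    "sport_suspension": {"comfort_suspension"},
}

def add_option(config: Configuration, option: str) -> Configuration:
    """Add an option, rejecting combinations the rules forbid."""
    blocked = CONFLICTS.get(option, set()) & config.options
    if blocked:
        raise ValueError(f"{option} conflicts with {', '.join(sorted(blocked))}")
    config.options.add(option)
    return config

car = Configuration(model="example_wagon")   # invented model name
add_option(car, "sport_suspension")
add_option(car, "panoramic_roof")
print(sorted(car.options))   # the scene loader would show exactly these variants
```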

You can read the full report, Pushing the Boundaries of Possibility in Manufacturing, on our website.

Wondering what real-time technology can do for your manufacturing pipeline? Download the free beta of Unreal Studio, which includes tools for importing and organizing CAD data in Unreal Engine.

GTC 2019 Sets the Pace for Self-Driving Innovation

With the development of autonomous driving technology moving at the speed of light, it can be hard to stay current on what’s at the cutting edge. But at the GPU Technology Conference, you’ll find the latest in AI technology at every turn, making it the crucial pit stop for anyone interested in the future of…

Chaos Group unveils V-Ray for Unreal

As one of the world’s most popular physically-based renderers, V-Ray is used daily by top design studios, architectural firms, advertising agencies, and visual effects companies around the globe. Chaos Group also noticed the rapid adoption of Unreal Engine by many of these firms to create interactive or immersive experiences, so producing V-Ray for Unreal was a logical next step. The software had been in beta since March, and the full product launched one week before Autodesk University 2018.

“We’re always listening to our customers, and we’re fortunate in that we get to work with creatives in multiple industries,” says Chaos Group’s Communications Director David Tracy, who unveiled the software publicly at the show. “Whether it’s architectural visualization, automotive, or visual effects, every industry has its own challenges. The one common need for any artist or designer, though, is a smooth workflow and reliable results across their entire toolset. That, and excellent results.”

In this case, that smooth workflow means being able to easily repurpose V-Ray scenes created in 3ds Max, Maya, Rhino, or SketchUp in Unreal Engine, without needing to learn new rendering paradigms. With V-Ray for Unreal, lights and materials are automatically converted into their real-time equivalents for UE workflows, but they maintain a smart connection to the originals—so you can continue to create full-quality ray-traced renders directly from the Unreal Editor with the same content. “With V-Ray for Unreal we wanted to create the fastest, simplest way to bring V-Ray scenes into a real-time environment, and give artists the ability to render V-Ray ray-traced images directly from Unreal,” says Tracy. “Now, artists can achieve great-looking real time and great-looking physically-based renders with a workflow that they already know.” 
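
The plugin’s internals are not shown in the post, but the “smart connection” it describes can be pictured as a converted real-time asset that keeps a reference back to its source V-Ray definition, so the interactive preview and the offline ray-traced render always derive from the same content. The Python sketch below is purely conceptual; the class and field names are invented and do not reflect the V-Ray for Unreal API.

```python
# Conceptual sketch only: a converted real-time material that remembers its
# source V-Ray definition, so offline renders still use the original data.
from dataclasses import dataclass

@dataclass(frozen=True)
class VRayMaterial:
    name: str
    diffuse: tuple[float, float, float]
    reflection_glossiness: float     # 1.0 = mirror-sharp, in V-Ray terms

@dataclass(frozen=True)
class RealtimeMaterial:
    name: str
    base_color: tuple[float, float, float]
    roughness: float
    source: VRayMaterial             # the "smart connection" back to the original

def convert(material: VRayMaterial) -> RealtimeMaterial:
    """Approximate the V-Ray material with real-time PBR parameters."""
    return RealtimeMaterial(
        name=material.name,
        base_color=material.diffuse,
        roughness=round(1.0 - material.reflection_glossiness, 3),
        source=material,
    )

paint = VRayMaterial("CarPaint_Red", (0.62, 0.04, 0.05), 0.92)
rt = convert(paint)
print(rt.roughness)      # drives the real-time preview
print(rt.source.name)    # the ray-traced render still reads the original
```

Keeping the source object alongside the converted parameters is one simple way to guarantee the two render paths never drift apart, which is the design goal the quote describes.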

Importantly, the software also introduces V-Ray Light Baking, enabling artists and designers to bake V-Ray lights (including IES) directly into Unreal with full GPU acceleration, for the highest-quality real-time illumination. This ensures that the lighting in the V-Ray rendering is well matched to the real-time experience in Unreal Engine.
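
“Baking” here means precomputing lighting into stored data ahead of time so that no lights need to be evaluated at runtime. The toy Python sketch below illustrates only that general idea (direct light with inverse-square falloff; no occlusion, surface normals, or GPU acceleration); it is an invented example and not Chaos Group’s baker.

```python
# Toy illustration of light baking: precompute lighting per surface sample and
# store it, so no lights need evaluating at runtime.
def bake_lightmap(samples, lights):
    """Return precomputed irradiance for each surface sample position."""
    baked = []
    for sx, sy, sz in samples:
        total = 0.0
        for (lx, ly, lz), intensity in lights:
            d2 = (lx - sx) ** 2 + (ly - sy) ** 2 + (lz - sz) ** 2
            total += intensity / max(d2, 1e-6)   # inverse-square falloff
        baked.append(total)
    return baked

# One point light two metres above a small strip of floor samples.
floor = [(x * 0.5, 0.0, 0.0) for x in range(5)]
lightmap = bake_lightmap(floor, [((1.0, 2.0, 0.0), 100.0)])
print([round(v, 1) for v in lightmap])
```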

“We’re excited to bring our Academy Award-winning ray-tracing technology to Unreal Engine and see what amazing content artists come up with, and make their lives a little easier in the process,” says Tracy. “Working with Epic has been great, and from a development standpoint, it helps that UE4 is an open platform. I think the combination of V-Ray and Unreal Engine is a natural fit for any studio that has V-Ray in their pipeline and is interested in using Unreal.”

V-Ray for Unreal is available now. For pricing and licensing details, or to download a trial version, visit the Chaos Group website.

“If fidelity to V-Ray rendering and the V-Ray workflow is most important to customers, then this is a great solution for our joint customers. No one is going to match a V-Ray scene to Unreal Engine better than the creators of V-Ray,” says Pierre-Felix Breton, Technical Product Manager for Epic Games. “It’s also the only solution if you want to bring V-Ray scenes from Maya, SketchUp, and Rhino into Unreal Engine, since Unreal Studio doesn’t support reading V-Ray scene data from those tools. Unreal Engine is the only real-time engine Chaos Group supports, so this is a great endorsement.”

Now, customers can retain their investment in V-Ray knowledge as they transition to real time, while they explore what’s possible with Unreal Studio. Where V-Ray for Unreal is all about fidelity to V-Ray and V-Ray rendering, Unreal Studio is more focused on scene structure, metadata, and the ability to optimize assets for interactive experiences. 

With support for 3ds Max, Revit, and SketchUp Pro (not to mention a wide range of CAD formats), Unreal Studio is an ideal partner to V-Ray for Unreal. Its Datasmith feature set provides not only import capabilities but also data optimization tools, which can be used in parallel with V-Ray for Unreal. Along with Datasmith, it offers one-to-one ticketed support, targeted learning, and industry-relevant templates and assets. Why not download the free beta today?

Look-development workflows in Unreal Studio with Substance and X-Rite

Striking a balance between crafting material realism and finding your creative spark is often one of the challenges in look development. Pairing the right tools together, however, can give you the technical palette you need to tap into the full realm of design possibilities.

In our recent webinar, we introduced viewers to Unreal Studio’s powerful look-development tools, and showed them how to effectively light, texture, and render an automotive asset in stunning detail with Unreal Studio, Substance, and X-Rite. In case you missed the live event, we’ve brought the recording to you here.

Join Daryl Obert, Design Specialist at Epic Games; Wes McDermott, Integrations Product Manager at Allegorithmic; and Dr. Marc Ellens, Senior Software Engineer and TAC Evangelist at X-Rite, as they guide you through best practices in their respective tools.

You’ll learn about:

  • Effectively lighting, texturing, and rendering an asset for look development
  • Establishing proper lighting, post-processing, and camera setups
  • Working with measured materials using X-Rite’s AxF component
  • Unlocking creative possibilities with Substance

If you’d like to follow along, download the project files. Looking for more webinars? Check out the full series here.

Leveling Up: What Is Level 2 Automated Driving?

The Society of Automotive Engineers has designated six categories of autonomous driving, ranging from Level 0 to Level 5. However, the ongoing development of self-driving cars has produced advanced technologies that can improve vehicle safety now, adding new distinctions to automated and autonomous driving features. Level 2 automated driving is defined as systems that provide…
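
For reference, the six SAE J3016 levels the excerpt mentions are commonly summarized as below; the short Python sketch simply encodes that widely published taxonomy as data and is not tied to any NVIDIA product.

```python
# The SAE J3016 driving-automation levels, encoded as data for reference.
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # the driver does everything
    DRIVER_ASSISTANCE = 1       # steering OR speed assistance
    PARTIAL_AUTOMATION = 2      # steering AND speed; driver must supervise
    CONDITIONAL_AUTOMATION = 3  # system drives; driver must take over on request
    HIGH_AUTOMATION = 4         # no driver needed within a defined operating domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

def requires_driver_supervision(level: SAELevel) -> bool:
    """Levels 0 through 2 keep the human responsible for monitoring the road."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(requires_driver_supervision(SAELevel.PARTIAL_AUTOMATION))  # True
```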
