A look at Nvidia's DRIVE PX Platform


Summary: NVIDIA's DRIVE PX 2 is the open AI car computing platform that enables automakers and their tier 1 suppliers to accelerate production of automated and autonomous vehicles.

Original Eric Walz    May 15, 2017 6:35 PM PT

NVIDIA's DRIVE PX 2 is the open AI car computing platform that enables automakers and their tier 1 suppliers to accelerate production of automated and autonomous vehicles. It scales from a palm-sized, energy-efficient module for adaptive cruise control capabilities to a full-fledged, powerful AI supercomputer capable of autonomous driving.

NVIDIA CEO Jen-Hsun Huang at GTC Europe showing off the DriveWorks platform for self-driving cars

The NVIDIA DRIVE PX technology with Xavier, Nvidia's upcoming AI car superchip, is a single-chip processor designed to achieve Level 4 autonomous driving. The Xavier chip can process up to 30 trillion deep learning operations per second while drawing only 30 watts of power, just 5 watts more than a 25-watt conventional light bulb.
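The throughput and power figures cited above imply an efficiency of roughly one trillion operations per second per watt. A quick back-of-the-envelope check, using only the numbers from the text:

```python
# Sanity check on the Xavier figures quoted above.
ops_per_second = 30e12   # 30 trillion deep learning operations per second
power_watts = 30         # reported power draw in watts

efficiency = ops_per_second / power_watts  # operations per second per watt
print(f"{efficiency:.0e} ops/s per watt")  # prints "1e+12", i.e. ~1 TOPS/W
```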

The high level of performance of Xavier is necessary to handle the massive amount of computation required for the tasks self-driving vehicles must perform. Nvidia's chip was developed in a partnership with Bosch. "This is the greatest SoC (system on chip) endeavor I have ever known, and we have been building chips for a very long time," Nvidia CEO Jen-Hsun Huang said about Xavier.

Xavier Can Power Deep Neural Networks

The collaboration with Bosch represents the first announced DRIVE PX platform incorporating NVIDIA's Xavier technology. The Xavier SoC can run deep neural networks. Deep learning plays a vital role throughout the entire computational pipeline of a self-driving vehicle, enabling it to get increasingly smarter with experience.

In a March speech at the annual Bosch Connected World Conference in Berlin, Nvidia CEO Jen-Hsun Huang detailed how deep learning is fueling an AI revolution in the auto industry. The small AI car supercomputer is pushing deeper into the areas of sensors, software, and services. Self-driving vehicles require unprecedented levels of computing power due to the profound complexity of the driving task.

A high-performance processor is required to make sense of the terabytes of data streaming live from a self-driving car's sensors, including cameras, radar, and LiDAR. This is where deep learning comes in. By first developing and training a deep neural network in the data center, the NVIDIA DRIVE PX AI system will eventually be able to understand everything happening around the car in real time, making self-driving possible.
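The split described above, training in the data center and then running the frozen model in the vehicle, can be sketched in a few lines. This is a toy illustration with synthetic data and a hypothetical linear "perception" model, not NVIDIA's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Data-center phase: train a toy model on labeled sensor features ---
# Synthetic "sensor features" with binary labels (e.g. obstacle / no obstacle).
X = rng.normal(size=(1000, 4))
w_true = np.array([1.5, -2.0, 0.5, 1.0])
y = (X @ w_true > 0).astype(float)

w = np.zeros(4)
for _ in range(500):                     # simple logistic-regression training
    p = 1 / (1 + np.exp(-(X @ w)))      # predicted probability per sample
    w -= 0.1 * X.T @ (p - y) / len(y)   # gradient-descent step

# --- In-vehicle phase: the frozen weights run inference on a live frame ---
def detect(frame_features, weights=w):
    """Return True if the trained model flags an obstacle in this frame."""
    return (frame_features @ weights) > 0

print(detect(np.array([2.0, -1.0, 0.0, 0.5])))  # flags this input as a hazard
```

The design point the article makes is exactly this asymmetry: the expensive learning happens offline on data-center hardware, while the car only needs to evaluate the trained network quickly, which is what DRIVE PX is built to do.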

Tesla Adopts the NVIDIA DRIVE PX 2


Tesla Motors has announced that all of its vehicles, the Model S, Model X, and the upcoming Model 3, will be equipped with Nvidia's on-board DRIVE PX 2, which can provide full self-driving capability.

The DRIVE PX 2 AI computer delivers more than 40 times the processing power of the previous system and meets Tesla's requirements to run its own neural net for computer vision, sonar, and radar processing. Tesla's autonomous driving system will be powered by the NVIDIA DRIVE PX 2 AI computing platform.

This level of processing power is needed to achieve what the automotive industry refers to as "Level 4 autonomy," where a car can drive on its own, without human intervention. The number of cars with various levels of autonomy will grow to a total of 150 million vehicles by 2025, analysts project.

The Deep Learning Self-Driving Car

Software can't possibly be hand-coded to anticipate the nearly infinite number of things that can happen on the road, Huang said. Instead, deep learning can enable us to "train a car to drive," ultimately performing far better and more safely than any human could behind the wheel.

"We've really supercharged our roadmap to autonomous vehicles," Huang said. "We've dedicated ourselves to build an end-to-end deep learning solution. Nearly everyone using deep learning is using our platform."

Nvidia AI Co-pilot


Also introduced recently at the 2017 GPU Technology Conference was NVIDIA's AI co-pilot technology. This system will act as an AI assistant in the vehicle, as well as provide safety alerts of potential hazards outside the car. By monitoring the driver as well as a full 360 degrees around the car, the system works to keep the occupants of the vehicle safe.

"Of course, our goal someday is that every single car will be autonomous," Huang said. "But for the path to then, we'll have AI that will be your co-pilot, will be your guardian, and look out for you."

Powered by deep learning, the AI co-pilot can recognize faces to automatically set specific in-car preferences depending on the driver. The system can also see where the driver is looking and detect facial expressions to understand the driver's state of mind. Combining this information with what is happening around the car enables the AI co-pilot to warn the driver of any potential hazards.
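The fusion step described above, combining driver state with exterior hazards, can be sketched as a simple decision rule. The signal names and thresholds here are hypothetical, chosen only to illustrate the idea of escalating alerts when the driver is unlikely to have seen a nearby hazard:

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    gaze_on_road: bool      # from interior-camera gaze tracking
    drowsy: bool            # from facial-expression analysis

@dataclass
class ExteriorHazard:
    distance_m: float       # distance to the detected hazard, in meters
    bearing_deg: float      # direction of the hazard relative to heading

def copilot_warning(driver: DriverState, hazard: ExteriorHazard) -> str:
    """Escalate the alert when the driver is not attending to a nearby hazard."""
    close = hazard.distance_m < 30.0
    if close and (driver.drowsy or not driver.gaze_on_road):
        return "urgent alert"       # driver unlikely to have noticed it
    if close:
        return "visual warning"     # driver attentive; a gentle cue suffices
    return "no action"

print(copilot_warning(DriverState(gaze_on_road=False, drowsy=False),
                      ExteriorHazard(distance_m=12.0, bearing_deg=45.0)))
# prints "urgent alert"
```

In a real system both inputs would come from deep neural networks (gaze, expression, and object detection), but the co-pilot's value lies in this cross-referencing: the same hazard warrants a different response depending on the driver's state.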
