ReadWriteDrive is an ongoing series covering the future of transportation.
Nvidia, the Santa Clara, Calif.-based chipmaker, is well known for inventing the graphics processing unit, or GPU. Its chips have long souped up gaming PCs, laptops, workstations and supercomputers. But you might be surprised to learn how the company is revving up cars to become sophisticated, sensor-driven, connected mobility machines.
Four million cars today already have a 21st century tiger in the tank, in the form of Nvidia’s Tegra chips. Another 25 million are in the pipeline, by virtue of relationships with a long list of high-end European luxury auto brands, including BMW/Rolls-Royce, Volkswagen/Audi and Aston Martin. Japanese and American car companies can’t be far behind.
Nvidia’s automotive development kit, called Jetson, is an under-the-hood, car-stereo-sized box that provides all the I/O connectors a modern car needs, including USB, Ethernet and HDMI. Nvidia system-on-a-chip processors (essentially fully functional, self-contained computers) power the instrument clusters, navigation and infotainment. These chips provide the computing horsepower for the giant dual-touchscreen console in the Tesla Model S.
Nvidia provides some of the most complex 3D rendering for games and technical design, so it has a decisive head start in the brave new world of automotive computing. Other players, such as Qualcomm or even Apple, will make a similar transition from consumer electronics to cars, according to Thilo Koslowski, an analyst of vehicle information and communication technology at Gartner.
“Yet Nvidia is pushing the envelope,” said Koslowski. “It’s like a Ferrari versus a Volkswagen.”
Where Sensors Meet The Road
As cool as those vivid digital dashboards are, they amount to child’s play compared to what Nvidia has in mind for its automotive-grade Tegra processing systems. High-performance processors let car companies design virtual vehicle prototypes on screen, then run precise aerodynamic simulations in virtual wind tunnels or realistic simulated road tests of traction-control systems and crash events. Now the technology is being used to interpret and integrate an ever-widening stream of data from sensors.
The list of critical components on today’s cars now includes cameras, radar, sonar, and laser sensors, or lidar.
See also: Seven Ways 3D Lidar Is Transforming Our Physical World
“A CPU [central processing unit], GPU, image processor, audio processor, and video processor are all baked into this tiny thing,” said Danny Shapiro, senior director of automotive at Nvidia, referring to a component the size of a thumbnail, embedded on a board not much bigger than a playing card. That board houses the memory and components needed to make the device function like a standalone computer. I spoke with Shapiro on the sidelines of the Connected Car Expo at the 2013 Los Angeles Auto Show.
“It’ll run on Linux or Windows or Android,” explained Shapiro. “Now, software gets laid on top of this incredible processor power to do whatever the automaker wants it to do.”
Nvidia also supplies a wide range of software libraries that implement common algorithms, speeding up development for automotive programmers and designers. Nvidia provides the hardware and software, but not the apps: the same kind of relationship the company has with gaming companies like Electronic Arts, Ubisoft and Valve.
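To make that division of labor concrete, here’s a minimal, hypothetical sketch in C++: the chipmaker ships an accelerated library behind a stable interface, and the automaker’s “app” is a thin layer of product logic on top. The vendorlib namespace, detectObjects function and canned result are invented for illustration; they are not Nvidia’s actual API.

```cpp
#include <cstdio>
#include <vector>

namespace vendorlib {  // stands in for a chipmaker-supplied, GPU-accelerated library
struct Detection { float x, y; int classId; };

// In a real stack this would dispatch to GPU kernels tuned per chip; here it
// returns a canned result so the sketch runs end to end.
std::vector<Detection> detectObjects(const std::vector<unsigned char>& frame) {
    (void)frame;  // a real implementation would process the pixels
    return { {12.0f, 4.5f, 1} };
}
}  // namespace vendorlib

// The automaker's side: consume detections and apply its own product logic.
int main() {
    std::vector<unsigned char> cameraFrame(640 * 480, 0);  // dummy grayscale frame
    for (const auto& d : vendorlib::detectObjects(cameraFrame))
        std::printf("object of class %d at (%.1f, %.1f)\n", d.classId, d.x, d.y);
    return 0;
}
```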
UI Composer, Nvidia’s authoring system, speeds up design of instrument clusters with built-in 3D objects like gauges and dials, and scripts for how they move. These aren’t canned screens—more like runtime engines.
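As a rough illustration of the “runtime engine” idea, consider a gauge whose needle angle is recomputed from live vehicle data on every frame, with a bit of easing so it sweeps like a physical dial. This sketch is my own, with illustrative numbers; UI Composer’s actual scripting interface may look nothing like it.

```cpp
#include <algorithm>
#include <cstdio>

class SpeedGauge {
public:
    // Map speed to a needle angle and ease toward it each frame,
    // the way a gauge script in a cluster runtime might.
    double update(double speedKph, double dtSeconds) {
        const double sweepDegrees = 240.0;  // full arc of the dial
        const double maxKph = 260.0;        // top of the scale
        double target = (speedKph / maxKph) * sweepDegrees;
        angle_ += (target - angle_) * std::min(1.0, 8.0 * dtSeconds);  // smoothing
        return angle_;
    }
private:
    double angle_ = 0.0;  // current needle position, in degrees
};

int main() {
    SpeedGauge gauge;
    for (int frame = 0; frame < 5; ++frame)  // a few frames at 60 fps
        std::printf("needle at %.1f degrees\n", gauge.update(100.0, 1.0 / 60.0));
    return 0;
}
```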
“This comes right out of gaming,” said Shapiro. But unlike gaming graphics, which are designed for fun, the rendering and interpretation of data from car-based systems has to be 100 percent accurate, both during a car’s development and, even more importantly, when the car is carrying you and your loved ones down the road. It’s a matter of life and death.
The technology also means much greater customization for car owners. So you don’t like the dashboard layout on the 2014 Audi A3? No problem. Download the instrument display from the vintage 1970 Audi 100, rendered with such amazing realism that you’ll think the speedometer is purely analog. The system reads engine speed and performance data directly from the car’s CAN bus network.
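For the curious, reading engine speed off a CAN bus is a few dozen lines on Linux via the SocketCAN API. In the sketch below the socket calls are real, but the frame ID (0x316) and byte layout are hypothetical: real cars define their frames in proprietary databases that vary by automaker.

```cpp
#include <cstdio>
#include <cstring>
#include <net/if.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <unistd.h>
#include <linux/can.h>
#include <linux/can/raw.h>

int main() {
    int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);  // raw CAN socket (Linux SocketCAN)

    ifreq ifr {};
    std::strcpy(ifr.ifr_name, "can0");          // name of the CAN interface
    ioctl(s, SIOCGIFINDEX, &ifr);               // look up its index

    sockaddr_can addr {};
    addr.can_family = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;
    bind(s, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    can_frame frame {};
    while (read(s, &frame, sizeof(frame)) == sizeof(frame)) {
        if (frame.can_id == 0x316) {            // hypothetical engine-status frame
            // Hypothetical layout: bytes 2-3 carry RPM in quarter-revolutions.
            int rpm = ((frame.data[2] << 8) | frame.data[3]) / 4;
            std::printf("engine speed: %d rpm\n", rpm);
        }
    }
    close(s);
    return 0;
}
```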
Moreover, automotive superprocessors mean that car companies—generally considered technolaggards taking three or four years to develop a new engine or transmission—can start to keep up with the pace of innovation in consumer electronics.
“Pop out the old module and pop in a new one. Each vehicle model, year after year, can have a more powerful system without redesigning it,” said Shapiro. “Just like what happens with phones.”
Car Talk
We could discuss what this means for richer navigation and streaming media in the car. That’s cool stuff, but it’s far more intriguing to consider the vehicle’s core functions, such as computer-based accelerating, braking, and steering.
The Holy Grail is object detection and natural language processing, so we can get our Hasselhoff on. At the 2013 CES electronics extravaganza in Las Vegas, Audi announced traffic-jam assist, which uses cameras and radar to detect congestion and, with driver approval, engages advanced cruise control to maintain a constant, safe distance from the car in front of you while automatically steering to stay inside your lane. This signals the future direction of the Nvidia onboard platform.
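The control idea at the heart of traffic-jam assist fits in a few lines: hold a safe, speed-dependent gap to the lead car by nudging acceleration in proportion to the gap error and the closing speed. The sketch below is a simplification with made-up gains and limits, not values from Audi’s or Nvidia’s systems; a production controller would add many safeguards.

```cpp
#include <algorithm>
#include <cstdio>

// One control step. Inputs would come from radar (gap, closing speed) and the
// vehicle bus (own speed); the output is a bounded acceleration command.
double gapControlStep(double ownSpeedMps, double gapMeters, double closingSpeedMps) {
    const double timeHeadwaySec = 1.8;  // desired gap, expressed as travel time
    const double minGapMeters = 4.0;    // never follow closer than this
    const double kGap = 0.25;           // gain on gap error (illustrative)
    const double kSpeed = 0.50;         // gain on relative speed (illustrative)

    double desiredGap = std::max(minGapMeters, ownSpeedMps * timeHeadwaySec);
    double accel = kGap * (gapMeters - desiredGap) - kSpeed * closingSpeedMps;

    // Clamp to comfortable limits; emergency braking is a separate system.
    return std::clamp(accel, -3.0, 1.5);  // m/s^2
}

int main() {
    // Example: 40 m gap at 25 m/s, closing at 1 m/s -> gentle braking.
    std::printf("accel command: %.2f m/s^2\n", gapControlStep(25.0, 40.0, 1.0));
    return 0;
}
```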
At that show, Nvidia demonstrated integration of Google’s Street View feature in the Audi A7’s navigation system. Nvidia is expected to return to CES in January 2014 to announce further advances, bringing vehicle autonomy one step closer to reality.
Imagine the processing power needed for this soon-to-arrive scenario: the car’s processing unit is running the dashboard, rendering speed and engine functions as you like, while the ultra-realistic, Street-View-fed navigation system guides you to a destination set by voice command and the kids enjoy a streaming Netflix movie. A camera aimed at the roadside detects speed-limit signs and presents them on the dashboard. Radar and lidar data are run through algorithms 30 to 60 times a second to keep track of all traffic, differentiating between other cars and, say, a kid running across the street, readying the car to apply the brakes as necessary.
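Here’s a rough sketch of what that 30-to-60-times-a-second loop might look like in code: each cycle fuses the latest returns into object tracks, then checks whether anything in the car’s path has a time-to-collision short enough to warrant braking. The types, the threshold and the canned fusion stub are all hypothetical stand-ins for a real perception stack.

```cpp
#include <chrono>
#include <thread>
#include <vector>

struct Track {
    double rangeMeters;      // distance to the object
    double closingSpeedMps;  // positive when we are approaching it
    bool inOurPath;          // lane-level association from the planner
};

// Stand-in for real radar/lidar fusion: returns one canned track.
std::vector<Track> fuseSensors() {
    return { {25.0, 12.0, true} };
}

bool brakingNeeded(const std::vector<Track>& tracks) {
    const double ttcThresholdSec = 1.5;  // illustrative trigger, not a production value
    for (const Track& t : tracks) {
        if (!t.inOurPath || t.closingSpeedMps <= 0.0) continue;
        double ttc = t.rangeMeters / t.closingSpeedMps;  // seconds to impact
        if (ttc < ttcThresholdSec) return true;
    }
    return false;
}

int main() {
    using namespace std::chrono;
    const auto period = milliseconds(33);  // ~30 iterations per second
    for (;;) {
        auto deadline = steady_clock::now() + period;
        if (brakingNeeded(fuseSensors())) {
            // Hand off to the brake controller here (not shown).
        }
        std::this_thread::sleep_until(deadline);
    }
}
```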
Meanwhile, an inward-facing camera handles driver state monitoring.
“The same kinds of processing power used for pedestrian detection will do blink detection, and run the algorithms to determine if somebody is distracted or falling asleep,” said Shapiro. “Who just got in the car? Let’s adjust the seats, radio and mirrors for that person.” Talk about a smart car.
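Stripped to its core, the blink-detection logic Shapiro describes can be as simple as counting consecutive camera frames in which a classifier says the eyes are closed. The frame rate and threshold below are illustrative assumptions, not figures from Nvidia; the GPU horsepower goes into the upstream per-frame eye-state classifier, which is taken as given here.

```cpp
class DrowsinessMonitor {
public:
    // Feed one eye-state classification per camera frame (true = eyes closed).
    // Returns true when the driver should be alerted.
    bool onFrame(bool eyesClosed) {
        closedFrames_ = eyesClosed ? closedFrames_ + 1 : 0;
        // At ~30 fps, 45 consecutive closed frames is ~1.5 seconds: far longer
        // than a normal blink (roughly 0.1-0.4 s), so treat it as a microsleep.
        return closedFrames_ >= 45;
    }
private:
    int closedFrames_ = 0;
};
```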
This scenario is eminently possible, but not without very fast and efficient processors.
“We’re developing what is essentially a mobile supercomputer for cars that can handle all of this,” Shapiro said.