Last week we told you about how Chevrolet, a division of General Motors, was bringing an augmented reality (AR) marketing promotion to SXSW in Austin. Now General Motors is kicking it up a notch with some experimental technology that will bring the world of AR to car windshields and provide a head-up display (HUD) experience.
The new technology, still very much in the testing phase, uses an array of sensors that tracks both objects on or near the road and the position and angle of the driver's head and eyes. By combining the data from these sensors, GM can then use lasers to project images onto the windshield to help drivers stay safe.
“Let’s say you’re driving in fog,” says Thomas Seder, group lab manager for GM’s research and development. “We could use the vehicle’s infrared cameras to identify where the edge of the road is and the lasers could ‘paint’ the edge of the road onto the windshield so the driver knows where the edge of the road is.”
In other words, it would be like having a fighter pilot’s HUD in your car, except instead of tracking the sky for bogies, your car tracks the road for possible dangers. The display works by coating the windshield with transparent phosphors that emit light when excited by a laser. GM says this is better for the driver because the entire windshield can be used to display information, not just a portion of it like current in-car HUD systems. The technology can also recognize and read road signs, alerting the driver when they are driving too fast or when construction is ahead.
The company says that while this exact technology will not be in any cars in the near future, some of the features will start to be rolled into upcoming models. What this likely means is that the transparent phosphor windshield will appear in cars first as a display for conventional HUD information, like speed, fuel level and other indicators.
The hard part of this technology doesn’t seem to be the display itself; rather, the barrier is the sensor work required to track objects on the road and correlate them with the position and angle of the driver’s eyes. Since it’s much easier to display information that doesn’t depend on exact positioning relative to the driver’s point of view, we’ll likely see those simpler additions before the true AR experience becomes a reality.
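To see why the driver’s eye position matters so much, consider the geometry involved. A rough sketch of the problem, assuming the windshield is modeled as a flat plane (real glass is curved, and all names and coordinates here are illustrative, not GM’s actual system), is just a ray-plane intersection: the system must find where the line from the driver’s eye to a road object crosses the glass, and that point shifts whenever the driver’s head moves.

```python
def project_to_windshield(eye, obj, plane_point, plane_normal):
    """Intersect the eye->object sight line with the windshield plane.

    eye, obj, plane_point, plane_normal: 3-tuples (x, y, z) in meters.
    Returns the (x, y, z) point on the glass where the overlay should
    be drawn, or None if the sight line is parallel to the plane.
    """
    # Direction of the driver's sight line toward the object.
    direction = tuple(o - e for e, o in zip(eye, obj))
    # How steeply the sight line approaches the windshield plane.
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # sight line never crosses the glass
    # Distance along the sight line to the plane, as a fraction.
    t = sum((p - e) * n for e, p, n in zip(eye, plane_point, plane_normal)) / denom
    return tuple(e + t * d for e, d in zip(eye, direction))

# Example: eye at the origin, a road-edge point 20 m ahead and 2 m to
# the right, windshield plane 1 m in front of the eye, facing the driver.
hit = project_to_windshield((0, 0, 0), (2, 0, 20), (0, 0, 1), (0, 0, -1))
# hit lands about 0.1 m right of center on the glass; move the eye and
# the same road point lands somewhere else, which is why head tracking
# is the hard part.
```

The same arithmetic run with a shifted `eye` gives a different `hit`, which is exactly the coupling between head tracking and object tracking the paragraph above describes.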
Eventually, however, GM hopes technology like this will make for better turn-by-turn directions and make it easier to find locations upon arrival. We’ve all heard our GPS systems say, “You have arrived at your location!” only to look around and still not see the destination. With this new system, GM hopes to solve the problem of “the last 100 yards” by displaying indicators pointing to specific locations based on the sensor readings.
This certainly seems like the future of driving, but I wonder if it will be displaced by cars that simply drive themselves. If we can create sensors good enough to find the lanes in the road and nearby vehicles, why not just let the car drive itself and skip the HUD? Either way, it’s great to see AR taking steps forward beyond marketing and into practical application in a consumer space, even if it is years in the future.