Signs, waves, and honks are all part of traffic vocabulary, providing drivers with an efficient way to tell pedestrians and other drivers what they are about to do.
In the autonomous future, cars should be able to talk to one another, but pedestrians, cyclists, and drivers of conventional cars will still need visual cues to tell them what is about to happen.
Drive.ai, an autonomous vehicle startup, wants to build that new language. It unveiled its product and strategy last week, with a focus on how self-driving cars communicate with the people around them.
“The self-driving car is the first social robot that a lot of humans will interact with,” said Carol Reiley, co-founder and president of Drive.ai. “What we look at is how do you now replace all these social cues that humans give each other and how do you build trust and transparency.”
Drive.ai’s first attempt at an answer to the communication question comes in the form of a roof-mounted digital billboard. It displays messages from the car to people nearby, indicating whether it’s safe to cross the road.
What’s the sound of an angry AI?
The communications platform will also use a variety of sounds to grab attention. Drive.ai is experimenting with sounds not usually associated with cars, which may signal the car’s intentions more clearly. Google has run similar tests on its own self-driving project, experimenting with orca noises.
Drive.ai plans to bundle the communication platform with its self-driving system and sell the package to automakers. The self-driving platform uses “deep learning” to teach a fleet of cars about the roads, with the goal of making each car self-sufficient.
It doesn’t look like the startup will manufacture cars or launch a ride-hailing service.
It may take a while for people to learn the new noises and visual alerts, especially if hundreds of operators each use their own. But once a single, comprehensive platform is adopted, it could make the roads a lot safer for pedestrians and cyclists.