Self-driving cars might be available and legal as soon as next year, but whether anyone will actually buy them remains an open question: 75 percent of Americans say they are still afraid to ride in an autonomous vehicle.
To build trust between the human and a self-driving car, a new Columbia University project wants to enable cars to read human emotions. The project, called BraiQ, will attempt to use biofeedback to highlight what humans want to see during a trip.
“We think as the AI advances, unless we can develop the rapport based on the interaction to teach them to gain mutual trust, humans and machines are not going to be able to interact well,” Sameer Saproo, a scientist at Columbia University and a researcher on the project, told Motherboard. “The autonomous car is just starting out, and the problems we are solving are going to be very prominent in the future.”
Biofeedback will let cars feel your pain… or happiness
In an example video, BraiQ shows how the artificial intelligence responds to two different rider interests, furniture and desserts: the car slows down whenever it passes one of them.
The application is built on the Unity virtual reality platform and uses a deep learning program to recognize the different interests. BraiQ claims this program could let a car match a rider’s preferences, for example by slowing down so the rider can view something for longer, or by pulling over entirely.
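The article doesn't detail how an interest signal would translate into driving behavior, but the idea can be sketched in a few lines. Everything below is a hypothetical illustration, assuming the deep learning model outputs a 0–1 interest score; the thresholds and action names are invented, not BraiQ's actual system.

```python
# Hypothetical sketch: mapping a rider-interest score (e.g. from a deep
# learning model fed biofeedback) to a driving adjustment.
# Thresholds and action names are illustrative assumptions only.

def driving_action(interest_score: float) -> str:
    """Pick a driving adjustment from a 0-1 rider-interest score."""
    if interest_score >= 0.8:
        return "pull_over"       # strong interest: stop so the rider can look
    if interest_score >= 0.5:
        return "slow_down"       # moderate interest: extend viewing time
    return "maintain_speed"      # no notable interest detected

print(driving_action(0.9))  # pull_over
print(driving_action(0.6))  # slow_down
print(driving_action(0.2))  # maintain_speed
```

A real system would of course smooth these decisions over time rather than react to a single score, but the shape of the control loop would be similar.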
This sense of control over the car might foster some trust between the human and the car, though it might also make riders anxious about which emotions they are broadcasting. How would a car interpret sadness or excitement?
We assume the team is taking all of this into consideration, and that most emotions will not trigger a reaction from the car. BraiQ wants to track facial expressions, eye movement, heart rate, and even brain activity to build a full picture of the rider’s psychological state.
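Combining those four channels into one estimate is a classic sensor-fusion problem. As a rough sketch, assuming each channel produces a 0–1 interest reading, a weighted average gives a single score; the weights below are invented for illustration, since a real system would learn them from data.

```python
# Hypothetical sketch: fusing the four signal channels the article names
# into one interest estimate. Weights are invented assumptions; a real
# system would learn them rather than hard-code them.

CHANNEL_WEIGHTS = {
    "facial_expression": 0.2,
    "eye_movement": 0.3,
    "heart_rate": 0.1,
    "brain_activity": 0.4,  # EEG-style signals weighted highest here
}

def fused_interest(signals: dict) -> float:
    """Weighted average of per-channel interest readings (each 0-1)."""
    total = sum(CHANNEL_WEIGHTS[name] * value for name, value in signals.items())
    weight = sum(CHANNEL_WEIGHTS[name] for name in signals)  # tolerate missing channels
    return total / weight if weight else 0.0

print(fused_interest({
    "facial_expression": 0.5,
    "eye_movement": 0.9,
    "heart_rate": 0.3,
    "brain_activity": 0.8,
}))  # 0.72
```

Renormalizing by the sum of the weights actually present lets the estimate degrade gracefully when a sensor drops out, say, if the rider looks away from the camera.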
It might be some time before Ford and General Motors invest in biofeedback for self-driving systems, but we wouldn’t be surprised if more pro-tech companies like Google and Tesla were already looking into the possibility of cars reading human emotions.