The first-ever death in an autonomous car, in May this year, raised several questions about whether a driver pays continual, full attention even when the car is in autopilot mode. The Tesla driver was watching a Harry Potter movie when the accident occurred. According to the NHTSA investigation report, he had seven seconds to respond to the impending collision. As self-driving technologies gain momentum, it is imperative that the driver remain in complete control, whether or not he or she is behind the wheel, to reduce accidents substantially.
Affectiva is an emotion analytics company that has spent the last eight years building an emotion recognition engine. By gaining insight into passengers’ mental states, cars may figure out how best to hand control back to a human driver.
The emotion recognition engine is a set of algorithms that, paired with a camera, track thirty-three points on the driver’s face and evaluate expressions to recognize emotions. Cars will be fitted with a camera that looks at the driver’s face, and Affectiva’s algorithms can be fed into and integrated with an autonomous vehicle’s own algorithms.
Affectiva’s emotion recognition technology uses deep neural networks to view and comprehend facial patterns, allowing the system to get to know a person and the ways in which he or she reacts and emotes. It then identifies the emotions a driver is experiencing behind the wheel. Affectiva’s software can use the car’s features to alert distracted drivers and take control of the steering depending on how a driver is reacting at the moment.
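To make the idea concrete, here is a deliberately simplified sketch of expression recognition from facial landmarks. This is not Affectiva’s actual algorithm: a production system feeds dozens of tracked points into a trained deep neural network, whereas this toy version uses a few hypothetical landmark names and hand-picked geometric thresholds purely to illustrate the pipeline from points to an emotion label.

```python
# Illustrative only: coarse expression classification from 2D facial
# landmarks. Landmark names, ratios, and thresholds are invented for
# this sketch, not taken from Affectiva's engine.

def face_height(landmarks):
    """Vertical extent of the face, used to normalize other distances."""
    return landmarks["chin"][1] - landmarks["brow_mid"][1]

def mouth_openness(landmarks):
    """Gap between upper and lower lip, normalized by face height."""
    gap = landmarks["lower_lip"][1] - landmarks["upper_lip"][1]
    return gap / face_height(landmarks)

def eye_openness(landmarks):
    """Gap between upper and lower eyelid, normalized by face height."""
    gap = landmarks["eye_bottom"][1] - landmarks["eye_top"][1]
    return gap / face_height(landmarks)

def classify_expression(landmarks):
    """Very coarse rules: near-closed eyes -> 'drowsy'; wide eyes plus
    an open mouth -> 'surprised'; otherwise 'neutral'."""
    if eye_openness(landmarks) < 0.02:
        return "drowsy"
    if mouth_openness(landmarks) > 0.15 and eye_openness(landmarks) > 0.06:
        return "surprised"
    return "neutral"

# Hypothetical landmark coordinates in pixels (y grows downward).
frame = {
    "brow_mid": (100, 80), "eye_top": (95, 100), "eye_bottom": (95, 103),
    "upper_lip": (100, 150), "lower_lip": (100, 155), "chin": (100, 180),
}
print(classify_expression(frame))  # -> neutral
```

A real engine replaces the hand-written rules with learned weights, but the overall shape, tracked points in, emotion label out, is the same.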

As autonomous driving pushes humans into the passenger seat, the need will arise to ensure that the person in the driver’s seat remains alert to emergencies. The most recent advances in autonomous driving include elements such as adaptive cruise control, pedestrian detection, lane detection, and vicinity and speed-limit controls. As cars become more responsive, we’ll need them to be emotionally aware, too.
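One way such awareness could feed back into the car is an escalation policy: the more distracted the driver and the longer the distraction lasts, the stronger the response. The sketch below is hypothetical; the `attention_score` input, the thresholds, and the action names are assumptions for illustration, not any vendor’s API.

```python
# Illustrative escalation policy for an emotionally aware car.
# All states and thresholds are invented for this sketch.

def handover_action(attention_score, seconds_distracted):
    """attention_score in [0, 1], from a hypothetical driver-facing
    camera; seconds_distracted counts time below the attention bar."""
    if attention_score >= 0.7:
        return "autopilot_ok"   # driver looks ready to take over
    if seconds_distracted < 3:
        return "visual_alert"   # gentle nudge on the dashboard
    if seconds_distracted < 7:
        return "audio_alert"    # louder warning as time runs out
    return "safe_stop"          # begin a controlled slowdown

print(handover_action(0.4, 8))  # -> safe_stop
```

The seven-second window in the NHTSA report mentioned above is exactly the kind of budget a policy like this would have to work within.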
Certain cognitive elements, such as emotions, are hard to incorporate into machines. The algorithms in deep neural networks read behavioral patterns to learn human habits and trends. Affectiva’s software applies the same school of thought to teach a car different facial patterns. This feature will make autonomous cars more human.
Affectiva plans to partner with car makers to integrate the emotion recognition technology. You can even try out Affectiva’s emotion recognition technology on Android and iOS smartphones.