Researchers at the University of Southern California's Institute for Creative Technologies (ICT) are developing artificial intelligence capable of emotional intelligence. The goal is ‘empathic robots’ that can better read, process, and respond to human emotions.

Albert Rizzo, a psychologist and director of medical virtual reality at ICT, revealed in an interview with Vice's Motherboard that their latest development, a congenial counsellor christened Ellie, could be used as a therapist. But more on that later.

A majority of ICT's funding comes from the military, which wants virtual humans that can be used for training purposes. These virtual agents would be used for negotiation-skills training, letting soldiers practise in a simulated setting before they face a real counterpart. For example, ICT recently developed applications that train soldiers to negotiate with an Afghan war criminal, or to interact with someone from another country, particularly Afghanistan or Iraq, by developing cultural sensitivity.

ICT has also set its sights on more advanced uses: virtual agents for post-traumatic stress disorder (PTSD) therapy.

ICT’s DARPA-funded SimSensei project is developing next-gen AI: virtual agents that exhibit high levels of emotional intelligence and can engage convincingly in interactions with people. Several studies on Ellie, the AI therapist, have shown that she outperforms human therapists at clinical engagement.

Presently, Ellie mostly sees military personnel who have recently returned from deployment and might be suffering from PTSD. The AI therapist engages users in a private face-to-face interview using natural language and active listening. The virtual agent is equipped with a webcam to track head pose and facial expression, a Microsoft Kinect to capture gesture and body posture, and a microphone to track vocal parameters.
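The three sensor streams described above feed a single picture of the user's state. A minimal sketch of that fusion step might look like the following; the field names and units are illustrative assumptions, not ICT's actual data format.

```python
from dataclasses import dataclass

@dataclass
class WebcamFeatures:
    # Head pose (degrees) and a coarse expression label -- illustrative fields
    head_yaw: float
    head_pitch: float
    expression: str

@dataclass
class KinectFeatures:
    # Body posture from the depth camera -- illustrative fields
    lean_angle: float      # forward/backward lean, degrees
    gesture_rate: float    # gestures per minute

@dataclass
class AudioFeatures:
    # Vocal parameters from the microphone -- illustrative fields
    mean_pitch_hz: float
    loudness_db: float

def fuse(webcam: WebcamFeatures, kinect: KinectFeatures, audio: AudioFeatures) -> dict:
    """Combine the three sensor streams into one flat feature vector
    that a downstream dialogue policy could consume."""
    return {
        "head_yaw": webcam.head_yaw,
        "head_pitch": webcam.head_pitch,
        "expression": webcam.expression,
        "lean_angle": kinect.lean_angle,
        "gesture_rate": kinect.gesture_rate,
        "mean_pitch_hz": audio.mean_pitch_hz,
        "loudness_db": audio.loudness_db,
    }
```

The flat dictionary here stands in for whatever representation the real system uses internally; the point is simply that per-sensor observations are merged before the agent decides how to react.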

As you can see in the video, a session with Ellie starts with common background questions to build rapport. Then she casually and conversationally asks users about common symptoms of psychological distress. The virtual therapist monitors and analyzes user responses and reacts accordingly with empathetic nods and follow-up questions.

Users seem to prefer talking with Ellie over a human therapist. In one trial, users who thought Ellie was computerized disclosed more negative emotions and reported more indicators of PTSD and psychological distress than those who believed they were talking to a version of Ellie controlled by a human.

Researchers at ICT used existing research on non-verbal expression to create a list of telling behavioral signals for the AI to refer to, including body posture, 3D head position and orientation, the intensity and frequency of facial expressions, and self-adaptors (self-touching gestures). Additionally, Ellie is programmed to identify and analyze voice parameters.

Stefan Scherer, who heads the speech-analysis part of the SimSensei project, says that there are a number of acoustic signs of depression, such as reduced variance in volume and pitch and increased tension in the vocal tract and vocal folds. These markers are easily overlooked by the human ear, and this is one area where the AI therapist is better than her human counterparts.
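One of the markers Scherer mentions, reduced variance in volume, is straightforward to illustrate: split the audio into frames, compute each frame's RMS energy, and measure how much it varies. The sketch below is a toy illustration under that assumption, not SimSensei's actual pipeline.

```python
import math

def frame_rms(samples, frame_size=160):
    """Root-mean-square energy for each non-overlapping frame of audio samples."""
    rms = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        rms.append(math.sqrt(sum(s * s for s in frame) / frame_size))
    return rms

def loudness_variance(samples, frame_size=160):
    """Variance of per-frame RMS energy; values near zero suggest a flat,
    monotone delivery of the kind associated with depressed speech."""
    rms = frame_rms(samples, frame_size)
    mean = sum(rms) / len(rms)
    return sum((r - mean) ** 2 for r in rms) / len(rms)
```

For example, a constant-amplitude tone scores near zero, while the same tone with a rising volume envelope scores noticeably higher; a production system would track pitch variance and voice quality the same way, frame by frame.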

In addition, Ellie has been equipped with high-level knowledge of emotional intelligence and sensors that outperform human observation. Virtual agents developed under SimSensei are also being put to the test helping people with high-functioning autism develop the emotional and social skills needed for job interviews; the researchers have built different virtual characters for different behavioral dispositions.