Affectiva, SoftBank Partner to Give Robots an Emotion-Sensing Upgrade

A new partnership will integrate Affectiva's Emotion AI into SoftBank's Pepper robot, allowing it to recognize and respond to a more complex array of human emotions.

Chris Wiltz

August 29, 2018

4 Min Read

Pepper is often seen in customer service applications. With Affectiva's Emotion AI, the robot will be able to recognize a wider, and more subtle, range of human emotions. (Image source: SoftBank Robotics)

If you follow the robotics space, you're probably familiar with Pepper. This humanoid robot from SoftBank Robotics has found a place in hotels, banks, and retail stores all over Japan, where it handles customer service and concierge tasks. The draw of the robot has always been its ability to identify four basic emotions—joy, surprise, anger, and sadness—via a combination of face, voice, and body language recognition, and adjust its behavior accordingly.

But a newly announced partnership between SoftBank and AI company Affectiva aims to make Pepper even more emotionally intelligent. Boston-based Affectiva is the creator of Emotion AI, a deep learning framework built specifically to recognize more complex human emotions, such as boredom, disgust, fear, and levels of alertness. By integrating Affectiva's Emotion AI into Pepper, SoftBank hopes to expand the range of emotions the robot can recognize and, in turn, improve how it interacts with humans.

“There’s a significant opportunity for robots like Pepper to improve the way we work and live, as we’ve seen through the many roles Pepper has already taken on as a companion and a concierge,” Marine Chamoux, an affective computing roboticist at SoftBank Robotics, said in a press statement. “Our partnership with Affectiva will help us to take Pepper’s abilities to the next level, allowing Pepper to better respond to the many emotional and cognitive states people experience.”


In a statement of her own, Dr. Rana el Kaliouby, co-founder and CEO of Affectiva, said robots are going to need the same social awareness that the people around them have in order to become effective coworkers (cobots) or companions. “As robots take on increasingly interactive roles with humans in many corners of society—spanning healthcare, retail, and even entering our homes—there’s a critical need for us to foster a deeper understanding and mutual trust between people and robots,” she said.

Affectiva's Emotion AI combines image-recognition algorithms with deep neural networks trained to find patterns in very large datasets. By training its AI on a dataset of more than six million faces spanning 87 countries, according to the company, Affectiva has created a system capable of identifying emotions in real time from even subtle facial movements and cues.
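In practice, systems like this follow a familiar pipeline: detect a face in each video frame, crop it, and pass the crop to a classifier trained on labeled expressions. The sketch below illustrates that general pattern in Python using OpenCV's stock face detector; the emotion model and label set are placeholders and assumptions for illustration, not Affectiva's actual SDK.

```python
# Minimal sketch of the pipeline described above: detect a face per frame,
# crop it, and feed the crop to a pretrained expression classifier.
# This is NOT Affectiva's SDK; the model and labels are placeholders.

import cv2
import numpy as np

# Hypothetical label set; Affectiva's taxonomy is broader (e.g., boredom, alertness).
EMOTIONS = ["joy", "surprise", "anger", "sadness", "fear", "disgust", "neutral"]

# Standard face detector shipped with opencv-python.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)


def classify_emotion(face_crop: np.ndarray) -> str:
    """Placeholder for a CNN trained on a large labeled face dataset.

    A real system would run a network trained on millions of annotated faces;
    here we return 'neutral' so the sketch stays runnable.
    """
    return "neutral"


def run(camera_index: int = 0, max_frames: int = 100) -> None:
    cap = cv2.VideoCapture(camera_index)
    for _ in range(max_frames):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Find faces, then classify the expression on each cropped face.
        for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
            crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
            print("detected emotion:", classify_emotion(crop))
    cap.release()


if __name__ == "__main__":
    run()
```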

Prior to the SoftBank announcement, Affectiva had also been working to implement Emotion AI in autonomous and connected cars. Here, a vehicle would be capable of recognizing the emotional states of drivers and passengers—even identifying states such as drowsiness and road rage—and acting or adjusting itself according to the occupant's emotional state. The car might adjust the AC, turn up the music volume to rouse a drowsy driver, or even pull itself over if a driver falls asleep. In an emergency, it's feasible to conceive of an autonomous car that could recognize a driver in distress (a heart attack, for example) and divert itself to the nearest hospital while alerting 9-1-1.
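To make that idea concrete, here is a hedged sketch of the kind of state-to-action policy the article describes. The state names and vehicle actions are illustrative assumptions, not a real automotive API or Affectiva's implementation.

```python
# Illustrative mapping from a detected occupant state to vehicle actions.
# All names below are hypothetical placeholders for this sketch.

from enum import Enum, auto
from typing import List


class DriverState(Enum):
    ALERT = auto()
    DROWSY = auto()
    ASLEEP = auto()
    DISTRESSED = auto()   # e.g., a suspected medical emergency
    ROAD_RAGE = auto()


def respond_to_driver(state: DriverState) -> List[str]:
    """Return a list of (hypothetical) vehicle actions for a detected state."""
    if state is DriverState.DROWSY:
        return ["lower_cabin_temperature", "raise_music_volume"]
    if state is DriverState.ASLEEP:
        return ["pull_over_safely", "activate_hazard_lights"]
    if state is DriverState.DISTRESSED:
        return ["route_to_nearest_hospital", "call_emergency_services"]
    if state is DriverState.ROAD_RAGE:
        return ["suggest_calming_music"]
    return []  # ALERT: no intervention needed


if __name__ == "__main__":
    print(respond_to_driver(DriverState.DROWSY))
```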

Neither company has made a formal announcement as to when the first Pepper units with Emotion AI installed will begin rolling out into markets. But SoftBank's Chamoux said this is “only the beginning,” one step in the ongoing effort to evolve Pepper into a robot that can interact more naturally with humans. “The partnership really signifies the next generation of human-machine interaction, as we approach a point where our interactions with devices and robots like Pepper more closely mirror how people interact with one another,” Chamoux said.

Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, and robotics.

 
