Robots get an empathy upgrade

Calling customer service is one of the most frustrating things we do on a regular basis.

But what if these automated phone systems were built to detect the caller's tone, sense their frustration, or even pick up on the color of their language (i.e., whether they're using profanities), and then redirect the call to a supervisor to defuse the situation?
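As a rough illustration of that escalation logic, here is a minimal sketch in Python. The cue lists, the two-cue threshold, and the function itself are all illustrative assumptions, not any vendor's actual system, which would use trained speech and sentiment models rather than keyword matching.

```python
# Hypothetical escalation rule: scan a transcribed caller utterance for
# frustration cues and profanity, and decide whether to hand the call
# to a human supervisor. Word lists and thresholds are invented.

FRUSTRATION_CUES = {"ridiculous", "unacceptable", "angry", "useless", "waited"}
PROFANITY = {"damn", "hell"}  # stand-in list; real systems use large lexicons

def should_escalate(transcript: str, cue_threshold: int = 2) -> bool:
    """Escalate immediately on profanity, or when enough
    frustration cues accumulate in one utterance."""
    tokens = [w.strip(".,!?").lower() for w in transcript.split()]
    if any(t in PROFANITY for t in tokens):
        return True
    return sum(t in FRUSTRATION_CUES for t in tokens) >= cue_threshold
```

A production system would replace the keyword sets with acoustic tone analysis and a trained sentiment classifier, but the routing decision at the end looks much the same.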

That possibility is already in the works. The research firm Invoca found that some younger consumers are not only comfortable with this arrangement but expect to see it happen sooner rather than later.

“AI can help service reps better understand previous customer actions, observed interests and preferences and align this information with available offers to quickly resolve issues,” said Adam Justis, director of product marketing for the Adobe Experience Cloud.

Think about the last time you spoke to a device, whether it was asking Siri for information or requesting a song from Alexa. This action has become commonplace in a remarkably short amount of time.

But what if the AI in our lives becomes emotionally aware as well as intelligent?

It’s becoming more desirable for devices including phones, appliances, vehicles and fitness trackers to be able to pick up on our very human emotions and quirks. Emotion tracking AI, or emotion AI, is a developing technology that has many companies intrigued.

An estimated 10 percent of devices will have some kind of emotion AI capability by 2022, suggests the research firm Gartner. This emotion AI could be built into devices – picking up on facial expressions through built-in cameras, for example – or offered as an add-on that lives in the cloud.

“Emotion AI will be an integral part of machine learning in the next few years – the reason being that we want to interact with machines that we like,” said Gartner analyst Annette Zimmerman at a technology conference in early 2019. “In the future, we will be interacting with smart machines much more than we do today, so we need to train these machines with emotions so they can understand humans better.”

The mood algorithm

In Homo Deus, Professor Yuval Noah Harari goes so far as to suggest that humans are nothing more than biological algorithms, learning what we can about interacting with others like us through centuries of evolution. Computers and smart technology are simply the next phases in that evolution. In fact, it’s possible that when emotion AI becomes more developed, our devices could be even better at dealing with humans than we are.

There are myriad other uses, too. The car insurance industry in the UK is interested in using AI specifically to help fight insurance fraud from car accidents. Research finds that an estimated 20% of UK drivers have lied to their insurance agents about crashes, to the tune of millions of pounds in ill-gotten gains.

Imagine if your refrigerator could pick up on a bad mood or a rough day based on the speed and force with which the door was closed, and could make recommendations for snacks based on your observed preferences? Or a smart car that can sense the driver’s anger and take precautions to keep at a safe speed to prevent crashes? Or a phone that picks up on a bad day at the office and sends all calls directly to voicemail to be dealt with later?
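The refrigerator scenario boils down to comparing a sensor reading against the user's own baseline. A toy sketch, with entirely made-up force readings and an invented 1.5x threshold:

```python
# Illustrative only: infer mood from the force of a door close
# (hypothetical sensor values, e.g. in newtons) relative to the
# user's recent average. Real emotion AI would fuse many signals.
from statistics import mean

def infer_mood(recent_forces: list[float], latest_force: float) -> str:
    """Flag a likely bad mood when the latest door close is much
    harder than this user's recent baseline."""
    baseline = mean(recent_forces)
    return "stressed" if latest_force > 1.5 * baseline else "calm"
```

The interesting design choice is the personal baseline: one person's slam is another's normal close, so the device learns the individual rather than applying a universal cutoff.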

Where we are on the emotion AI timeline

For now, these ideas are all conceptual, but they’re not so far-fetched.

Affectiva, a startup founded by MIT Media Lab researchers, discussed at a conference in February how it has trained machine learning algorithms to recognize facial cues that signal boredom and tiredness, which could be integrated into cars’ operational systems to make drivers safer.
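One widely used signal for tiredness of this kind is PERCLOS: the fraction of recent video frames in which the driver's eyes are mostly closed. Here is a hedged sketch; the per-frame eye-openness values are assumed to come from an upstream face-landmark model, and the 0.2 and 0.15 thresholds are illustrative, not Affectiva's.

```python
# PERCLOS-style drowsiness heuristic (sketch). Inputs are per-frame
# eye-openness ratios in [0, 1], assumed to come from a separate
# facial-landmark model. Thresholds here are illustrative.

def perclos(eye_openness: list[float], closed_below: float = 0.2) -> float:
    """Fraction of frames where the eyes count as closed."""
    closed = sum(1 for o in eye_openness if o < closed_below)
    return closed / len(eye_openness)

def is_drowsy(eye_openness: list[float], threshold: float = 0.15) -> bool:
    """Flag drowsiness when eyes are closed too often recently."""
    return perclos(eye_openness) >= threshold
```

In a car, the output would feed the vehicle's alerting or driver-assist systems rather than being shown to the driver as a raw score.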

Some smartphone apps and connected home devices are making very early inroads into picking up on emotions, but the biggest hurdle to their becoming emotionally intelligent is a lack of contextual information, such as facial expressions and tone of voice.

So much of our communication is non-verbal, with emotions conveyed through gestures, facial expressions, tone of voice, posture, etc. Emotion AI could potentially learn to pick up on all of those things and tune in to our humanity. Amazon, Google, Facebook, Microsoft and other leading tech companies are all spending big money on AI in the hopes of making devices and software even more reliable and sensitive to our needs without us having to explicitly ask for something.

About Futurithmic

It is our mission to explore the implications of emerging technologies, seeking answers to next-level questions about how they will affect society, business, politics and the environment of tomorrow.

We aim to inform and inspire through thoughtful research, responsible reporting, and clear, unbiased writing, and to create a platform for a diverse group of innovators to bring multiple perspectives.

Futurithmic is building the media that connects the conversation.
