When we interact with another person, two typical reactions pull us in different directions. We can either feel engaged and responsive, or turned off and stressed. In one situation we are interacting with someone who is reciprocating and engaging with us. In the other, we're confronted by indifference.
A middle ground between the two is partial attention — someone who is going through the motions. In this situation, a person is physically present and responds to our questions or provides information, but in a mechanical or rote manner, as when following a script or reciting the rules.
If a halfhearted contact carries no hint of feeling, no connection is possible. Partial empathy, like partial attention, leads to diminishing returns.
A human quality
In 1975, Dr. Edward Tronick's still face experiment demonstrated that we feel the impact of empathy early in our lives. He observed that after three minutes of “interaction” with a non-responsive, expressionless mother, an infant “rapidly sobers and grows wary. He makes repeated attempts to get the interaction into its usual reciprocal pattern. When these attempts fail, the infant withdraws [and] orients his face and body away from his mother with a withdrawn, hopeless facial expression.”
The still face experiment has been replicated successfully and used to test many hypotheses from person perception and communication to cross-cultural differences. Many aspects of our social cognition and behavior are already in place at a very young age.
This may explain our automatic annoyance at interruptions in conversational flow — averting our eyes too long to stare at a screen, turning around to talk on the phone, or making no eye contact at all with dinner partners while texting or chatting online. The other person's body language becomes detached from, and uncorrelated with, our own.
It's the opposite of connection and we associate negative emotions with this kind of interaction. As adults, our appreciation of emotional involvement matures. We can detect a fake smile — no sparkling eyes — and we can call BS on someone “just doing my job” efficiently.
Filmmakers Julie Bayer Salzman & Josh Salzman (Wavecrest Films) set out to communicate visually how a group of children deal with emotions so we can learn from them.
Desirable for machines
As technology continues to advance, the future looks more and more human. The developments in Artificial Intelligence (AI) are focusing on the ability to recognize emotions from facial expressions, words and body gestures. We're now talking about robots adjusting behavior in response to human cues.
Aldebaran Robotics designed the Japanese robot Pepper in a human shape, intending it “to be a genuine day-to-day companion.” Efficiency is something machines, when properly programmed, can deliver much faster and likely better than humans. But can they do empathy so it feels genuine?
To answer that question, we can look at the four qualities of empathy: (1) the ability to see the world as others see it; (2) to be nonjudgmental; (3) to understand another person's feelings; and (4) to communicate our understanding of those feelings.
The term empathy has been used to refer to two distinct, yet related human abilities — mental perspective taking and the vicarious sharing of emotion. Scientists call the first cognitive empathy and the second emotional empathy.
Cognitive empathy involves choices based on what we know of a situation and a person, which is why it matters to human relationships. It's the piece that involves putting ourselves in someone else's shoes — say a friend's or a customer's — and it may be difficult for AI. The process has two parts: know oneself, then feel for the other.
Two components to get right
To know oneself means being able to tap into personal motivations, strengths, weaknesses, history of successes and failures, and high and low points. To feel for the other, there needs to be enough overlap between our own experience and that of the other person.
When we relate to someone, we recognize our own struggles and joys in the ones that person shares with us. Researchers have found that people can read the six basic human expressions and that these are universal across cultures.
Pepper (and AI) can be programmed to recognize the basic six. But our emotions have greater nuance — like boredom, playfulness, embarrassment and also ambiguity. We may act brave but feel scared. Our ability to recognize and even mimic another person's emotions is what makes telling a personal story valuable.
We respond physically and not always visibly to emotions. Our reactions can signal different emotional states and it's our survival instinct, experience with a person, and context that help us decode them more accurately and feel empathy.
Is near enough good enough?
There's progress on the AI front. Military-funded research has developed artificial intelligence that can read and respond to human emotion. Researchers have been able to pick apart emotional intelligence, a quality once thought innately human, and reduce it to logical procedures and algorithms.
In the 1990s, psychologists Salovey and Mayer recognized emotional intelligence as a set of knowledge and skills distinct from other forms of intelligence. They were the first to define it as “the ability to monitor one's own and others' feelings and emotions, to discriminate among them, and to use this information to guide one's thinking and actions.”
It turns out that it's the level of interaction that matters. Which is good news for the development of virtual agents. Developers at the University of Southern California Institute for Creative Technologies (ICT) are working on a SimSensei project funded by DARPA.
The new generation of AI consists of virtual agents that can display high levels of artificial emotional intelligence and engage convincingly in back-and-forth interactions with people, according to Albert Skip Rizzo, psychologist and director of medical virtual reality at ICT.
What's interesting is that a session with Ellie, the star of the SimSensei project, starts with building rapport through background questions. The conversation kickoff is designed to start assessing emotional temperature. This is something worth reintroducing to human interactions, especially in service situations.
A machine is also by definition nonjudgmental — we may feel less social pressure to impress, and simply get on with our question or story. That explains the program's success in psychological support. When near enough gets the important part right, its usefulness increases.
The human challenge
When we experience empathy, we are able to focus on the other person. It's a highly desirable quality to have when working on ideas and products that emphasize service. What is desirable becomes more achievable with empathy.
Social networks may seem to bring out the worst in humanity, but that is not always the case. A few years ago, when news broke of the tragic earthquake and tsunami that affected so many lives in Japan, people responded in real time. Empathy was the protagonist again as the 2014 Winter Olympic Games unfolded.
Cognitive empathy, or our conscious ability to understand someone else's perspective, is a uniquely powerful, if often overlooked, tool for transforming and improving societies. We want and need to feel in touch with our fellow human beings.
“Deep down, I think there is a human desire for connectedness. We want to be engaged, trusted, and in touch. The troubling thing is that we are not often taught how to do that, so we approach problems at a distance, without any true sense of connectedness to the person for whom we’re solving the problem,” says Jerry Haselmeyer.
As we continue to develop AI to augment our capacity, the challenge is not to take the middle ground of partial empathy when it comes to the support people provide. We do sense connection, and its loss leads to diminishing returns. Empathy is a choice we should make, because we can.
Resources:
Empathy and Emotion as Seen Through the Eyes of Children
Why Cognitive Empathy Matters and How it Works in Tandem with Reason
Converging Two Narratives of Empathy