Can you teach a robot empathy? This B.C. researcher is trying to find out
Goal is to create robots that ‘treat us like actual humans’

Angelica Lim is explaining that the shiny, white robot in front of us responds only to the command, “Tell me a joke,” when it interrupts her.
“What’s the deal with bananas? I mean, they’ve got orange juice, they’ve got apple juice,” it says in a high-pitched, mechanical voice, emphasizing the word ‘juice’ in each sentence and gesturing with its arms. “Where’s the banana juice?”
This robot and its technology are both 10 years old, says Lim, an associate professor of computing science at Simon Fraser University, who also leads the Robots with Social Intelligence and Empathy (ROSIE) Lab.
While personal AI assistants such as Siri or Alexa can tell jokes, the difference with this robot — named Pepper — is in what happens next.
For several seconds, its large black-and-purple eyes focus on Lim’s face, which maintains a neutral expression.
“Oh. Well, this one kills in the robot-verse. Tee hee?”
Pepper has interpreted Lim’s expression to mean its joke hasn’t landed.
This is an important aspect of Lim’s work: teaching robots like Pepper to respond to non-verbal human cues.
The goal, says Lim, is to create robots that, when deployed out in the world (perhaps not as stand-up comics), “treat us like actual humans.”
Inspired by Disney
Lim, 42, whose parents were immigrants from the Philippines, grew up in the Los Angeles area, close to Disneyland.
Seeing Disney characters come to life through animatronics sparked a sense of wonder.
“And I just fell in love with that and also the technology. I always wanted to be a Disney imagineer. So, that's … my dream and I'm kind of living it.”
In graduate school in Japan, Lim was part of a team that created a musical robot, but the project was criticized by other scientists, who argued that music is a means of conveying emotion and robots don’t have emotions.
This became the question that consumed Lim during her PhD: Could robots ever have emotion? Is this even a good idea?
Understanding unwritten social rules
Lim’s thinking on this question has evolved. She now believes that even if it were possible, the ethical implications of creating something that could feel pain are problematic.
Plus, AI chatbots have taken empathy to something of an extreme, “where they're so overly empathetic that they are triggering delusions and psychoses, and it's absolutely horrible.”
So, her team’s focus is teaching robots how to respond appropriately to human body language and facial expressions.

Consider, for example, a robot delivering supplies in a hospital. How would the robot know not to cut through a group of people talking?
“There's different things that humans automatically know. They just know. And we want robots to just know. But it's hard, because no one's really written down those rules. And so part of what our lab does is try to figure that out,” Lim said.
To do this, her team needs to create algorithms from data on facial expressions and body language. Doing that in a way that is unbiased and respects privacy is one thing they are working on, Lim says.
A robotic shopping assistant?
Computer scientist AJung Moon leads the Responsible Autonomy and Intelligent System Ethics lab at Montreal’s McGill University, which investigates how interactive, intelligent machines influence human behaviour.
She sees value in having robots be able to pick up on nonverbal cues.
Her team is creating a robotic shopping assistant. She envisions a customer, perhaps with a visual impairment, in a country where they don’t speak the language. This fictional customer has a gluten intolerance and a peanut allergy and is looking for a snack. The robot is there to help.
“So if the robot is able to kind of pick up on the cue that says, ‘Oh, you really want to be in and out with the shopping. You are just looking to get the thing and get out of here.’ Maybe you don't want to be externalizing that out loud ... but it's actually quite related to non-verbal cues,” she explained.
Moon agrees there need to be guardrails around the use of such technology. The extent to which robots should simulate human emotion must be carefully considered and closely tied to their intended purpose, she says.

As for the crew of Pepper robots at SFU, there are 10, donated to the university by a now-defunct French company.
They’ve been used to teach coding to high schoolers and to teach Niitsíʼpowahsin, the Blackfoot language, to Indigenous studies students. One is at a school in Surrey teaching students Punjabi.
As for Lim, she's still thinking about making magical creatures come alive, as Disney did recently with a robot version of the snowman Olaf from Frozen.
“I like joy. I like to think that maybe robotics are not just about doing the dull and dirty jobs.”