Live from TEDWomen

Toward tech that can understand how we’re feeling: Rana el Kaliouby speaks at TEDWomen 2015

Rana el Kaliouby shares her work creating an “emotion engine” that will help computers and devices understand how we are feeling and react accordingly. Photo: Marla Aufmuth/TED

Most human beings can tell the difference between a smirk and a smile. Our computers and devices, however? Not so much.

Fifteen years ago, Rana el Kaliouby left her home in Egypt to study computer science at Cambridge. With her husband still in Egypt, she found herself homesick. “I realized I was spending more hours with my laptop than any other human,” she says. “Despite this intimacy, my laptop had no idea how I was feeling — whether I was happy or having a bad day.”

This gave her an idea. “Technology has lots of IQ, but not EQ,” she says. “What if our technology could sense our emotions? What if our devices could sense how we feel and react accordingly just the way an emotionally intelligent friend would?”

El Kaliouby started investigating this as a research project at MIT, then founded a company, Affectiva, to make the tech available to developers. She asks for a volunteer so she can show us how Affectiva’s “emotion engine” works. On an iPad, the engine senses the bounds of our volunteer’s face, and a constellation of points appears at the corners of her eyes, along the outline of her lips and around her nose. As our volunteer smiles, frowns and acts surprised, bars that measure joy, disgust, valence and engagement grow and shrink, flipping between green (for positive emotions) and red (for negative emotions).
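Affectiva’s actual engine isn’t shown in code here, but the detect-then-measure loop the demo describes can be sketched with off-the-shelf tools. Below is a minimal Python stand-in using OpenCV’s stock Haar cascades: it finds the face bounds, then draws a crude single-metric “joy” bar driven by the smile cascade. The thresholds and the one-bar display are invented for illustration; the real engine tracks far more points and emotions.

```python
# Crude stand-in for an emotion engine's detect-then-measure loop.
# NOT Affectiva's SDK -- an illustration using OpenCV's stock cascades.
# Requires: pip install opencv-python (and a webcam)
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        # Face bounds, as in the demo.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
        roi = gray[y:y + h, x:x + w]
        smiles = smile_cascade.detectMultiScale(roi, 1.7, 20)
        joy = min(len(smiles), 1)  # crude binary "joy" signal
        # Green bar for positive, red for neutral/negative; width tracks joy.
        color = (0, 255, 0) if joy else (0, 0, 255)
        bar_w = int(w * (0.2 + 0.8 * joy))
        cv2.rectangle(frame, (x, y + h + 5), (x + bar_w, y + h + 15), color, -1)
    cv2.imshow("emotion sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```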

Affectiva, says el Kaliouby, has collected the world’s largest emotion database: 12 billion emotion data points collected from 2.9 million face videos from volunteers in 75 countries.
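For a sense of scale, a quick back-of-the-envelope division of the figures quoted above:

```python
# Rough per-video density implied by the numbers el Kaliouby cites.
data_points = 12e9   # 12 billion emotion data points
videos = 2.9e6       # 2.9 million face videos
print(f"{data_points / videos:,.0f} emotion data points per video")  # ~4,138
```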

“It blows me away that we can now quantify something as personal as our emotions and that we can do it at scale,” she says. “So what have we learned?”

She walks us through a few key learnings.

  1. There are gender differences. Women tend to be more expressive than men; they smile more often, and their smiles last longer.
  2. There are cultural differences. While women in the US smile 40% more than men, the same difference wasn’t found in the UK.
  3. There are age differences. People who are 50 or older are 25% more emotive than younger people.
  4. We are expressive all the time. We show emotions “even when sitting in front of devices alone,” el Kaliouby says. “We express emotions when emailing, shopping online, even doing our taxes.”
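Findings like these are, at bottom, aggregate comparisons over labeled session data. As a purely hypothetical sketch (the schema and numbers below are invented for the example, not Affectiva’s), the same cuts reduce to a few group-bys in pandas:

```python
# Hypothetical illustration of the demographic comparisons above.
# Dataset, column names and values are invented for the example.
import pandas as pd

sessions = pd.DataFrame({
    "country":   ["US", "US", "UK", "UK", "US", "UK"],
    "gender":    ["F",  "M",  "F",  "M",  "F",  "M"],
    "age":       [34,   52,   61,   28,   45,   55],
    "smile_sec": [12.0, 8.0,  9.5,  9.0,  14.0, 10.0],  # smiling time per video
})

# Gender gap overall, then broken out by country (the US/UK contrast).
print(sessions.groupby("gender")["smile_sec"].mean())
print(sessions.groupby(["country", "gender"])["smile_sec"].mean())

# Age difference: 50-and-over vs. under 50.
sessions["age_50_plus"] = sessions["age"] >= 50
print(sessions.groupby("age_50_plus")["smile_sec"].mean())
```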

So what are the implications of this? El Kaliouby imagines emotion-sensing glasses that could help those who are visually impaired or on the autism spectrum interpret other people’s emotions. She imagines teaching apps that could speed up when a student looks bored. She jokingly imagines a fridge that knows when its owner is stressed and autolocks to prevent emotional eating.
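The bored-student example implies a simple feedback loop: read an engagement signal, adjust the lesson’s pace, repeat. Here is a hedged sketch of that loop, where get_engagement() is a hypothetical stand-in for whatever score an emotion engine would report:

```python
# Sketch of an engagement-driven pacing loop. get_engagement() is a
# hypothetical stand-in for an emotion engine's output (0.0 = bored,
# 1.0 = rapt); random values are used here purely for demonstration.
import random

def get_engagement() -> float:
    """Hypothetical emotion-engine reading."""
    return random.random()

def adjust_pace(pace: float, engagement: float) -> float:
    """Speed up when the student looks bored, ease off when engaged."""
    if engagement < 0.3:       # bored: move along faster
        pace = min(pace * 1.2, 2.0)
    elif engagement > 0.7:     # engaged: the current pace is working
        pace = max(pace * 0.95, 0.5)
    return pace

pace = 1.0  # 1.0 = normal lesson speed
for step in range(5):
    engagement = get_engagement()
    pace = adjust_pace(pace, engagement)
    print(f"step {step}: engagement={engagement:.2f} -> pace={pace:.2f}x")
```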

“When I was at Cambridge, what if I had access to a real-time emotion stream and could share that in a very real way with my family?” she asks.

“As more of our lives become digital, we are fighting a losing battle,” she continues. “What I’m trying to do instead is bring emotions into technology … We have a golden opportunity to reimagine how we connect with machines and therefore how we, as human beings, connect with one another.”