AI Pioneer Thinks Computers Can Get a Feel for Human Emotions

If you think computers can’t understand emotions, you’ve never talked to Rana el Kaliouby.

A pioneer in the field of artificial emotional intelligence (emotion AI), el Kaliouby is also the author of Girl Decoded: A Scientist’s Quest to Reclaim Our Humanity by Bringing Emotional Intelligence to Technology. In a recent 4Front podcast, she talked about her childhood in the Middle East and how she found her way into the burgeoning field of emotion AI.

Born in Cairo, Egypt, to parents in the high-tech industry, el Kaliouby got her bachelor’s and master’s degrees in computer science at the American University in Cairo before heading to the University of Cambridge to complete her Ph.D. Her plan was to obtain her doctorate and then return to Cairo to teach at her alma mater. But that all changed when MIT professor and AI pioneer Rosalind Picard gave a talk at Cambridge, and el Kaliouby was among a handful of students to meet Picard in person. It was far from a simple meet-and-greet, as the two women found they agreed on the future path of emotion AI technology, also known as affective computing. Their opinions aligned so closely that Picard quickly offered el Kaliouby a research job at MIT.

Picard and el Kaliouby eventually founded Affectiva, an MIT spinoff company specializing in software that detects human emotions from voice and facial expressions. Some of Affectiva’s earliest customers were marketers, who use the technology to gauge test subjects’ reactions to their products and brands. In fact, about 25 percent of Fortune Global 500 companies use Affectiva’s technology to test content. More recently, Affectiva has expanded into other industries, including automotive, where the company is developing safety systems that can detect whether drivers are distracted or drowsy.

The market for such affective computing applications is gaining momentum. According to a recent report from Market Growth Reports, the global affective computing market will expand at a whopping 34.2 percent compound annual growth rate, from $886.9 million in 2021 to $5.18 billion by the end of 2027. The report cited two major categories—emotional speech and facial affect detection—as drivers of much of that growth.

El Kaliouby is also enthusiastic about emotion AI’s other avenues of potential, with applications ranging from teleconferencing to mental health.

“We’re on this mission to humanize technology,” she says, “and the way we do that is we really believe that AI and technology in general has a lot of smarts. It has a lot of cognitive intelligence, but it really has no emotional intelligence at all.” Human intelligence includes not just cognitive ability but also emotional awareness, and el Kaliouby believes the same should be true of technology, “especially devices and technologies that are so deeply ingrained in our everyday lives.”

