This Emotionally Intelligent Device Is Helping Kids with Autism Form Bonds

"There was an instantaneous smile and alertness in his eyes that has never really been there before."

Mar 20 2018, 4:00pm


In June 2015, Ned Sahin paid a visit to a 23-year-old man named Danny who is on the autism spectrum. Danny can't speak, can't care for himself, and can't recognize or respond to human emotions. For most of his life, he's lived in a residential care facility in upstate New York.

Sahin is a neuroscientist and the founder of Brain Power, a tech company dedicated to creating wearable AI systems to help people with brain-related challenges like autism. That morning, Sahin brought Danny a Google Glass headset equipped with a program designed to help children with autism. Unsure if Danny could understand words, Sahin modeled the glasses himself and mimed instructions. Danny’s caretaker warned that he might fling the $1,500 device across the room in frustration, but Sahin handed it over, undeterred.

Not long after that, for the first time in as long as anyone could remember, Danny was calm. His arms, which usually swung back and forth, remained still. For a brief moment, he even leaned against his caretaker with affection.

“There he was, transformed,” Sahin says. “And I knew that I was onto something.”

Brain Power's flagship product, Empower Me, is a kind of digital coach that teaches social and cognitive skills to people with autism through emotional recognition. Unlike apps for autistic children designed for a tablet or smartphone, Empower Me runs on smart glasses, which require the child to look up and engage with his or her surroundings. Empower Me officially became available to the public in November through the company’s Indiegogo campaign.

Brain Power is just one of several competing boutique startups in the fast-growing market of affective computing, which now also includes companies like Emotient, Realeyes, Beyond Verbal, and Sension. The ability for AI to recognize and respond to feelings has had applications across fields as diverse as marketing, gaming, education, and most recently—with a surge in emotion-sensing wearable technologies—healthcare.

One market research firm predicted that the global affective computing market would grow from $12.2 billion in 2016 to almost $54 billion in 2021. For people with cognitive disabilities like autism—for whom the inability to recognize emotions is a major social impairment—machines can be a powerful tool for emotional interaction.

Sahin has been studying the brain for years. After earning degrees in cognitive neuroscience from MIT and Harvard, he began focusing his research on autism. “There was a moment at an MIT symposium where it hit me just how wide the gap between the medical system and the educational system was,” Sahin says. That lightbulb moment also happened to coincide with the time Google released their first version of Google Glass. Since 2013, Sahin has been trying to fill that gap with the latest in wearable computing and AI.

Empower Me uses emotion recognition software developed by Affectiva, an emerging leader in the field of emotional artificial intelligence. Almost three decades ago, Rosalind Picard, a professor at the MIT Media Lab, began looking into the intersection of emotions and machine learning. Most people at that time considered such a thing to be irrelevant, but Picard believed that giving machines emotional awareness was crucial in writing software that interacts with people. In 1997, she published her findings in a book, Affective Computing.

Without knowing it, Picard had given a name to the now extremely lucrative field of computer science that recognizes human emotions. The book set Rana el Kaliouby, then a student at the American University in Cairo, on a mission to create an algorithm that could read faces. She saw parallels between problems computers had with facial recognition and those of people with autism, and so she grounded her early research in using affective computing to assist autistic people with emotion identification.

In 2009, along with Picard, she founded Affectiva, a spin-off of the MIT Media Lab. Its goal was to help people with cognitive disabilities navigate social situations. As the software developed, interest from corporate sponsors began to overwhelm the team’s autism research. Kaliouby and Picard’s Affectiva became a model for the myriad ways emotion recognition could help companies form deeper connections with customers, whether it was Toyota better understanding driver behavior or Pepsi gauging consumer preferences. Brain Power was one of Affectiva’s earliest partners, and a way for Kaliouby to fulfill her initial goal of helping people with autism build emotional connections through technology.

Kaliouby believes that as chatbots and virtual assistants become more emotionally aware and able to recognize social cues, cultural norms, and variations in expression, products like Brain Power’s will become more and more common in the treatment of brain-related challenges like autism. “Using Emotion AI to help people build stronger connections with technology and other people is a really exciting concept,” Kaliouby says. “It has a lot of potential across healthcare and other industries.”

Like almost every product in the field, Affectiva’s signature software, Affdex, relies on the premise that human emotions are universal. The software scans for a face; if there are multiple faces, it isolates each one. It identifies the face’s main regions and how they move—the furrow of a brow, the widening of the lips—to build a detailed model of the face. The algorithm then identifies an emotional expression by comparing it with the millions of others it has previously analyzed. It scans for “micro-expressions” to differentiate between genuine and fake pain, between spontaneous bursts of joy and social smiles, and to detect fleeting moments of anger or fear that are imperceptible to the human eye.
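The pipeline described above can be sketched in a few lines of code. The toy example below is an illustration only, not Affectiva's actual implementation: it assumes a face has already been reduced to a handful of landmark-derived measurements (hypothetical features like brow furrow and lip-corner pull), and it stands in for the "compare with millions of previously analyzed faces" step with a simple nearest-neighbor lookup over labeled examples.

```python
import math

# Toy stand-in for an Affdex-style expression classifier (illustrative
# only). Each face is reduced to a few landmark-derived measurements;
# a new face is labeled by finding the closest previously analyzed,
# labeled example. Feature names and values are assumptions.

# (brow_furrow, lip_corner_pull, jaw_drop) -> emotion label
LABELED_EXAMPLES = [
    ((0.1, 0.9, 0.2), "joy"),       # raised lip corners
    ((0.9, 0.1, 0.1), "anger"),     # furrowed brow
    ((0.2, 0.2, 0.9), "surprise"),  # dropped jaw
]

def classify_expression(features):
    """Return the emotion label of the closest stored example."""
    best_label, _ = min(
        ((label, math.dist(features, example))
         for example, label in LABELED_EXAMPLES),
        key=lambda pair: pair[1],
    )
    return best_label

print(classify_expression((0.15, 0.85, 0.25)))  # -> joy
```

A production system would of course use a trained model over far richer features, but the shape of the computation — detect, measure, compare against labeled history — is the same.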

Brain Power takes the recognition process a step further. It generates a little emoji—a smiley face or an angry face—based on the data provided by the embedded AI. The cognitive capabilities of many people with autism are impaired by information overload, so Brain Power essentially gives the autistic person a “decoded” version of the other person’s facial expression, one with “no extra information,” as Sahin puts it.

The newest versions of Empower Me engage the user in a variety of games with rising difficulty levels. In one app, Emotion Charades, instead of one emoji, the autistic person sees two—one on either side of the other person’s face—and with a voice command or a tilt of the head has to determine which one most accurately describes the expression of the person in front of them to earn points. The emotional intelligence built into Empower Me’s suite of apps is not only designed to improve the user’s emotional understanding but also to measure whether he or she is feeling stressed or anxious, through changes in breathing or heart rate, and adjust the experience accordingly. The device will know, essentially before the parent does, when a child is “upset or hungry or in danger,” Sahin explains in a press video.
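The game logic of an Emotion Charades round can be sketched as follows. This is a toy reconstruction from the description above — the real app's scoring, input handling, and emotion set are not public, so the function names and point values are assumptions:

```python
import random

# Toy sketch of one Emotion Charades round (game logic only). The
# player sees two emoji -- the detected emotion plus one distractor --
# and picks a side with a head tilt or voice command.

EMOTIONS = ["joy", "anger", "surprise", "sadness"]

def make_round(detected_emotion, rng=random):
    """Return (left, right) choices: the correct label plus a distractor."""
    distractor = rng.choice([e for e in EMOTIONS if e != detected_emotion])
    pair = [detected_emotion, distractor]
    rng.shuffle(pair)  # randomize which side the correct answer is on
    return tuple(pair)

def score_pick(detected_emotion, picked, points=10):
    """Award points only when the pick matches the detected emotion."""
    return points if picked == detected_emotion else 0

left, right = make_round("joy")
# Exactly one of the two sides is correct, so the round is worth 10.
print(score_pick("joy", left) + score_pick("joy", right))  # -> 10
```

Randomizing the side of the correct answer matters: it keeps the player reading the other person's face rather than learning a fixed position.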


The last few years have seen an explosion of mobile apps and startups like Brain Power integrating emotional awareness to provide mental and physical healthcare beyond the capacities of the traditional medical system. The Israel-based emotion analytics company Beyond Verbal recently became the first company to use vocal intonations as biomarkers for neurological conditions like dyslexia and Parkinson’s as well as non-neurological disorders like heart disease. Ellie is a ‘virtual therapist’ developed at the University of Southern California’s Institute for Creative Technologies to treat individuals with PTSD and depression. At University College London, professor Nadia Berthouze is using affective computing to develop software that can measure pain.

Cathy Penny was first introduced to Empower Me through her daughter Danielle, an intern at Brain Power when the company was just beginning its clinical trials. Penny’s younger son, Jack, is a 21-year-old on the autism spectrum who, for all practical purposes, is non-verbal. His special-needs education is complemented by weekly speech and occupational therapy visits. Penny remembers very clearly when Jack first tested some of the company’s early software.

“I really can’t describe it with words,” she says. “But when Jack put the glasses on and looked back at me, there was an instantaneous smile and alertness in his eyes that has never really been there before. Any parent or loved one of a child with autism knows the difference between an artificial smile and genuine happiness. For the first time, I felt like it was genuine.”

Penny knew her son could understand basic emotions like happiness and sadness, but watching him play Emotion Charades told her he could also understand more complex ones like anger, frustration, and surprise. For more verbal kids and young adults, the highest levels of the app target a deeper level of emotion processing: after they correctly identify the emotion displayed by the person in front of them with a slight tilt of the head, the parent or caretaker receives a prompt for their communication-impaired child. Emotions become virtual currency.

“For young adults, skills like these will help them get a date or nail a job interview,” Sahin says.

In 2014, the story of the unusual relationship between a little boy on the autism spectrum and his smartphone’s virtual personal assistant was the subject of “To Siri, With Love,” a popular New York Times article (and now a book).

Author Judith Newman relates how the iPhone feature enabled her son Gus to have sustained conversations for hours on end about trains, buses, and “the difference between isolated and scattered thunderstorms.” Machines are infinitely patient. Humans are not. Siri gave Gus lessons in etiquette when she replied, “I don’t need much sleep, but it’s nice of you to ask” to Gus’s good-natured “Goodnight.” She forced him to enunciate his words in order to be understood.

Gus’s intimacy with Siri could reach new heights by 2025, believes Richard Yonck, who in March 2017 released a book on emotion AI, Heart of the Machine. He predicts that a chip to make most devices emotionally aware of their surroundings will exist in less than a decade. “I think just in the next year or two," he says, "we’re going to be surprised by how much more emotionally aware different virtual personal assistants like Siri, Alexa, or Cortana will be."

Envisioning what Yonck calls an “emotion economy" takes little imagination. In 2016, Apple acquired the emotion recognition firm Emotient. If Yonck’s predictions are true, within the next decade our smartphones might warn us to avoid sending an email when we’re angry, or suggest taking a power nap when our attention starts to wander. Already the iPhone’s pre-installed Health app aggregates data from other apps and medical providers to track fitness, sleep, and nutrition, all of which could be used to build emotional profiles of its user. The Apple Watch, another focus of Apple’s mood-related research, can track a user’s mood throughout the day through third-party apps like EmoWatch.

Yoram Levanon, the chief scientist of Beyond Verbal, hopes to combine work in emotion analytics and disease detection. “Picture living in a smart home where the system would be able to monitor both your emotions and your mental health,” Levanon says. For instance, a user displaying emotions of loneliness for an extended period of time would be alerted to possible signs of developing Alzheimer’s or heart disease.

The marriage of the glimmering emotion economy and the traditional healthcare system doesn’t come without privacy concerns, especially when the technological capacity to measure emotions via someone’s face or tone of voice is in the possession of private businesses. “The creation of an emotion economy by definition amounts to the commodification of emotion,” Yonck says. This tap into the unconscious will likely mean a shift in how we think about privacy.

Imagine sitting in a job interview, one in which you bend the truth—just a little—about your proficiency with certain Microsoft applications, all while your potential employer receives live updates about which of your statements are truthful, as well as your shifting degree of agitation. Human, a UK-based start-up whose software quantifies facial expressions, has partnered with banks seeking to do just that. They have also worked with sports teams to track players’ mental health before big games and with an unnamed government agency in the UK to single out individuals displaying “extreme emotions” in a crowd, to minimize suicide risk. Its CEO, Yi Xu, wouldn’t say how accurate the software is but claimed it surpasses the human average “by far.”

Xu skirted the question of privacy and surveillance that software of this sort raises. By using artificial intelligence, “we just want to minimize human bias,” she told me. But Sahin was quick to draw the distinction that Brain Power relies on emotion decoding, not facial recognition. “I don’t need to know who you are to know how you’re feeling,” he says. “Everyone smiles when they’re happy and drops their jaw a little when surprised. It’s what makes us human, collectively human. It’s not what makes us Joe or Mariana.”

In fact, Sahin argues that the boom in digital healthcare, and particularly mental digital healthcare, will offer patients a higher degree of privacy than traditional patient-doctor interaction. “There is no privacy in having the psychologist know all about your mental health and potentially your employer or your colleagues when you go to receive that mental healthcare,” Sahin says.

For parents of kids with cognitive disorders like Cathy Penny, who are lucky enough to afford therapy to begin with, one of the most frustrating things can be trying to translate what their child learned during that weekly 45-minute to hour-long session to everyday life. The standard $2,700 Empower Me system comes with a Google Glass headset, Brain Power’s suite of smartphone apps, and a web-based dashboard that gives parents and physicians an objective way to measure and demonstrate their child’s progress.

Jack still goes to therapy, but his mom thinks that Empower Me gives him an alertness in and out of school that 18 years of sessions never did. “With traditional therapy it’s a long, long road where the progress can be glacial,” Penny says.

Every day, machines get better at digesting vast amounts of information. “As we get into more complex diagnoses that include patient history, genetics, updated research, and so forth, only a computer will be able to consider all those levels of information properly,” Yonck says. He thinks we’re still a long way from therapists and physicians being replaced based on their ability to be emotive, but believes we are going to be doing a lot more of what he refers to as “co-work,” increasingly working in conjunction with technology to be “more efficient and productive than ever before.”

For many children with Autism Spectrum Disorder, technology offers an avenue to communicate their thoughts, wants, and needs that humans can’t provide. Systems like Empower Me show that machines don’t always isolate us, but can help us better engage with our surroundings. Six years after he began using Empower Me, Jack’s favorite activity is still watching videos on YouTube. Penny’s favorite activity now is talking to her son and watching him respond with intent, an intimacy between mother and son that has only been possible through the kindness of machine intelligence.

“You can’t really teach that,” Penny says. “Not in a natural sort of way.”
