How did Siri know how you were feeling? Welcome to the world of intonation analytics.
Humans read emotion in verbal communication all the time: "I don't like your tone!" "An edge crept into his voice." "She spoke soothingly." But the science of sentiment analysis is still relatively new and focused primarily on text.
As explained by data scientist Brian Kolo on the Opera Solutions blog, algorithms have a hard enough time differentiating "that's hot" (as in, good) from "that's hot!" (as in, you'll burn yourself). It seems a stretch to suggest they can understand how we feel based on our intonations.
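To see why text alone is so ambiguous, consider a toy bag-of-words sentiment scorer. (This is my own minimal sketch, not Kolo's example; the `LEXICON` table and `score` function are invented for illustration.) Both readings of "that's hot" produce identical text, so any purely lexical scorer must give them the same score:

```python
# Hypothetical word-level sentiment lexicon: positive words score > 0,
# negative words score < 0, unknown words score 0.
LEXICON = {"hot": 1.0, "great": 2.0, "awful": -2.0}

def score(text):
    """Sum per-word sentiment scores, ignoring punctuation and case."""
    return sum(LEXICON.get(word.strip("!?.,"), 0.0)
               for word in text.lower().split())

print(score("that's hot"))   # appreciative reading
print(score("that's hot!"))  # warning reading -- same words, same score
```

The two utterances differ only in intonation, which is exactly the signal a text-based approach throws away.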
"It's not what you say, but how you say it," explained Dan Emodi, vice president of marketing at Beyond Verbal. "Vocal intonations convey your mood, attitude, even personality."
The research began 18 years ago, when a physicist and neuropsychologist began exploring the way that babies understand language even before they understand words, Emodi said. The team discovered that intonation transcends language and culture.
"Patterns of happiness, sadness, aggression, introversion, extroversion, apathy, and action-orientation, they're all the same across languages," he said.
On Beyond Verbal's site, vocal analytics are applied to the speech of President Barack Obama, Steve Jobs, and Princess Diana. The company also analyzed Edward Snowden's June interview with The Guardian, finding that his voice betrayed some egocentricity and pride in orchestrating his NSA document leak.
You can listen to me, but do you hear me?
Like most sentiment analysis projects, Beyond Verbal started with a set of samples -- 16,000 audio clips, in this case -- and trained its algorithms using crowdsourced human evaluators. Temper and anger were easiest to detect accurately, Emodi said. "Call centers have plenty of aggressive tones." Over the years, the training sample grew to 32,000 clips.
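Beyond Verbal hasn't published its algorithms, but the general recipe Emodi describes, reduce each clip to acoustic features and learn labels from human-rated examples, can be sketched with a toy nearest-centroid classifier. Everything here is invented for illustration: the two prosodic features (mean pitch, vocal energy), the tiny labeled dataset, and the two emotion labels.

```python
import math

# Hypothetical training data: each clip reduced to (mean pitch in Hz,
# vocal energy 0-1), with a crowdsourced emotion label. Real systems
# use far richer intonation features and far more than four clips.
labeled_clips = [
    ((220.0, 0.9), "angry"),
    ((240.0, 0.8), "angry"),
    ((130.0, 0.3), "calm"),
    ((145.0, 0.4), "calm"),
]

def centroids(clips):
    """Average the feature vectors for each emotion label."""
    sums, counts = {}, {}
    for (pitch, energy), label in clips:
        p, e = sums.get(label, (0.0, 0.0))
        sums[label] = (p + pitch, e + energy)
        counts[label] = counts.get(label, 0) + 1
    return {label: (p / counts[label], e / counts[label])
            for label, (p, e) in sums.items()}

def classify(clip, cents):
    """Assign the label whose centroid is nearest in feature space."""
    return min(cents, key=lambda label: math.dist(clip, cents[label]))

cents = centroids(labeled_clips)
print(classify((230.0, 0.85), cents))  # high pitch and energy
print(classify((140.0, 0.35), cents))  # low pitch and energy
```

The crowdsourced evaluators supply the labels; growing the sample from 16,000 to 32,000 clips simply gives the learner more examples per emotion to average over.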
The potential applications are limited only by imagination, he added. Contact centers, market research firms, dating sites, search services, and many other businesses could benefit from this kind of technology.
I asked about law enforcement, and Emodi said the software has caught the attention of Homeland Security officials. Another audience member asked about healthcare applications, where the software could detect confusion in patients suffering from early stages of dementia, for example.
"We've been contacted by healthcare companies," Emodi said. "When we started this, we never even thought about healthcare."
Beyond Verbal's analytics happen in the cloud, so individuals can try it on the website or via the Moodies iOS app.
I gave it a shot this morning, and as you can see in the screenshot, it seems to have me pegged. "It helps us get a better understanding of our own selves," Emodi said.
What do you think, members? Do you see intonation analytics changing the way we interact with machines? What other applications might benefit from intonation analytics? Make your voice heard (but watch your tone) below.
— Michael Steinhart, Executive Editor, AllAnalytics.com