At last week's Sentiment Analysis Symposium in New York, Jacob Whitehill, a research scientist with Emotient, demonstrated the company's emotion recognition products. He showed how they isolate faces in a video stream and track their expressions, from joyful to angry to sad. "It has many commercial applications," he said.
Emotient provides an API that enables "real-time emotional analysis," and offers highly accurate readings of positive, negative, and neutral emotions based on "cognitive science, machine learning, and computer vision," Whitehill said. Mining large datasets of facial expressions, it can find patterns and sometimes even predict the way people will react to given stimuli. In addition to wide smiles and angry nostril flares, the software detects "microexpressions" like flashes of disgust or contempt.
According to Emotient's website, the API measures 28 facial action units, including eyebrow raises, nose wrinkles, lip curls, and jaw drops.
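To make the idea concrete, here is a minimal sketch of how per-frame action unit (AU) intensities might be combined into an emotion score. The AU names follow the standard Facial Action Coding System (AU6 = cheek raiser, AU12 = lip corner puller, AU4 = brow lowerer), but the weights and the scoring rule are illustrative assumptions, not Emotient's actual schema or API.

```python
# Illustrative sketch only -- the weights and rule below are assumptions,
# not Emotient's real model. One frame's AU intensities on a 0-1 scale:
frame = {"AU6": 0.8, "AU12": 0.9, "AU4": 0.1}  # cheek raise, smile, brow lower

def joy_score(aus):
    """Toy rule: 'joy' combines cheek raiser (AU6) and lip corner
    puller (AU12); brow lowering (AU4) counts against it."""
    return max(0.0, 0.5 * aus.get("AU6", 0.0)
                    + 0.5 * aus.get("AU12", 0.0)
                    - aus.get("AU4", 0.0))

print(round(joy_score(frame), 2))  # 0.5*0.8 + 0.5*0.9 - 0.1 = 0.75
```

A real system would, of course, learn such mappings from labeled data rather than hand-code them.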
The obvious use case is focus groups, with the software noting positive and negative reactions far more quickly and comprehensively than human observers can. In research for consumer packaged goods, Whitehill said, facial analysis was a more accurate predictor of "proclivity to buy" than the subjects' self-reporting. It wasn't so much that certain package designs evoked huge smiles; rather, "lack of negative reaction was a strong predictor."
In audience responses to ads, Emotient finds interesting differences and similarities between male and female viewers. This was borne out in an analysis of the Volkswagen "Wings" commercial that aired during this year's Super Bowl. The ad, in which German engineers sprout wings when a car reaches 100,000 miles, drew smiles from women during the elevator scene and smirks from men during the bathroom scene.
Emotient superimposes an emotion waveform over the video stimulus. Whitehill said this "enables a fine-grained temporal analysis that would be much harder to do with human annotators."
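The kind of fine-grained temporal analysis Whitehill describes can be sketched simply: take per-frame emotion scores, smooth them into a waveform, and locate the moment of peak response. The functions and sample scores below are assumptions for illustration, not Emotient's implementation.

```python
# Illustrative sketch, not Emotient's method: build a smoothed "joy"
# waveform from per-frame classifier scores and find its peak moment.

def moving_average(scores, window=5):
    """Smooth noisy per-frame scores with a simple moving average."""
    smoothed = []
    for i in range(len(scores)):
        lo = max(0, i - window // 2)
        hi = min(len(scores), i + window // 2 + 1)
        chunk = scores[lo:hi]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

def peak_time(scores, fps=30.0):
    """Return the timestamp (seconds) of the highest smoothed score."""
    smoothed = moving_average(scores)
    peak_frame = max(range(len(smoothed)), key=lambda i: smoothed[i])
    return peak_frame / fps

# Hypothetical per-frame joy scores for a short clip
joy = [0.1, 0.1, 0.2, 0.6, 0.9, 0.8, 0.3, 0.2, 0.1, 0.1]
print(round(peak_time(joy), 2))
```

Overlaying such a waveform on the video timeline is what lets analysts pinpoint exactly which scene triggered a reaction, rather than relying on viewers' after-the-fact recollections.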
A competing facial recognition provider, Affectiva, offers an interactive demo of this technology: it watches you through your webcam as you view a series of ads. The output looks similar to Emotient's.
Agitation and engagement
Whitehill said the consumer research firm Innerscope used Emotient software during the Super Bowl to observe a focus group divided between Denver Broncos and Seattle Seahawks fans. "The Denver fans grew more agitated during the game" -- not surprising to any human in the audience, but still noteworthy because the software detected the emotional shift.
In another example, Emotient was used to observe students in a classroom setting. "The engagement level of students during class had a very high correlation with test results." (Our own Ariella Brown wrote about this phenomenon in December.)
Stop staring at me
The emotion recognition industry is clearly on an upswing. Just last week, Emotient announced a $6 million round of funding and debuted a Google Glass app for measuring the sentiment of people within the wearer's field of vision. The first target for this app is the retail industry. "Salespeople who wear Glass can use it to measure how customers respond during their interactions and then get feedback that can help tailor their responses," Ingrid Lunden wrote for TechCrunch. This application is "particularly aimed at training for future situations, but also for real-time feedback."
I don't know about you, but I was OK with the idea of being observed in tightly defined and controlled situations. I was even amused by the accuracy of emotion detection algorithms -- until I read about this Glass project. As valuable as this kind of technology may be for training sales folks (or, for another example, helping individuals with social communication deficits), it's a little too creepy to think that anyone with a Glass headset may be applying analytics to my face.
What do you think? Share a smile or scowl -- verbally -- in our comments.
— Michael Steinhart, Executive Editor, AllAnalytics.com