Facial Analytics: What Are You Smiling at?

Some of us have our feelings written all over our faces. Others may pride themselves on being inscrutable. However, when a computer is analyzing our features frame by frame, it can glean insight from even the slightest quirk.

At last week's Sentiment Analysis Symposium in New York, Jacob Whitehill, a research scientist with Emotient, demonstrated the company's emotion recognition products. He showed how they isolate the faces in a video stream and track their expressions, from joyful to angry to sad. "It has many commercial applications," he said.

Emotient provides an API that enables "real-time emotional analysis," and offers highly accurate readings of positive, negative, and neutral emotions based on "cognitive science, machine learning, and computer vision," Whitehill said. Mining large datasets of facial expressions, it can find patterns and sometimes even predict the way people will react to given stimuli. In addition to wide smiles and angry nostril flares, the software detects "microexpressions" like flashes of disgust or contempt.

Emotient analyzes 28 facial expressions to determine the subject's emotional range.

According to Emotient's website, the API measures 28 facial action units, including eyebrow raises, nose wrinkles, lip curls, and jaw drops.
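To make that concrete, here is a toy sketch of what a per-frame action-unit reading and a simple emotion score might look like. The unit names, weights, and scoring rule are purely illustrative assumptions on my part, not Emotient's actual API:

```python
# Hypothetical sketch: a frame's action-unit intensities on a 0.0-1.0
# scale, and a toy weighted-sum mapping from units to emotions.
# These names and weights are made up for illustration.

frame = {
    "brow_raise": 0.1,
    "nose_wrinkle": 0.0,
    "lip_corner_pull": 0.8,   # the classic smile muscle
    "jaw_drop": 0.2,
}

EMOTION_WEIGHTS = {
    "joy": {"lip_corner_pull": 1.0},
    "surprise": {"brow_raise": 0.6, "jaw_drop": 0.4},
    "disgust": {"nose_wrinkle": 1.0},
}

def score_emotions(units):
    """Weighted sum of action-unit intensities for each emotion."""
    return {
        emotion: sum(units.get(au, 0.0) * w for au, w in weights.items())
        for emotion, weights in EMOTION_WEIGHTS.items()
    }

scores = score_emotions(frame)
dominant = max(scores, key=scores.get)
print(dominant)  # joy
```

A real system would, of course, infer the action-unit intensities from pixels with computer vision; this sketch only shows the shape of the data once those measurements exist.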

The obvious use case is for focus groups, with the software noting positive and negative reactions far more quickly and comprehensively than human observers. In research for consumer packaged goods, Whitehill said, facial analysis was a more accurate predictor of "proclivity to buy" than self-reporting by the subjects. It wasn't so much that certain package designs evoked huge smiles. "Lack of negative reaction was a strong predictor."

In audience response to ads, Emotient finds interesting differences and similarities between male and female viewers. This bore out in an analysis of the Volkswagen "Wings" commercial that aired during this year's Super Bowl. The ad, in which German engineers sprout wings when a car reaches 100,000 miles, drew smiles from women during the elevator scene and smirks from men during the bathroom scene.

Emotient superimposes an emotion waveform over the video stimulus. Whitehill said this "enables a fine-grained temporal analysis that would be much harder to do with human annotators."
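As a rough illustration of how such a waveform might be built (the method and numbers here are my own assumptions, not Emotient's), the sketch below averages per-frame positivity scores across a panel of viewers and smooths them so peaks line up with moments in the ad:

```python
# Sketch of an "emotion waveform": per-frame positivity scores from a
# panel of viewers, averaged and smoothed over the video timeline.
# The panel data is invented for illustration.

def emotion_waveform(panel_scores, half_window=1):
    """Average scores across viewers at each frame, then apply a
    centered moving average to smooth frame-to-frame jitter."""
    averaged = [sum(frame) / len(frame) for frame in zip(*panel_scores)]
    smoothed = []
    for i in range(len(averaged)):
        lo = max(0, i - half_window)
        hi = min(len(averaged), i + half_window + 1)
        smoothed.append(sum(averaged[lo:hi]) / (hi - lo))
    return smoothed

# Three viewers, six frames of positivity (0 = neutral, 1 = big smile).
panel = [
    [0.0, 0.1, 0.6, 0.9, 0.4, 0.1],
    [0.1, 0.0, 0.5, 0.8, 0.3, 0.0],
    [0.0, 0.2, 0.7, 1.0, 0.5, 0.2],
]

wave = emotion_waveform(panel)
peak_frame = max(range(len(wave)), key=wave.__getitem__)
print(peak_frame)  # 3 -- the frame where the panel smiled most
```

Overlaying a curve like this on the video is what lets an analyst point to the exact second a joke landed or fell flat.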

Affectiva's Affdex (shown) and Emotient layer emotional responses across a video timeline to pinpoint which moments draw which reactions.

A competing facial recognition provider, Affectiva, offers an interactive demo of this technology: it watches you through your webcam as you view a series of ads. The output looks similar to Emotient's.

Agitation and engagement
Whitehill said the consumer research firm Innerscope used Emotient software during the Super Bowl to observe a focus group divided between Denver Broncos and Seattle Seahawks fans. "The Denver fans grew more agitated during the game" -- not surprising to any human in the audience, but still noteworthy because the software detected the emotional shift.

In another example, Emotient was used to observe students in a classroom setting. "The engagement level of students during class had a very high correlation with test results." (Our own Ariella Brown wrote about this phenomenon in December.)

Stop staring at me
The emotion recognition industry is clearly on an upswing. Just last week, Emotient announced a $6 million round of funding and debuted a Google Glass app for measuring the sentiment of people within the wearer's field of vision. The first target for this app is the retail industry. "Salespeople who wear Glass can use it to measure how customers respond during their interactions and then get feedback that can help tailor their responses," Ingrid Lunden wrote for TechCrunch. This application is "particularly aimed at training for future situations, but also for real-time feedback."

I don't know about you, but I was OK with the idea of being observed in tightly defined and controlled situations. I was even amused by the accuracy of emotion detection algorithms -- until I read about this Glass project. As valuable as this kind of technology may be for training sales folks (or, for another example, helping individuals with social communication deficits), it's a little too creepy to think that anyone with a Glass headset may be applying analytics to my face.

What do you think? Share a smile or scowl -- verbally -- in our comments.

— Michael Steinhart, Executive Editor, AllAnalytics.com


Michael Steinhart, Contributing Editor

Michael Steinhart has been covering IT and business computing for 15 years, tracking the rising popularity of virtualization, unified fabric, high-performance computing, and cloud infrastructures. He is editor of The Enterprise Cloud Site, which won the Least Imaginative Site Name award in 2012, and he managed TheITPro.com, a community of IT professionals taking their first steps into cloud computing. From 2006 to 2012, Steinhart worked as an executive editor at Ziff Davis Enterprise, writing and managing research reports, whitepapers, case studies, magazine features, e-newsletters, blog posts, online videos, and podcasts. He also moderated and presented in dozens of webinars and virtual tradeshows. He got his start in IT journalism at CMP Media back in 1998, then moved to PC Magazine, managing the popular Solutions section and then covering business technology and consumer software. He holds a Bachelor of Arts degree in communications/journalism from Ramapo College of New Jersey.


A good idea
  • 3/10/2014 7:05:00 PM

Lack of negative reaction was a strong predictor.

Too bad this wasn't available back in 1958. Maybe it could have saved us from the Edsel debacle.

Re: A good idea
  • 3/10/2014 8:34:56 PM

That, and many other marketing disasters, I'm sure. But what about a company that follows similar practices (like maintaining a shroud of secrecy around new releases) and still makes money hand over fist? I'm talking about Apple, naturally. What does it do differently?

Re: A good idea
  • 3/11/2014 1:02:23 AM

Apple may be secretive, but they don't have a head-in-the-sand approach to marketing.

One of the challenges in technology fields is the rapid rate of change. A product that is state of the art today will be ho-hum in short order. This forces all technology companies to listen to their scientists and engineers. Too often this gets out of hand and the engineers control everything. We've all seen ads for electronics or other high-tech products that were just a list of gobbledygook features -- ads written in techno-speak by engineers.

Apple's success is using technology to make products that your 3-year-old and your grandmother can use. Products with features that regular people can understand.

Re: A good idea
  • 3/11/2014 8:23:22 AM

It's funny - I have nothing but problems trying to use iProducts. It's probably based on prejudice more than anything else. I wonder how that would register on my face...

Re: A good idea
  • 3/11/2014 9:44:10 AM

I agree. I wonder what it would register? Maybe Apple could use ideas like this in the design of future products. I suspect Steve Jobs was able to do the same thing innately.

Re: A good idea
  • 3/11/2014 10:39:42 AM

Liking a product is not the same as liking a commercial.

Facial analytics is perfect for analyzing the reaction to a Super Bowl commercial. Reactions can change second by second, and you can see it happen as people watch the commercial. If you analyze enough people, you can even do a demographic analysis to make sure there aren't some negatives somewhere that you've missed.

But commercials have your full attention and only last seconds or a couple of minutes at most. Products are a much longer term relationship. Many times when you're interacting with a product, the expression on your face will be driven by something other than the product.

Smiling while grimacing
  • 3/11/2014 1:51:34 PM

I have always found this type of analytics fascinating, especially when you compare a person's facial expression with their actual behavior. It allows you to distinguish performed behavior from true feelings. In my early research days, we used observational research to watch shoppers and then analyzed their answers to questions -- the disparity was incredible. Much of it was attributed to social performance cues. This may help companies determine true feelings about products and services rather than socially acceptable answers.

goog glass app
  • 3/11/2014 2:03:20 PM

this will revolutionize speed dating

Re: goog glass app
  • 3/11/2014 2:09:15 PM

Maybe, but there is more to it than just looks.

Re: A good idea
  • 3/11/2014 2:23:03 PM

It's an amazing feat to achieve the degree of accuracy they claim. Definitely applicable to marketing efforts, for starters. But I just can't see how they can effectively correlate the perceived expression to a specific item or point of emphasis in a scene. If two or more individuals look at a scene simultaneously, they don't necessarily observe it identically. Their attention may focus on different aspects of the scene. The measurements may be accurate, but to what are they related, specifically? Another question pertains to cultural considerations in the interpretation of facial expressions. Were those accounted for?
