We're learning new ways all the time too, like recent breakthroughs by Realeyes that allow a person's emotional state to be tracked through a number of metrics derived from their facial expressions.
It's amazing technology that opens up a lot of interesting doors: improving customer satisfaction, judging the effectiveness of a show or event, or gaining a deeper understanding of kinesiology.
The technique is now so cheap and easy, we're told, that it will not only offer deeper insight but also replace many traditional methods of gauging public interest, such as surveys and focus groups. People will still need to be taken aside and shown a product or service, but with emotional analytics, the suggestion is that we don't need to ask people what they thought; we can simply track what they felt.
We're also told that the accuracy of this system in predicting long-term sales is around 75 percent, 10 percentage points higher than traditional methods. Right now its main use is in gauging interest in online ads, with a webcam feed providing the necessary close-up video to analyze, but the scope is there for much, much more.
As we saw with our look at predictive policing, though, there are concerns that emotion tracking like this might be too cold. The danger with any data-driven, as opposed to human-driven, analysis is that it can skip over important factors that might otherwise be obvious.
For starters, there are shortcomings in the technology itself. As it stands, it is based almost entirely on facial expressions, which are only one part of someone's emotional state; full-body language may have its own tells. And while the existing technology has done a good job of predicting sales, it's not perfect, and has seen better results in predicting engagement on social media – a notoriously low-effort, low-impact activity.
It is also a very skin-deep analysis. While there is certainly a lot to be learned from our expressions when it comes to predicting our mood in the moment, tracking facial expressions doesn't go very deep. It doesn't explore the reason for an emotional reaction, or what the person is thinking behind the involuntary movements of their facial muscles.
With that in mind, it would be interesting to see a study in which tandem results from emotional analytics and traditional surveys are combined to provide deeper insight into an individual. I also wonder whether a real human with insight into behavior could analyze some of the footage and draw more intuitive meaning from it than an algorithm can from the placement of certain facial features.
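To make the tandem-study idea concrete, here is a minimal sketch of how the two signals might be blended. Everything here is an illustrative assumption – the field names, the 0–1 normalisation, and the weighting are hypothetical, and this is not Realeyes' actual method or API.

```python
# Hypothetical sketch: blending an automated emotion-engagement score with a
# self-reported survey score into one composite interest score per viewer.
# Weights and field names are assumptions for illustration only.

def composite_score(emotion_score, survey_score, emotion_weight=0.6):
    """Weighted average of an emotion-analytics score and a survey score,
    both normalised to the 0-1 range."""
    if not (0.0 <= emotion_score <= 1.0 and 0.0 <= survey_score <= 1.0):
        raise ValueError("scores must be normalised to the 0-1 range")
    return emotion_weight * emotion_score + (1 - emotion_weight) * survey_score

viewers = [
    {"id": "A", "emotion": 0.82, "survey": 0.70},  # animated viewer, rated ad 7/10
    {"id": "B", "emotion": 0.35, "survey": 0.60},  # flat expression, rated ad 6/10
]

for v in viewers:
    v["combined"] = composite_score(v["emotion"], v["survey"])
```

A simple weighted average is the crudest possible fusion; the interesting part of such a study would be learning the weight (or a richer model) from how well each signal predicts real-world outcomes like sales.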
Unfortunately, this may not happen any time soon, as one of the big selling points of automated facial analytics is the cost savings involved. Without having to hire someone to write questions, capture footage, or look through it, the whole process is cheaper and easier.
There is certainly a place for it, as there is for all types of analytics. But do you think this technology is too tempting to resist deploying on its own, without companion studies to produce more accurate results in a broader sense? Or will analytics firms take it for what it is: one tool in a varied arsenal?