The idea that you can read someone's emotions from their facial expressions is very enticing. Whilst facial coding looks promising on the surface, the science just doesn't support it. This article explains why facial coding technology and the 'insights' it generates do not stand up to scrutiny.

What is facial coding in the context of market research?

Facial coding is technology that detects the movement of facial muscles, links those movements to expressions, and codes the expressions as 'emotions'. Many of these technologies promise an 'extra layer of insight' and claim to quantify the consumer emotions evoked by concepts, adverts, animatics and other imagery or video stimuli.

What's the origin of thinking around facial coding?

Since Charles Darwin's 1872 book 'The Expression of the Emotions in Man and Animals', there has been a theory that human emotional expression is, to a certain extent, 'universal'. Darwin used images of expressions to see whether recognition was consistent across his small sample of respondents. These were the Darwin/Duchenne facial expression images [1], which were triggered by electrical stimulation and used for these early explorations. This early, rudimentary analysis concluded that there must be a 'core set' of universal emotions, and many psychologists agreed that certain emotions are indeed universal to all humans, regardless of culture: for example, anger, fear, surprise, disgust, happiness and sadness. At that time, Darwin worked on the assumption that facial expressions are a reliable and universal indicator of these emotions. The images and techniques he used are still employed by physicians today to assess facial-expression recognition when screening patients for autism or schizophrenia.

The evolution of facial coding

Jumping forward to today, there has been a rapid increase in the development of biometric technologies.
These include technologies designed to measure elements of our physiological response (heart rate, gait, posture, muscle contraction, etc.), demonstrating that there is a lot of interest in, and potential profit to be made from, trying to measure the human experience. In market research and marketing, it has long been known that understanding someone's emotional responses, biases and schemas is core to providing insights that are pertinent and can create actionable recommendations for our clients. So it is natural that there has been interest in biometric technologies that claim to link facial physiology and emotional response.

The big question for researchers is: can we reliably infer an emotional response from someone's facial expression? How confident are we that the science can prove that an expression can simply be 'facially coded' and then mapped to an emotional response? Well, the science shows us that this connection is not proven, no matter how exciting the technology.

Much of the underlying science is based on Basic Emotion Theory (BET), as developed by Paul Ekman in the 1970s [3]. The principle built on Darwin's theory that we are born with a limited number of basic emotion faculties. This suggests that there is some sort of heritable, physiological basis for emotions (e.g. neural pathways or brain features) and that one of the manifestations of each of these basic emotions is a prototypical facial expression. Over the years, this theory has become increasingly contested, particularly as later technology (such as fMRI) has shown that emotions do not occupy a specific neural network, nor are they linked to fixed physiological brain features. But this is often hard to fathom, as we've been taught for years that we have emotions and that we express them through our faces.
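To make the BET assumption concrete: facial-coding software essentially performs a lookup from detected muscle movements (FACS-style 'action units') to a small set of 'basic emotion' labels. The sketch below is a simplified illustration of that one-to-one mapping, not any vendor's actual model; the rule combinations are common textbook examples, and the function name is hypothetical.

```python
# Minimal sketch of the kind of lookup facial-coding tools perform.
# Action-unit (AU) names follow the FACS convention; the rules here
# are simplified illustrations, not any vendor's actual model.

BASIC_EMOTION_RULES = {
    frozenset({"AU6", "AU12"}): "happiness",          # cheek raiser + lip-corner puller
    frozenset({"AU1", "AU4", "AU15"}): "sadness",     # inner-brow raiser + brow lowerer + lip-corner depressor
    frozenset({"AU4", "AU5", "AU7", "AU23"}): "anger",
    frozenset({"AU1", "AU2", "AU5", "AU26"}): "surprise",
}

def code_expression(detected_aus):
    """Map a set of detected action units to a 'basic emotion' label.

    This fixed expression-to-emotion mapping is exactly the premise
    the article contests: the same muscle movements can accompany
    many different internal states.
    """
    detected = set(detected_aus)
    for rule, emotion in BASIC_EMOTION_RULES.items():
        if rule <= detected:          # all AUs in the rule were detected
            return emotion
    return "neutral"

print(code_expression({"AU6", "AU12"}))  # happiness
```

The point of the sketch is that, however sophisticated the muscle-movement detection, the final 'emotion' output is only as valid as this mapping — and it is precisely this mapping that the evidence below undermines.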
Every day we look at others and sense how they may be feeling; we relate to people in films and empathise with others based on their expressions. The problem is, as Einstein pointed out, 'Everything should be made as simple as possible, but not simpler.' As the science has developed, it has become clear that it's just not that simple [4].

The next important development to consider is the Theory of Constructed Emotion (TCE), developed by Lisa Feldman Barrett, who has gathered decades of evidence and studied emotions across cultures. She states that there are no 'basic' emotions and consequently no physiological manifestations in the face, body or brain that would allow you to accurately predict the emotions of others. This theory holds that the process by which we arrive at emotions is much more complicated, resting on a combination of external and sensory inputs, our core affect (how we're feeling at the time), the situational context and our memory of past experiences. The theory of constructed emotion states that you cannot access emotions by measuring physical changes in the body: there are no meaningful associations between neurophysiological parameters and the complex mental activities that motivate things like prescribing or purchasing behaviour. We don't read emotions like words on a page. Emotions do not 'live' in your face and body; a human has to connect them to the context to make them meaningful. We therefore have more control over our emotions than we think we do.

With so much interest in trying to understand the hidden emotional world, it's not surprising there is high investment in trying to make that happen. Most recently, in 2022, the Information Commissioner's Office (ICO) contributed to this debate, sharing a damning indictment of the applications of emotional AI and biometric facial-coding technologies [7].
As a respected, government-linked, independent body, the ICO's perspective holds a lot of weight, as many 'emotion AI' and facial coding specialists have only self-funded white papers, rather than peer-reviewed data, to substantiate their claims. The following quotation from Stephen Bonner (ICO Deputy Commissioner) is a powerful indictment of the application of these solutions in attempting to measure, and even predict, emotion: 'We're saying to organisations, it's silly, don't do it. It's not that these technologies are immature, it's that they will never work. From what we hear from scientists, it's based on a false premise of how people work.' The ICO refers to a range of scientific findings, suggesting it's a 'tempting possibility' to see into the heads of others and that 'we'd all love it if there were technologies that could solve some of these big difficult questions', but ultimately 'all the scientists we talk to tell us this doesn't work. And it's not that it's a difficult technology that's going to be cracked one day; it's that there's no meaningful link between our inner emotional state and the expression of that on our faces. So technologies that claim to be able to infer or understand our emotions are fundamentally flawed.'

In short, the science suggests we cannot reliably infer an emotional response from a facial expression, which means facial coding won't add value to research, because it can't allow us to understand more deeply what's going on in someone's mind. So how do we access the underlying motivations and drivers, the emotional responses and the hidden clues behind our behaviour? The way we always have at HRW: with bespoke, highly thoughtful research design backed by proven behavioural and data science*. We continue to follow developments in emotion theory and technology closely, and where relevant will run our own self-funded studies to investigate their potential.
If you'd like to find out more about facial coding and our own self-funded studies, reach out to the team.

References:
- HRW blog on why neuromarketing techniques can't tell us if you'd rather have Pepsi or Coke
- MMR perspective on predicting emotions from facial expressions
- Biometric data guidance: Biometric recognition | ICO
- biometrics-insight-report.pdf (ico.org.uk)
- Tech Life – 'This is junk science': The UK takes aim at biometric tech – BBC Sounds
- Facial Expressions Do Not Reveal Emotions | Scientific American