24 May, 2023

The Validity of Facial Emotion Recognition Technologies: The Impact of Emotion AI on Human Behavioral Research

Today’s episode features a Q&A with our own Graham Page. Graham leads the Media Analytics business unit at Affectiva, a Smart Eye company, as Global Managing Director of Media Analytics. Before joining Affectiva, he spent 26 years at Kantar as Executive VP and Head of Global Research Solutions, where he pioneered the integration of biometric and behavioral measures into mainstream brand and advertising research.

Over the last year or so, there has been an ongoing thread of debate in the media about the validity and ethics of facial emotion recognition. This debate has often reflected the views of data privacy groups concerned about the use of facial technologies across several use cases, or the opinions of commercial interests offering alternative biometric technologies or traditional research methodologies.

Scrutiny of emerging technologies is vital, and the concerns raised are important points for debate. Affectiva has led the development of the Emotion AI field, and of automated facial expression analysis in particular, for over a decade. Listen in to learn more.

 

Links of interest:

– [Podcast Episode] Lisa Feldman Barrett on Challenges in Inferring Emotion from Human Facial Movement

– [Blog] Face Value: The Power of Facial Signals in Human Behavioral Research

 

Additional Sources Referenced: 

[1] Barrett, Lisa Feldman, et al. “Emotional expressions reconsidered: Challenges to inferring emotion from human facial movements.” Psychological Science in the Public Interest 20.1 (2019): 1-68.

[2] Ekman, Paul, and Wallace V. Friesen. “Facial action coding system.” Environmental Psychology & Nonverbal Behavior (1978).

[3] Rosenberg, Erika L., and Paul Ekman, eds. What the face reveals: Basic and applied studies of spontaneous expression using the Facial Action Coding System (FACS). Oxford University Press, 2020.

[4] Martinez, Brais, et al. “Automatic analysis of facial actions: A survey.” IEEE Transactions on Affective Computing 10.3 (2017): 325-347.

[5] McDuff, Daniel, et al. “AFFDEX SDK: a cross-platform real-time multi-face expression recognition toolkit.” Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems. 2016.

[6] Bishay, Mina, et al. “AFFDEX 2.0: A Real-Time Facial Expression Analysis Toolkit.” arXiv preprint arXiv:2202.12059 (2022). Accepted at the FG2023 conference.

[7] McDuff, Daniel, et al. “Predicting ad liking and purchase intent: Large-scale analysis of facial responses to ads.” IEEE Transactions on Affective Computing 6.3 (2014): 223-235.

[8] Kodra, Evan, et al. Do emotions in advertising drive sales? https://ana.esomar.org/documents/do-emotions-in-advertising-drive-sales–8059.

[9] McDuff, Daniel, and Rana El Kaliouby. “Applications of automated facial coding in media measurement.” IEEE Transactions on Affective Computing 8.2 (2016): 148-160.

[10] Teixeira, Thales, Rosalind Picard, and Rana El Kaliouby. “Why, when, and how much to entertain consumers in advertisements? A web-based facial tracking field study.” Marketing Science 33.6 (2014): 809-827.

[11] McDuff, Daniel, et al. “Automatic measurement of ad preferences from facial responses gathered online.” Image and Vision Computing 32.10 (2014): 630-640.
