ESG Risk Management
EY Tax Lab // 2022
Summary
AI’s challenges with transparency and explainability have become ingrained in every stage of the hiring process over the last decade. In this pilot study, we designed a mock interview experiment to quantify the impact of AI-driven facial emotion recognition. We wanted to know: Can these AI systems consistently and accurately measure emotions and objectively deduce behaviors from emotion-tracking data?
We conducted 9 remote mock interviews and analyzed the responses with an open-source Facial Expression Recognition (FER) model in Python that performs sentiment analysis on images and videos. We then curated individualized analytics to understand the impact of AI emotion-tracking on video interviews and how such tools can support effective mock video interview preparation.
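As a rough sketch of how such a per-video analysis might look, the snippet below assumes the open-source `fer` Python package; the library choice, file name, and analytics step are illustrative assumptions, not the study's actual pipeline.

```python
# Sketch of frame-by-frame emotion analysis of a recorded mock interview,
# assuming the open-source `fer` package. File name is hypothetical.
from fer import FER, Video

# MTCNN face detection is slower but more accurate than the default cascade.
detector = FER(mtcnn=True)

# Load a recorded mock interview and score emotions on each frame.
video = Video("mock_interview_01.mp4")
raw_data = video.analyze(detector, display=False)

# Flatten the per-frame results into a DataFrame, keep the first detected face,
# and reduce to the emotion columns (angry, disgust, fear, happy, sad, surprise, neutral).
df = video.to_pandas(raw_data)
df = video.get_first_face(df)
emotions = video.get_emotions(df)

# Example of individualized analytics: average score per emotion across the interview.
print(emotions.mean())
```

Per-frame scores like these can then be aggregated per question or over time to give interviewees a picture of how the model read their expressions during each answer.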
While facial recognition adds complexity and stress to interview settings, emotion-tracking outputs can be used to build self-awareness for behavioral interviews. We hope to empower candidates who are interviewed with AI and to encourage transparency and helpful feedback loops from AI interview-prep companies.