Leveraging Multimodal Behavioral Analytics for Automated Job Interview Performance Assessment and Feedback
Behavioral cues play a significant part in human communication and cognitive
perception. In most professional domains, employee recruitment policies are
framed such that both professional skills and personality traits are adequately
assessed. Hiring interviews are structured to broadly evaluate a potential
employee's suitability for the position: their professional qualifications,
interpersonal skills, and ability to perform in critical and stressful
situations under time and resource constraints. Therefore, candidates
need to be aware of their strengths and weaknesses and be mindful of
behavioral cues that might adversely affect their success. We propose a
multimodal analytical framework that analyzes the candidate in an interview
scenario and provides feedback for predefined labels such as engagement,
speaking rate, and eye contact. We perform a comprehensive analysis that
includes the interviewee's facial expressions, speech, and prosodic
information, using the video, audio, and text transcripts obtained from the
recorded interview. We use these multimodal data sources to construct a
composite representation, which is used for training machine learning
classifiers to predict the class labels. Such analysis is then used to provide
constructive feedback to the interviewee for their behavioral cues and body
language. Experimental validation showed that the proposed methodology achieved
promising results.

Comment: 9 pages, ACL 202
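The fusion-and-classification step the abstract describes can be sketched as early (feature-level) fusion: each modality is reduced to a fixed-length feature vector, the vectors are concatenated into a composite representation, and a classifier is trained on it. The feature names and the nearest-centroid classifier below are illustrative assumptions, not the paper's actual extractors or models.

```python
# Minimal sketch of feature-level multimodal fusion, assuming each modality
# (video, audio, text) has already been reduced to a fixed-length vector.
# Feature names and the nearest-centroid classifier are illustrative
# stand-ins, not the paper's actual pipeline.

def fuse(video_feats, audio_feats, text_feats):
    """Concatenate per-modality features into one composite vector."""
    return list(video_feats) + list(audio_feats) + list(text_feats)

class NearestCentroid:
    """Tiny classifier: predict the label whose mean vector is closest."""
    def fit(self, X, y):
        by_label = {}
        for x, label in zip(X, y):
            by_label.setdefault(label, []).append(x)
        # Centroid = per-dimension mean of all training vectors for a label.
        self.centroids = {
            label: [sum(col) / len(rows) for col in zip(*rows)]
            for label, rows in by_label.items()
        }
        return self

    def predict(self, x):
        def sq_dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lbl: sq_dist(self.centroids[lbl]))

# Toy composite vectors:
# [smile_ratio, gaze_on_ratio] + [speech_rate] + [normalized_word_count]
X = [
    fuse([0.8, 0.9], [0.7], [0.6]),  # engaged
    fuse([0.7, 0.8], [0.6], [0.7]),  # engaged
    fuse([0.2, 0.3], [0.3], [0.2]),  # not engaged
    fuse([0.1, 0.2], [0.2], [0.3]),  # not engaged
]
y = ["engaged", "engaged", "not_engaged", "not_engaged"]

clf = NearestCentroid().fit(X, y)
print(clf.predict(fuse([0.75, 0.85], [0.65], [0.55])))  # → engaged
```

In practice each predefined label (engagement, speaking rate, eye contact) would get its own classifier over the same composite representation, and the predicted classes would drive the feedback given to the interviewee.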