Fair and Flexible?! Explanations Can Improve Applicant Reactions Toward Asynchronous Video Interviews
Asynchronous video interviews are increasingly used for the preselection of potential job candidates. However, recent research has shown that they are less accepted by applicants than face-to-face interviews. Our study aimed to identify ways to improve perceptions of video interviews by using explanations that emphasize standardization and flexibility. Our results showed that an explanation stressing the higher level of standardization improved fairness perceptions, whereas an explanation stressing the flexibility of interview scheduling improved perceptions of usability. Additionally, the improvement in fairness perceptions in turn influenced perceived organizational attractiveness. Furthermore, older participants were less accepting of video interviews. Practical implications and recommendations for future research are discussed.
Do ethnic, migration-based, and regional language varieties put applicants at a disadvantage? A meta-analysis of biases in personnel selection
This meta-analysis examined biases in personnel selection owing to applicants' use of non-standard language, such as ethnic and migration-based language varieties or regional dialects. The analysis summarized the results of 22 studies with a total N of 3615 raters that compared applicants speaking with an accent or dialect to applicants speaking the standard language. The primary studies used different standard and non-standard languages and assessed different dependent variables related to hiring decisions in job interviews. The k = 109 effect sizes (Hedges' g) were assigned to the dependent variables of competence, warmth, and hirability. Non-standard speakers were rated as less competent (δ = −0.70), less warm (δ = −0.17), and less hirable (δ = −0.51) than standard speakers. Thus, at the same level of competence, non-standard speakers are rated lower than standard speakers and might, therefore, be disadvantaged in personnel selection contexts. We also considered several potential moderator variables (e.g., applicants' specific language variety, raters' own use of non-standard language, and raters' background) but found only rather limited support for them. Furthermore, publication bias had only limited effects. Practical implications for personnel selection are discussed.
The relationship between the ability to identify evaluation criteria and integrity test scores
It has been argued that applicants who have the ability to identify what kind of behavior is evaluated positively in a personnel selection situation can use this information to adapt their behavior accordingly. Although this idea has been tested for assessment centers and structured interviews, it has not been studied with regard to integrity tests (or other personality tests). Therefore, this study tested whether candidates' ability to identify evaluation criteria (ATIC) correlates with their integrity test scores. Candidates were tested in an application training setting (N = 92). The results supported the idea that ATIC also plays an important role in integrity tests. New directions for future research are suggested based on this finding.