Using NLP to resolve mismatches between jobseekers and positions in recruitment

Recruitment through online portals has increased dramatically in recent decades, and job seekers face an overwhelming amount of data when trying to identify positions that align with their skills and qualifications. This research addresses the problem by investigating automatic approaches, built on recent developments in Natural Language Processing (NLP), that search, parse, and evaluate this often unstructured data in order to find appropriate matches. We present a benchmark suite consisting of an annotation schema, a training corpus, and a baseline model for Entity Recognition (ER) in job descriptions, published under a Creative Commons licence. The dataset contains 18.6k entities of five types: Skill, Qualification, Experience, Occupation, and Domain. Our benchmark Conditional Random Fields (CRF) ER model achieves an F1 score of 0.59, while our best-performing ER model, based on Bidirectional Encoder Representations from Transformers (BERT), achieves an F1 score of 0.73. We consider different ways of framing the matching problem and develop Machine Learning (ML) models to address each, proposing that the Natural Language Inference (NLI) paradigm aligns most closely with the matching problem. Our best-performing matching model utilises decomposable attention and achieves an F1 score of 0.73 on a job application success prediction task. Finally, we integrate the ER and success prediction models into a cohesive pipeline that predicts whether a given job application made by a user will be successful, which can be extended into a system that recommends suitable jobs to a user. Although this pipeline performs worse than a simpler input truncation approach, we suggest performance may be limited by the ER component used for feature selection and by the entity encoding process.
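To make the ER task concrete, the following is a minimal sketch of a CRF entity recogniser over BIO-tagged job-description text, using the sklearn-crfsuite library. The feature set and the toy training pair are illustrative assumptions, not the authors' configuration; real training would use the 18.6k-entity corpus described above.

```python
# Minimal CRF entity-recognition sketch for job descriptions, assuming
# BIO tags over the paper's five entity types (Skill, Qualification,
# Experience, Occupation, Domain). Features below are illustrative.
import sklearn_crfsuite

def token_features(sent, i):
    """Hand-crafted features for token i of a tokenised sentence."""
    word = sent[i]
    feats = {
        "word.lower": word.lower(),
        "word.istitle": word.istitle(),
        "word.isdigit": word.isdigit(),
        "suffix3": word[-3:],
    }
    if i > 0:
        feats["prev.lower"] = sent[i - 1].lower()
    else:
        feats["BOS"] = True  # beginning of sentence
    if i < len(sent) - 1:
        feats["next.lower"] = sent[i + 1].lower()
    else:
        feats["EOS"] = True  # end of sentence
    return feats

def sent_to_features(sent):
    return [token_features(sent, i) for i in range(len(sent))]

# Toy training pair; a real run would fit on the annotated corpus.
train_sents = [["5", "years", "experience", "with", "Python"]]
train_tags = [["B-Experience", "I-Experience", "I-Experience", "O", "B-Skill"]]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1,
                           max_iterations=100, all_possible_transitions=True)
crf.fit([sent_to_features(s) for s in train_sents], train_tags)
print(crf.predict([sent_to_features(["Python", "developer", "wanted"])]))
```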
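The NLI framing can be illustrated by treating the candidate's profile as the premise and the job's requirements as the hypothesis, with predicted entailment read as a match. The sketch below substitutes an off-the-shelf cross-encoder NLI model (roberta-large-mnli) for the paper's decomposable attention model, purely for illustration; the example texts are invented.

```python
# NLI-style matching sketch: premise = candidate profile,
# hypothesis = job requirements. 'ENTAILMENT' is read as a match.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "roberta-large-mnli"  # stand-in NLI model, not the authors' own
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

premise = "Candidate: 5 years of Python development, BSc in Computer Science."
hypothesis = "The role requires strong Python skills and a relevant degree."

inputs = tok(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[int(logits.argmax())])  # e.g. 'ENTAILMENT'
```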
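One way the two components might compose into the success prediction pipeline is sketched below: entities extracted from the CV and the job advert are serialised into premise/hypothesis strings and passed to the NLI matcher. The extract and entails interfaces here are hypothetical glue for illustration, not the authors' API.

```python
# Hypothetical end-to-end composition of the ER and matching models.
def predict_application_success(cv_text, job_text, er_model, nli_model):
    cv_entities = er_model.extract(cv_text)    # hypothetical ER interface
    job_entities = er_model.extract(job_text)
    # Serialise typed entities into premise/hypothesis strings.
    premise = "; ".join(f"{e.type}: {e.text}" for e in cv_entities)
    hypothesis = "; ".join(f"{e.type}: {e.text}" for e in job_entities)
    return nli_model.entails(premise, hypothesis)  # hypothetical matcher call
```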