Toward Gender-Inclusive Coreference Resolution
Correctly resolving textual mentions of people fundamentally entails making
inferences about those people. Such inferences raise the risk of systemic
biases in coreference resolution systems, including biases that can harm binary
and non-binary trans and cis stakeholders. To better understand such biases, we
foreground nuanced conceptualizations of gender from sociology and
sociolinguistics, and develop two new datasets for interrogating bias in crowd
annotations and in existing coreference resolution systems. Through these
studies, conducted on English text, we confirm that without acknowledging the
complexity of gender and building systems that recognize it, we build systems
that lead to many potential harms.

Comment: 28 pages; ACL version