Fairness and Bias in Algorithmic Hiring
Employers are adopting algorithmic hiring technology throughout the
recruitment pipeline. Algorithmic fairness is especially applicable in this
domain due to its high stakes and structural inequalities. Unfortunately, most
work in this space provides only a partial treatment, often constrained by two
competing narratives: one optimistically focused on replacing biased recruiter
decisions, the other pessimistically pointing to the automation of discrimination.
Whether, and more importantly what types of, algorithmic hiring can be less
biased and more beneficial to society than low-tech alternatives currently
remains unanswered, to the detriment of trustworthiness. This multidisciplinary
survey caters to practitioners and researchers with a balanced and integrated
coverage of systems, biases, measures, mitigation strategies, datasets, and
legal aspects of algorithmic hiring and fairness. Our work supports a
contextualized understanding and governance of this technology by highlighting
current opportunities and limitations, and by providing recommendations for
future work to ensure shared benefits for all stakeholders.
My friends, editors, algorithms, and I: Examining audience attitudes to news selection
Prompted by the ongoing development of content personalization by social networks and mainstream news brands, and recent debates about balancing algorithmic and editorial selection, this study explores what audiences think about news selection mechanisms and why. Analysing data from a 26-country survey (N=53,314), we report the extent to which audiences believe story selection by editors and story selection by algorithms are good ways to get news online and, using multi-level models, explore the relationships that exist between individuals’ characteristics and those beliefs. The results show that, collectively, audiences believe algorithmic selection guided by a user’s past consumption behaviour is a better way to get news than editorial curation. There are, however, significant variations in these beliefs at the individual level. Age, trust in news, concerns about privacy, mobile news access, paying for news, and six other variables all had significant effects. Our results are partly in line with current general theory on algorithmic appreciation, but diverge in our findings on the relative appreciation of algorithms and experts, and in how the appreciation of algorithms can differ according to the data that drive them. We believe this divergence is partly due to our study’s focus on news, showing that algorithmic appreciation has context-specific characteristics.
New Data Security Requirements and the Proceduralization of Mass Surveillance Law after the European Data Retention Case
This paper discusses the regulation of mass metadata surveillance in Europe through the lens of the landmark judgment in which the Court of Justice of the European Union struck down the Data Retention Directive. The controversial directive obliged telecom and Internet access providers in Europe to retain metadata of all their customers for intelligence and law enforcement purposes, for a period of up to two years. In the ruling, the Court declared the directive in violation of the human rights to privacy and data protection. The Court also confirmed that the mere collection of metadata interferes with the human right to privacy. In addition, the Court developed three new criteria for assessing the level of data security required from a human rights perspective: security measures should take into account the risk of unlawful access to data, and the data’s quantity and sensitivity. While organizations that campaigned against the directive have welcomed the ruling, we warn of the risk of proceduralization of mass surveillance law. The Court did not fully condemn mass surveillance that relies on metadata, but left open the possibility of mass surveillance if policymakers lay down sufficient procedural safeguards. Such proceduralization brings systematic risks for human rights. Government agencies, with ample resources, can design complicated systems of procedural oversight for mass surveillance, and claim that mass surveillance is lawful even if it affects millions of innocent people.
Does everyone have a price? Understanding people’s attitude towards online and offline price discrimination
Online stores can present a different price to each customer. Such algorithmic personalised pricing can lead to advanced forms of price discrimination based on the characteristics and behaviour of individual consumers. We conducted two consumer surveys among representative samples of the Dutch population (N=1233 and N=1202) to analyse consumer attitudes towards a list of examples of price discrimination and dynamic pricing. A vast majority finds online price discrimination unfair and unacceptable, and thinks it should be banned. However, some pricing strategies that have been used by companies for decades are almost equally unpopular. We analyse the results to better understand why people dislike many types of price discrimination.
The Right to Communications Confidentiality in Europe: Protecting Privacy, Freedom of Expression, and Trust
Short paper: initial recommendations for the design of privacy management tools for smartphones
The continuing rise in the popularity of smartphones has led to an accompanying rise in the exposure of users to privacy threats, as in the case of unintended leakage of personal information from apps. To improve transparency and the ability of users to control data leakage, the design of privacy-enhancing tools aimed at reducing the burden of informed privacy decisions should be grounded in users’ tacit needs and preferences. To this end, the present study explores users’ perceptions of and concerns about privacy, as well as their expectations. Initial recommendations include: (1) consideration of users’ preferences for preserving the functionality of their apps, and informing users about both (2) the real benefits and actual possibility of using privacy management tools and (3) suspected data collection behaviours of applications, in a way that matches their real concerns and values.