
    Matching Code and Law: Achieving Algorithmic Fairness with Optimal Transport

    Increasingly, discrimination by algorithms is perceived as a societal and legal problem. As a response, a number of criteria for implementing algorithmic fairness in machine learning have been developed in the literature. This paper proposes the Continuous Fairness Algorithm (CFAθ), which enables a continuous interpolation between different fairness definitions. More specifically, we make three main contributions to the existing literature. First, our approach allows the decision maker to continuously vary between specific concepts of individual and group fairness. As a consequence, the algorithm enables the decision maker to adopt intermediate "worldviews" on the degree of discrimination encoded in algorithmic processes, adding nuance to the extreme cases of "we're all equal" (WAE) and "what you see is what you get" (WYSIWYG) proposed so far in the literature. Second, we use optimal transport theory, and specifically the concept of the barycenter, to maximize decision maker utility under the chosen fairness constraints. Third, the algorithm is able to handle cases of intersectionality, i.e., of multi-dimensional discrimination of certain groups on grounds of several criteria. We discuss three main examples (credit applications; college admissions; insurance contracts) and map out the legal and policy implications of our approach. The explicit formalization of the trade-off between individual and group fairness allows this post-processing approach to be tailored to different situational contexts in which one or the other fairness criterion may take precedence. Finally, we evaluate our model experimentally.
    Comment: Vastly extended new version, now including computational experiment
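
    The interpolation the abstract describes admits a compact illustration in one dimension, where the Wasserstein-2 barycenter of score distributions is simply the pointwise average of their quantile functions. The Python sketch below is not the authors' implementation; the function name, the uniform group weights, and the fixed quantile grid are illustrative assumptions.

        import numpy as np

        def cfa_theta(scores_by_group, theta, grid=np.linspace(0.01, 0.99, 99)):
            """Sketch of a quantile-based fairness interpolation.

            theta = 0 keeps each group's raw scores (a WYSIWYG-style view);
            theta = 1 maps every group onto a common barycenter distribution
            (a WAE-style view); intermediate values of theta blend the two.
            """
            # Quantile function of each group's score distribution.
            quantiles = {g: np.quantile(s, grid) for g, s in scores_by_group.items()}
            # 1-D Wasserstein-2 barycenter: pointwise average of quantile
            # functions (uniform group weights assumed here).
            barycenter = np.mean(list(quantiles.values()), axis=0)
            repaired = {}
            for g, s in scores_by_group.items():
                s = np.asarray(s, dtype=float)
                # Each individual's rank within their own group ...
                ranks = np.clip(np.searchsorted(np.sort(s), s, side="right") / len(s),
                                grid[0], grid[-1])
                # ... is moved a fraction theta of the way toward the
                # barycenter score at the same rank.
                repaired[g] = (1 - theta) * s + theta * np.interp(ranks, grid, barycenter)
            return repaired

    With theta = 0.5, for instance, every score moves halfway toward the common distribution, one of the intermediate worldviews the abstract describes.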

    Autonomous clustering using rough set theory

    This paper proposes a clustering technique that minimises the need for subjective human intervention and is based on elements of rough set theory. The proposed algorithm is unified in its approach to clustering and makes use of both local and global data properties to obtain clustering solutions. It handles single-type and mixed-attribute data sets with ease; results from three data sets of single and mixed attribute types are used to illustrate the technique and establish its efficiency.
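
    The abstract does not spell out the algorithm, but the rough-set machinery it builds on reduces to two primitives: indiscernibility classes induced by a set of attributes, and the lower/upper approximations of a target set. A minimal Python sketch of those primitives, with toy data and attribute names invented for illustration:

        from collections import defaultdict

        def indiscernibility_classes(rows, attrs):
            """Group records that are indistinguishable on the chosen attributes."""
            classes = defaultdict(set)
            for i, row in enumerate(rows):
                classes[tuple(row[a] for a in attrs)].add(i)
            return list(classes.values())

        def approximations(rows, attrs, target):
            """Rough-set lower/upper approximations of a target set of record ids."""
            lower, upper = set(), set()
            for cls in indiscernibility_classes(rows, attrs):
                if cls <= target:      # class wholly inside the target
                    lower |= cls
                if cls & target:       # class overlapping the target
                    upper |= cls
            return lower, upper

        # Toy usage: two attributes, target set of records {0, 1}.
        rows = [{"colour": "red", "size": "S"},
                {"colour": "red", "size": "S"},
                {"colour": "blue", "size": "L"}]
        print(approximations(rows, ["colour", "size"], {0, 1}))  # ({0, 1}, {0, 1})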

    European Union regulations on algorithmic decision-making and a "right to explanation"

    We summarize the potential impact that the European Union's new General Data Protection Regulation will have on the routine use of machine learning algorithms. Slated to take effect as law across the EU in 2018, it will restrict automated individual decision-making (that is, algorithms that make decisions based on user-level predictors) which "significantly affect" users. The law will also effectively create a "right to explanation," whereby a user can ask for an explanation of an algorithmic decision that was made about them. We argue that while this law will pose large challenges for industry, it highlights opportunities for computer scientists to take the lead in designing algorithms and evaluation frameworks which avoid discrimination and enable explanation.
    Comment: presented at 2016 ICML Workshop on Human Interpretability in Machine Learning (WHI 2016), New York, NY

    A Top-Down Approach to Managing Variability in Robotics Algorithms

    One of the defining features of the field of robotics is its breadth and heterogeneity. Unfortunately, despite the availability of several robotics middleware services, robotics software still fails to smoothly handle at least two kinds of variability: algorithmic variability and lower-level variability. The consequence is that implementations of algorithms are hard to understand and are impacted by changes to lower-level details such as the choice or configuration of sensors or actuators. Moreover, when several algorithms or algorithmic variants are available, it is difficult to compare and combine them. In order to alleviate these problems, we propose a top-down approach to express and implement robotics algorithms and families of algorithms so that they are both less dependent on lower-level details and easier to understand and combine. This approach goes top-down from the algorithms and shields them from lower-level details by introducing very high-level abstractions atop the intermediate abstractions of robotics middleware. This approach is illustrated on 7 variants of the Bug family that were implemented using both laser and infrared sensors.
    Comment: 6 pages, 5 figures, Presented at DSLRob 2013 (arXiv:cs/1312.5952)
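
    The shielding idea can be made concrete with a small sketch. The Python below is not the paper's notation or middleware; the interface and class names are hypothetical. The point is only that a Bug-style decision written against an abstract range sensor runs unchanged over a laser or an infrared implementation.

        from typing import Protocol

        class RangeSensor(Protocol):
            """High-level abstraction: anything that reports the distance to
            the nearest obstacle along a given bearing (radians)."""
            def distance(self, bearing: float) -> float: ...

        class LaserScanner:
            def __init__(self, scan):            # scan: list of (bearing, range)
                self._scan = dict(scan)
            def distance(self, bearing: float) -> float:
                return self._scan.get(bearing, float("inf"))

        class InfraredArray:
            def __init__(self, readings):        # sparse fixed-angle readings
                self._readings = dict(readings)
            def distance(self, bearing: float) -> float:
                return self._readings.get(bearing, float("inf"))

        def bug_step(sensor: RangeSensor, heading: float, safe: float = 0.5) -> str:
            """One decision of a Bug-style navigator, written once against
            the abstraction rather than against a concrete sensor."""
            return "follow-boundary" if sensor.distance(heading) < safe else "go-to-goal"

        # The same algorithm runs unchanged over either sensor:
        print(bug_step(LaserScanner([(0.0, 0.3)]), 0.0))   # follow-boundary
        print(bug_step(InfraredArray([(0.0, 2.0)]), 0.0))  # go-to-goal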

    A Computable Economist’s Perspective on Computational Complexity

    A computable economist's view of the world of computational complexity theory is described. This means the model of computation underpinning theories of computational complexity plays a central role. The emergence of computational complexity theories from diverse traditions is emphasised. The unification that emerged in the modern era was codified by means of the notions of efficiency of computations, non-deterministic computations, completeness, reducibility and verifiability; the latter three concepts had their origins in what may be called 'Post's Program of Research for Higher Recursion Theory'. Approximations, computations and constructions are also emphasised. The recent real model of computation as a basis for studying computational complexity in the domain of the reals is also presented and discussed, albeit critically. A brief sceptical section on algorithmic complexity theory is included in an appendix.
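
    Of the notions listed, verifiability has the most compact illustration: an NP certificate can be checked in polynomial time even when finding one may require exhaustive search. A small Python sketch (the problem choice and names are mine, not the survey's):

        def verify_subset_sum(numbers, target, certificate):
            """Polynomial-time check of a proposed witness for subset-sum.

            Finding a satisfying subset may take exponential search, but
            verifying a claimed one is a single linear pass -- the asymmetry
            behind the notion of NP verifiability.
            """
            return (all(i in range(len(numbers)) for i in certificate)
                    and len(set(certificate)) == len(certificate)
                    and sum(numbers[i] for i in certificate) == target)

        print(verify_subset_sum([3, 34, 4, 12, 5, 2], 9, [2, 4]))  # True: 4 + 5 == 9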

    Search and witness problems in group theory

    Decision problems are problems of the following nature: given a property P and an object O, find out whether or not the object O has the property P. On the other hand, witness problems are: given a property P and an object O with the property P, find a proof of the fact that O indeed has the property P. On the third hand(?!), search problems are of the following nature: given a property P and an object O with the property P, find something "material" establishing the property P; for example, given two conjugate elements of a group, find a conjugator. In this survey our focus is on various search problems in group theory, including the word search problem, the subgroup membership search problem, the conjugacy search problem, and others.
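
    The conjugator example admits a direct, if brute-force, rendering in a small permutation group. The Python sketch below is only to fix ideas: it exhaustively searches S_n, which is infeasible for the groups these search problems actually target.

        from itertools import permutations

        def compose(p, q):
            """Composition of permutations given as tuples: (p∘q)(i) = p(q(i))."""
            return tuple(p[q[i]] for i in range(len(p)))

        def inverse(p):
            inv = [0] * len(p)
            for i, pi in enumerate(p):
                inv[pi] = i
            return tuple(inv)

        def conjugator(g, h, n):
            """Conjugacy search in S_n by exhaustive search: find x with
            x^-1 g x = h, or None if g and h are not conjugate."""
            for x in permutations(range(n)):
                if compose(inverse(x), compose(g, x)) == h:
                    return x
            return None

        g = (1, 0, 2)    # the transposition (0 1) in S_3
        h = (0, 2, 1)    # the transposition (1 2) in S_3
        print(conjugator(g, h, 3))   # prints a conjugator, here (2, 0, 1)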

    Computable Rationality, NUTS, and the Nuclear Leviathan

    This paper explores how the Leviathan that projects power through nuclear arms exercises a unique nuclearized sovereignty. In the case of nuclear superpowers, this sovereignty extends to wielding the power to destroy human civilization as we know it across the globe. Nuclearized sovereignty depends on a hybrid form of power encompassing human decision-makers in a hierarchical chain of command, and all of the technical and computerized functions necessary to maintain command and control at every moment of the sovereign's existence: this sovereign power cannot sleep. This article analyzes how the form of rationality that informs this hybrid exercise of power historically developed to be computable. By definition, computable rationality must be able to function without any intelligible grasp of the context or the comprehensive significance of decision-making outcomes. Thus, maintaining nuclearized sovereignty requires the capacity to execute momentous life-and-death decisions without the type of sentience we usually associate with ethical individual and collective decisions.

    Beyond opening up the black box: Investigating the role of algorithmic systems in Wikipedian organizational culture

    Scholars and practitioners across domains are increasingly concerned with algorithmic transparency and opacity, interrogating the values and assumptions embedded in automated, black-boxed systems, particularly in user-generated content platforms. I report from an ethnography of infrastructure in Wikipedia to discuss an often understudied aspect of this topic: the local, contextual, learned expertise involved in participating in a highly automated socio-technical environment. Today, the organizational culture of Wikipedia is deeply intertwined with various data-driven algorithmic systems, which Wikipedians rely on to help manage and govern the "anyone can edit" encyclopedia at a massive scale. These bots, scripts, tools, plugins, and dashboards make Wikipedia more efficient for those who know how to work with them, but like all organizational culture, newcomers must learn them if they want to fully participate. I illustrate how cultural and organizational expertise is enacted around algorithmic agents by discussing two autoethnographic vignettes, which relate my personal experience as a veteran in Wikipedia. I present thick descriptions of how governance and gatekeeping practices are articulated through and in alignment with these automated infrastructures. Over the past 15 years, Wikipedian veterans and administrators have made specific decisions to support administrative and editorial workflows with automation in particular ways and not others. I use these cases of Wikipedia's bot-supported bureaucracy to discuss several issues in the fields of critical algorithms studies, critical data studies, and fairness, accountability, and transparency in machine learning -- most principally arguing that scholarship and practice must go beyond trying to "open up the black box" of such systems and also examine sociocultural processes like newcomer socialization.
    Comment: 14 pages, typo fixed in v

    Taste and the algorithm

    Today, a substantial part of our everyday interaction with art and aesthetic artefacts occurs through digital media, and our preferences and choices are systematically tracked and analyzed by algorithms in ways that are far from transparent. Our consumption is constantly documented and then fed back to us as tailored information. We are therefore witnessing the emergence of a complex interrelation between our aesthetic choices, their digital elaboration, the production of content, and the dynamics of creative processes. All are involved in a process of mutual influence, partially determined by the invisible guiding hand of algorithms. This paper introduces some key issues concerning the role of algorithms in aesthetic domains, such as taste detection and formation and cultural consumption and production, and shows how aesthetics can contribute to the ongoing debate about the impact of today’s “algorithmic culture”.

    On the connection between Nonstandard Analysis and Constructive Analysis

    Constructive Analysis and Nonstandard Analysis are often characterized as completely antipodal approaches to analysis. We discuss the possibility of capturing the central notion of Constructive Analysis (i.e. algorithm, finite procedure or explicit construction) by a simple concept inside Nonstandard Analysis. To this end, we introduce Omega-invariance and argue that it partially satisfies our goal. Our results provide a dual approach to Erik Palmgren's development of Nonstandard Analysis inside constructive mathematics.
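
    For orientation, the notion can be glossed roughly as follows; this is a LaTeX paraphrase of the general idea, not necessarily the paper's exact formulation. Writing Omega for the set of infinite hypernaturals, an object psi is Omega-invariant when its values do not depend on which infinite number is used to compute them:

        \[
          (\forall n \in \mathbb{N})\,(\forall \omega, \omega' \in \Omega)\;
          \psi(n, \omega) = \psi(n, \omega'),
          \qquad \Omega := {}^{*}\mathbb{N} \setminus \mathbb{N}.
        \]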