20 research outputs found

    Bots, Seeds and People: Web Archives as Infrastructure

    The field of web archiving provides a unique mix of human and automated agents collaborating to achieve the preservation of the web. Centuries-old theories of archival appraisal are being transplanted into the sociotechnical environment of the World Wide Web with varying degrees of success. The work of archivists and bots in contact with the material of the web presents a distinctive and understudied CSCW-shaped problem. To investigate this space, we conducted semi-structured interviews with archivists and technologists who were directly involved in the selection of content from the web for archives. These interviews identified thematic areas that inform the appraisal process in web archives, some of which are encoded in heuristics and algorithms. Making the infrastructure of web archives legible to the archivist, the automated agents and the future researcher is presented as a challenge to the CSCW and archival community.

    Our Space: Being a Responsible Citizen of the Digital World

    Our Space is a set of curricular materials designed to encourage high school students to reflect on the ethical dimensions of their participation in new media environments. Through role-playing activities and reflective exercises, students are asked to consider the ethical responsibilities of other people, and whether and how they themselves behave ethically online. These issues are raised in relation to five core themes that are highly relevant online: identity, privacy, authorship and ownership, credibility, and participation. Our Space was co-developed by The Good Play Project and Project New Media Literacies (established at MIT and now housed at the University of Southern California's Annenberg School for Communication and Journalism). The Our Space collaboration grew out of a shared interest in fostering ethical thinking and conduct among young people when exercising new media skills.

    Bureaucracy as a Lens for Analyzing and Designing Algorithmic Systems

    Scholarship on algorithms has drawn on the analogy between algorithmic systems and bureaucracies to diagnose shortcomings in algorithmic decision-making. We extend the analogy further by drawing on Michel Crozier’s theory of bureaucratic organizations to analyze the relationship between algorithmic and human decision-making power. We present algorithms as analogous to impartial bureaucratic rules for controlling action, and argue that discretionary decision-making power in algorithmic systems accumulates at locations where uncertainty about the operation of algorithms persists. This key point of our essay connects with Alkhatib and Bernstein’s theory of “street-level algorithms,” and highlights that the role of human discretion in algorithmic systems is to accommodate uncertain situations that inflexible algorithms cannot handle. We conclude by discussing how the analysis and design of algorithmic systems could seek to identify and cultivate important sources of uncertainty, to enable the human discretionary work that enhances systemic resilience in the face of algorithmic errors.

    Algorithmic Recommendations and Synaptic Functions

    Personalized recommendation is the new marketing. Nick Seaver explains how ‘collaborative filtering’ defines people through their purchases.
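
    As a purely illustrative aside, the sketch below shows one common form of collaborative filtering in the sense used above: representing people by what they have bought and recommending items that resemble their past purchases. The data, names, and cosine-similarity choice are assumptions for illustration, not Seaver’s analysis or any particular vendor’s system.

```python
# Hypothetical item-based collaborative filtering sketch: users are defined
# by their purchases, and unseen items are scored by how similar they are
# to what the user already owns. All data below is invented.
import numpy as np

# Rows = users, columns = items; 1 means the user purchased the item.
purchases = np.array([
    [1, 1, 0, 0, 1],
    [0, 1, 1, 0, 0],
    [1, 0, 0, 1, 1],
    [0, 1, 1, 1, 0],
], dtype=float)

def item_similarity(matrix: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity between the columns (items) of matrix."""
    norms = np.linalg.norm(matrix, axis=0, keepdims=True)
    norms[norms == 0] = 1.0  # avoid division by zero for never-bought items
    normalized = matrix / norms
    return normalized.T @ normalized

def recommend(user_index: int, top_n: int = 2) -> list:
    """Rank items the user has not bought by similarity to items they own."""
    sim = item_similarity(purchases)
    user_vector = purchases[user_index]
    scores = sim @ user_vector          # total similarity to owned items
    scores[user_vector > 0] = -np.inf   # never re-recommend owned items
    return list(np.argsort(scores)[::-1][:top_n])

print(recommend(user_index=0))  # items most like what user 0 already bought
```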

    Parks and Recommendation: Spatial Imaginaries in Algorithmic Systems

    Algorithmic recommendation systems are designed to aid users in their navigation of large catalogs of media, such as songs or movies. Among the developers of these systems, those catalogs are commonly referred to as constituting or occupying “spaces” — the “music space,” for example, might be the set of all music available to stream on Spotify, organized such that similar songs are near each other. The production of this space occupies much of the time of engineers who work on these systems, and although a mathematically defined space may sound neutral or objective with regard to the objects located in it, this work requires effectively arbitrary choices that are shaped by subjective interpretations, which in turn shape the spaces thus produced. In this paper, I draw on ethnographic fieldwork with the developers of algorithmic music recommender systems to describe some of the ways they make sense of this decision-making work amongst themselves. A common theme emerges in the use of landscape and agricultural metaphors: the makers of these systems describe themselves as “data gardeners,” tending to algorithmic outputs, or as “park rangers,” maintaining the grounds and helping visitors find their way. I argue that this imagery provides a middle route through two extreme positions regarding the origins of the “music space” — that it is an objectively discovered cultural order or that it is an interpretive invention of engineers. The language of landscape and agriculture places their work instead at the interface of the natural, cultural, and technical.
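
    To make the notion of a “music space” concrete, here is a minimal, hypothetical sketch in which each song is a point in a vector space and “nearby” songs count as similar. The song names, coordinates, and distance metric are invented, and choosing them is exactly the kind of effectively arbitrary, interpretive work the paper describes.

```python
# Toy "music space": each song is a point in a small vector space, and
# similarity is read off as geometric closeness. Everything here is made up.
import numpy as np

song_vectors = {
    "song_a": np.array([0.9, 0.1, 0.0]),
    "song_b": np.array([0.8, 0.2, 0.1]),
    "song_c": np.array([0.1, 0.9, 0.3]),
    "song_d": np.array([0.0, 0.8, 0.4]),
}

def nearest_songs(query: str, k: int = 2) -> list:
    """Return the k songs closest to the query song in the embedding space."""
    q = song_vectors[query]
    distances = {
        name: float(np.linalg.norm(q - vec))
        for name, vec in song_vectors.items()
        if name != query
    }
    return sorted(distances, key=distances.get)[:k]

print(nearest_songs("song_a"))  # the songs that sit near song_a in the space
```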

    Algorithms as culture: Some tactics for the ethnography of algorithmic systems

    This article responds to recent debates in critical algorithm studies about the significance of the term “algorithm.” Where some have suggested that critical scholars should align their use of the term with its common definition in professional computer science, I argue that we should instead approach algorithms as “multiples”—unstable objects that are enacted through the varied practices that people use to engage with them, including the practices of “outsider” researchers. This approach builds on the work of Laura Devendorf, Elizabeth Goodman, and Annemarie Mol. Different ways of enacting algorithms foreground certain issues while occluding others: computer scientists enact algorithms as conceptual objects indifferent to implementation details, while calls for accountability enact algorithms as closed boxes to be opened. I propose that critical researchers might seek to enact algorithms ethnographically, seeing them as heterogeneous and diffuse sociotechnical systems, rather than rigidly constrained and procedural formulas. To do so, I suggest thinking of algorithms not “in” culture, as the event occasioning this essay was titled, but “as” culture: part of broad patterns of meaning and practice that can be engaged with empirically. I offer a set of practical tactics for the ethnographic enactment of algorithmic systems, which do not depend on pinning down a singular “algorithm” or achieving “access,” but which rather work from the partial and mobile position of an outsider.

    “You Social Scientists Love Mind Games”: Experimenting in the “divide” between data science and critical algorithm studies

    In recent years, many qualitative sociologists, anthropologists, and social theorists have critiqued the use of algorithms and other automated processes involved in data science on both epistemological and political grounds. Yet it has proven difficult to bring these important insights into the practice of data science itself. We suggest that part of this problem has to do with under-examined or unacknowledged assumptions about the relationship between the two fields—ideas about how data science and its critics can and should relate. Inspired by recent work in Science and Technology Studies on interventions, we attempted to stage an encounter in which practicing data scientists were asked to analyze a corpus of critical social science literature about their work, using tools of textual analysis such as co-word analysis and topic modelling. The idea was to provoke discussion both about the content of these texts and the possible limits of such analyses. In this commentary, we reflect on the planning stages of the experiment and how responses to the exercise, from both data scientists and qualitative social scientists, revealed some of the tensions and interactions between the normative positions of the different fields. We argue for further studies which can help us understand what these interdisciplinary tensions turn on—studies which do not paper over the tensions but also do not take them as given.
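
    For readers unfamiliar with the tools named above, the following is a hedged sketch of the kind of small topic model the experiment asked data scientists to run over a critical-literature corpus. The toy documents and the use of scikit-learn are assumptions; the commentary does not specify the corpus or the tooling.

```python
# Minimal topic-modelling sketch: bag-of-words counts over a few invented
# sentences standing in for the critical literature, then a two-topic LDA.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

documents = [
    "algorithms encode power and discretion in bureaucratic systems",
    "data science practice rarely engages critical social theory",
    "ethnography treats algorithmic systems as cultural practice",
    "topic models summarize a corpus as distributions over words",
]

# Build word counts, then fit a small Latent Dirichlet Allocation model.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(documents)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(counts)

# Print the highest-weight words per topic as a rough summary.
words = vectorizer.get_feature_names_out()
for topic_index, weights in enumerate(lda.components_):
    top_words = [words[i] for i in weights.argsort()[::-1][:4]]
    print(f"topic {topic_index}: {', '.join(top_words)}")
```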

    Les piÚges de l’attention (The traps of attention): TÚque, no 2

    Nick Seaver