
    Sacred vs. Profane in The Great War: A Neutral’s Indictment


    Moral Responsibility for Computing Artifacts: The Rules and Issues of Trust

    “The Rules” are found in a collaborative document (started in March 2010) that states principles for responsibility when a computer artifact is designed, developed, and deployed into a sociotechnical system. At this writing, over 50 people from nine countries have signed onto The Rules (Ad Hoc Committee, 2010). Unlike codes of ethics, The Rules are not tied to any organization, and computer users as well as computing professionals are invited to sign onto The Rules. The emphasis in The Rules is that both users and professionals have responsibilities in the production and use of computing artifacts. In this paper, we use The Rules to examine issues of trust.

    Mimesis across Empires: Artworks and Networks in India, 1765-1860

    Review of Mimesis across Empires: Artworks and Networks in India, 1765-1860, Reviewed January 2014 by Marty Miller, Art & Design Librarian, Louisiana State University, [email protected]

    A Sisterhood of Sculptors: American Artists in Nineteenth-Century Rome

    Review of A Sisterhood of Sculptors: American Artists in Nineteenth-Century Rome, Reviewed January 2015 by Marty Miller, Art & Design Librarian, Middleton Library, Louisiana State University, [email protected]

    Crafting Lives: African American Artisans in New Bern, North Carolina, 1770-1900

    Review of Crafting Lives: African American Artisans in New Bern, North Carolina, 1770-1900, Reviewed May 2014 by Marty Miller, Art and Design Librarian, LSU Libraries, [email protected]

    Ethical Issues in Open Source Software

    In this essay we argue that the current social and ethical structure of the Open Source Software (OSS) Community stems from its roots in academia. Individual developers experience a level of autonomy similar to that of a faculty member. Furthermore, we assert that the Open Source Software Community's social structure demands benevolent leadership. We argue that it is difficult to pass off low-quality open source software as high-quality software, and that the Open Source development model offers strong accountability. Finally, we argue that Open Source Software introduces ethical challenges for universities and the software development community.

    Why We Should Have Seen That Coming: Comments on Microsoft’s Tay “Experiment,” and Wider Implications

    In this paper we examine the case of Tay, the Microsoft AI chatbot launched in March 2016. After less than 24 hours, Microsoft shut down the experiment because the chatbot was generating tweets judged to be inappropriate, including racist, sexist, and anti-Semitic language. We contend that the case of Tay illustrates a problem with the very nature of learning software (LS, a term describing any software that changes its program in response to its interactions) that interacts directly with the public, and with the developer’s role and responsibility associated with it. We make the case that when LS interacts directly with people, or indirectly via social media, the developer has additional ethical responsibilities beyond those of standard software. There is an additional burden of care.

    On the Responsibility for Uses of Downstream Software

    In this paper we explore an issue that is distinct from whether developers are responsible for the direct impact of the software they write. We examine, instead, in what ways, and to what degree, developers are responsible for how their software is used “downstream.” We review some key scholarship analyzing responsibility in computing ethics, including recent work by Floridi. We use an adaptation of a mechanism developed by Floridi to argue that there are features of software that can serve as guides to distinguish situations in which a software developer might share responsibility for the software’s downstream use from those in which the developer likely does not. We identify five such features and show how they are useful in the model of responsibility that we develop. The features are: closeness to the hardware, risk, sensitivity of data, degree of control over or knowledge of the future population of users, and the nature of the software (general vs. special purpose).

    On Using a Model for Downstream Responsibility

    The authors identify features of software and of the software development process that may contribute to differences in the level of responsibility assigned to software developers when they make their software available for others to use as a tool in building a second piece of software. They call this second use of the software “downstream use.”

    When AI Moves Downstream

    After computing professionals design, develop, and deploy software, what is their responsibility for subsequent uses of that software “downstream” by others? Furthermore, does it matter ethically if the software in question is considered to be artificially intelligent (AI)? The authors have previously developed a model to explore downstream accountability, called the Software Responsibility Attribution System (SRAS). In this paper, we explore three recent publications relevant to downstream accountability, focusing particularly on examples of AI software. Based on our understanding of the three papers, we suggest refinements of SRAS.