
    Do librarians have a shared set of values? A comparative study of 36 codes of ethics based on Gorman's Enduring Values

    Thirty-six ethical codes from national professional associations were studied, with the aim of testing whether librarians share a global set of values or whether political and cultural contexts have significantly influenced the codes' content. Gorman's eight core values of stewardship, service, intellectual freedom, rationalism, literacy and learning, equity of access to recorded knowledge and information, privacy, and democracy were used as a benchmark. A quantitative analysis recorded which of the values each code contained, and the codes were then qualitatively analysed to examine how each value was expressed. On average, codes featured five of Gorman's eight values. The most common values were service, privacy, equity of access, stewardship and intellectual freedom; the least common, across all codes, was rationalism. Some codes omitted certain values because of their specific focus, such as the Native American code. Codes also varied in how values were expressed: some limited principles by reference to law, for example, while others did not. Expression of stewardship and democracy was found to be stronger in countries that have recently experienced conflict or colonialism. The relationship between the profession and the state was another area of variation, with countries in the Asia-Pacific placing more emphasis on the power of the State.

    Discrimination by Customers

    Customers discriminate by race and gender, with considerable negative consequences for female and minority workers and business owners. Yet anti-discrimination laws apply only to discrimination by firms, not by customers. We examine efficacy and privacy reasons why this may be so, as well as changing features of the market that, by blurring the line between firms and customers, make current law increasingly irrelevant. We conclude that, while there are reasons to be cautious about regulating customer behavior, those reasons do not justify acceding to customer discrimination altogether. To open a discussion of the regulatory options that take account of the most significant concerns, we offer a modest proposal. This proposal does not create a legal obligation on the part of customers themselves, but rather requires firms that already have nondiscrimination obligations to do more to reduce the occurrence, and consequences, of discrimination by customers.

    Redescribing Health Privacy: The Importance of Health Policy

    Current conversations about health information policy tend to rest on three broad assumptions. First, many perceive a tension between regulation and innovation: we often hear that privacy regulations are keeping researchers, companies, and providers from aggregating the data they need to promote innovation. Second, aggregation of fragmented data is seen as a threat to its proper regulation, creating the risk of breaches and other misuse. Third, a prime directive for technicians and policymakers is to give patients ever more granular methods of control over data. This article questions and complicates those assumptions, which I deem (respectively) the Privacy Threat to Research, the Aggregation Threat to Privacy, and the Control Solution. This article is also intended to enrich our concepts of “fragmentation” and “integration” in health care. There is a good deal of sloganeering around “firewalls” and “vertical integration” as idealized implementations of “fragmentation” and “integration” (respectively). The problem, though, is that terms like these (as well as “disruption”) are insufficiently normative to guide large-scale health system change. They describe, but they do not adequately prescribe. By examining those instances where (a) regulation promotes innovation, and (b) increasing (some kinds of) availability of data actually enhances security, confidentiality, and privacy protections, this article attempts to give a richer account of the ethics of fragmentation and integration in the U.S. health care system. But it also has a darker side, highlighting the inevitable conflicts of values created in a “reputation society” driven by stigmatizing social sorting systems. Personal data control may exacerbate social inequalities. Data aggregation may increase both our powers of research and our vulnerability to breach. The health data policymaking landscape of the next decade will feature a series of intractable conflicts between these important social values.

    Slave to the Algorithm? Why a ‘Right to an Explanation’ Is Probably Not the Remedy You Are Looking For

    Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals’ lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a “right to an explanation” has emerged as a compellingly attractive remedy since it intuitively promises to open the algorithmic “black box” to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive. However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core “algorithmic war stories” that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even navigating this, the legal conception of explanations as “meaningful information about the logic of processing” may not be provided by the kind of ML “explanations” computer scientists have developed, partially in response. ML explanations are restricted by the type of explanation sought, the dimensionality of the domain, and the type of user seeking an explanation. However, “subject-centric explanations” (SCEs) focussing on particular regions of a model around a query show promise for interactive exploration, as do explanation systems based on learning a model from outside rather than taking it apart (pedagogical versus decompositional explanations), which dodge developers’ worries about intellectual property or trade secrets disclosure. Based on our analysis, we fear that the search for a “right to an explanation” in the GDPR may be at best distracting, and at worst nurture a new kind of “transparency fallacy.” But all is not lost. We argue that other parts of the GDPR related (i) to the right to erasure (“right to be forgotten”) and the right to data portability; and (ii) to privacy by design, Data Protection Impact Assessments, and certification and privacy seals, may have the seeds we can use to make algorithms more responsible, explicable, and human-centered.
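
    To make the contrast between pedagogical and decompositional explanations concrete, the following is a minimal illustrative sketch, not taken from the paper: it fits a simple, interpretable surrogate to a black-box model's predictions (the pedagogical approach of learning the model "from outside"), then refits the surrogate on the neighbourhood of a single query point, loosely in the spirit of a subject-centric explanation. The dataset, model choices, and neighbourhood size are arbitrary assumptions for the sketch; scikit-learn and NumPy are assumed to be available.

        # Illustrative sketch: pedagogical (surrogate) explanation of a black-box model.
        import numpy as np
        from sklearn.datasets import load_breast_cancer
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.tree import DecisionTreeClassifier, export_text

        data = load_breast_cancer()
        X, y = data.data, data.target

        # Opaque model whose decisions we want to explain.
        black_box = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

        # Global surrogate: a shallow tree trained to mimic the black box's predictions,
        # i.e. learning the model from outside rather than taking it apart.
        surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
        surrogate.fit(X, black_box.predict(X))
        print(export_text(surrogate, feature_names=list(data.feature_names)))

        # Subject-centric variant: refit the surrogate only on points near one query,
        # approximating the model's behaviour in that individual's region of input space.
        query = X[0]
        neighbourhood = np.argsort(np.linalg.norm(X - query, axis=1))[:200]
        local_surrogate = DecisionTreeClassifier(max_depth=2, random_state=0)
        local_surrogate.fit(X[neighbourhood], black_box.predict(X[neighbourhood]))
        print(export_text(local_surrogate, feature_names=list(data.feature_names)))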

    Challenges of web-based personal genomic data sharing

    In order to study the relationship between genes and diseases, the increasing availability and sharing of phenotypic and genotypic data have been promoted as an imperative within the scientific community. In parallel with data sharing practices by clinicians and researchers, recent initiatives have been observed in which individuals are sharing their personal genomic data. The involvement of individuals in such initiatives is facilitated by the increased accessibility of personal genomic data offered by private test providers, along with the availability of online networks. Personal webpages and online data sharing platforms such as Consent to Research (Portable Legal Consent), Free the Data, and Genomes Unzipped are being used to host and share genotypes, electronic health records and family histories uploaded by individuals. Although personal genomic data sharing initiatives vary in nature, the emphasis on individuals’ control over their data, in order to benefit research and ultimately health care, has been seen as a key theme across these initiatives. In line with the growing practice of personal genomic data sharing, this paper aims to shed light on the potential challenges surrounding these initiatives. Because these initiatives ask individuals to weigh the risks and benefits of sharing their genomic data themselves, awareness of the implications of personal genomic data sharing for themselves and their family members is a necessity. Furthermore, given the sensitivity of genomic data and the controversies around their complete de-identifiability, potential privacy risks and harms originating from unintended uses of the data have to be taken into consideration.

    The Right to Privacy and the Right to Use the Bathroom Consistent with One’s Gender Identity

    During the autumn of 2013 the Swedish government announced that it intends to invest in expanding the Stockholm Metro. The expansion includes nine new stations; Sofia on Södermalm is one of them. The conditions of the site make the station one of the deepest in the world. This proposal makes use of these conditions to connect the street level with the underground platform level, giving the everyday experience of space, atmosphere, light and void a central position. The proposal comprises a hundred-metre-deep shaft, two elevator shafts, a public square, a park, and a train platform.