
    Reviving the Public Trustee Concept and Applying It to Information Privacy Policy


    Filtering, Piracy Surveillance and Disobedience

    There has always been a cyclical relationship between the prevention of piracy and the protection of civil liberties. While civil liberties advocates previously warned about the aggressive nature of copyright protection initiatives, more recently, a number of major players in the music industry have gradually ceded to less direct forms of control over consumer behavior. As more aggressive forms of consumer control, like litigation, have receded, we have also seen a rise in more passive forms of consumer surveillance. Moreover, even as technology has developed more perfect means for filtering and surveillance of online piracy, a number of major players have opted in favor of “tolerated use,” a term coined by Professor Tim Wu to denote the allowance of uses that may be otherwise infringing, but that are allowed to exist for public use and enjoyment. Thus, while the eventual specter of copyright enforcement and monitoring remains a pervasive digital reality, the market may fuel a broad degree of consumer freedom through the toleration or taxation of certain kinds of activities. This Article is meant largely to address and to evaluate these shifts by drawing attention to the unique confluence of two important moments: the growth of tolerated uses, coupled with an increasing trend toward more passive forms of piracy surveillance, in light of the balance between copyright enforcement and civil liberties. The content industries may draw upon a broad definition of disobedience in their campaigns to educate the public about copyright law, but the market’s allowance of DRM-free content suggests an altogether different definition. The divide between copyright enforcement and civil liberties in turn results in a perfect storm of uncertainty, suggesting the development of an even further division between the role of the law and the role of the marketplace in copyright enforcement and innovation, respectively.

    Surveillance, big data and democracy: lessons for Australia from the US and UK

    This article argues that current laws are ill-equipped to deal with the multifaceted threats to individual privacy posed by governments, corporations and our own need to participate in the information society. In the era of big data, where people find themselves surveilled in ever more finely granulated aspects of their lives, and where the data profiles built from an accumulation of data gathered about themselves and others are used to predict as well as shape their behaviours, the question of privacy protection arises constantly. In this article we interrogate whether the discourse of privacy is sufficient to address this new paradigm of information flow and control. What we confront in this area is a set of practices concerning the collection, aggregation, sharing, interrogation and use of data on a scale that crosses private and public boundaries, jurisdictional boundaries and, importantly, the boundaries between reality and simulation. The consequences of these practices are emerging as sometimes useful and sometimes damaging to governments, citizens and commercial organisations. Understanding how to regulate this sphere of activity to address the harms, to create an infrastructure of accountability, and to bring more transparency to the practices mentioned is a challenge of some complexity. Privacy frameworks may not provide the solutions or protections that are ultimately being sought. This article is concerned with data gathering and surveillance practices by business and government, and the implications for individual privacy in the face of widespread collection and use of big data. We first outline the practices around data and the issues that arise from them. We then consider how courts in the United Kingdom (‘UK’) and the United States (‘US’) are attempting to frame these issues using current legal frameworks, and finish by considering the Australian context.
    Notably, the discourse around privacy protection differs significantly across these jurisdictions, encompassing elements of constitutional rights and freedoms, specific legislative schemes, data protection, anti-terrorist and criminal laws, tort and equity. This lack of a common understanding of what is, or what should be, encompassed within privacy makes it a very fragile creature indeed. On the basis of our exploration of these issues, we conclude that current laws are ill-equipped to deal with the multifaceted threats to individual privacy posed by governments, corporations and our own need to participate in the information society.

    After the Gold Rush: The Boom of the Internet of Things, and the Busts of Data-Security and Privacy

    This Article addresses the impact that the lack of oversight of the Internet of Things has on digital privacy. While the Internet of Things is but one vehicle for technological innovation, it has created a broad glimpse into domestic life, thus triggering several privacy issues with which the law is struggling to keep pace. What the Internet of Things can reveal is beyond the control of the individual, as it collects information about every practical aspect of an individual’s life and provides essentially unfettered access into the mind of its users. This Article proposes that the federal government and the state governments bend toward consumer protection while creating a cogent and predictable body of law surrounding the Internet of Things. Whether through privacy-by-design or self-help, it is imperative that the Internet of Things—and any of its unforeseen progeny—develop with an eye toward safeguarding individual privacy while allowing technological development.

    Regulating Mobile Mental Health Apps

    Mobile medical apps (MMAs) are a fast‐growing category of software typically installed on personal smartphones and wearable devices. A subset of MMAs are aimed at helping consumers identify mental states and/or mental illnesses. Although this is a fledgling domain, there are already enough extant mental health MMAs both to suggest a typology and to detail some of the regulatory issues they pose. As to the former, the current generation of apps includes those that facilitate self‐assessment or self‐help, connect patients with online support groups, connect patients with therapists, or predict mental health issues. Regulatory concerns with these apps include their quality, safety, and data protection. Unfortunately, the regulatory frameworks that apply have failed to provide coherent risk‐assessment models. As a result, prudent providers will need to proceed with caution when recommending apps to patients or relying on app‐generated data to guide treatment.

    Internet Giants as Quasi-Governmental Actors and the Limits of Contractual Consent

    Although the government’s data-mining program relied heavily on information and technology that the government received from private companies, relatively little of the public outrage generated by Edward Snowden’s revelations was directed at those private companies. We argue that the mystique of the Internet giants and the myth of contractual consent combine to mute criticisms that otherwise might be directed at the real data-mining masterminds. As a result, consumers are deemed to have consented to uses of their private information that they would not have agreed to had they known the purposes to which their information would be put and the entities – including the federal government – with whom their information would be shared. We also call into question the distinction between governmental actors and private actors in this realm, as the Internet giants increasingly exploit contractual mechanisms to operate with quasi-governmental powers in their relations with consumers. As regulators and policymakers focus on how to better protect consumer data, we propose that solutions relying on consumer permission adopt a more exacting and limited concept of the consent required before private entities may collect or make use of consumers’ information where such uses touch upon privacy interests.

    Student Privacy in Learning Analytics: An Information Ethics Perspective

    In recent years, educational institutions have started using the tools of commercial data analytics in higher education. By gathering information about students as they navigate campus information systems, learning analytics “uses analytic techniques to help target instructional, curricular, and support resources” to examine student learning behaviors and change students’ learning environments. As a result, the information educators and educational institutions have at their disposal is no longer demarcated by course content and assessments, and old boundaries between information used for assessment and information about how students live and work are blurring. Our goal in this paper is to provide a systematic discussion of the ways in which privacy and learning analytics conflict and to provide a framework for understanding those conflicts. We argue that there are five crucial issues about student privacy that we must address in order to ensure that, whatever the laudable goals and gains of learning analytics, they are commensurate with respecting students’ privacy and associated rights, including (but not limited to) autonomy interests. First, we argue that we must distinguish among the different entities with respect to whom students have, or lack, privacy. Second, we argue that we need clear criteria for what information may justifiably be collected in the name of learning analytics. Third, we need to address whether the purported consequences of learning analytics (e.g., better learning outcomes) are justified and how those consequences are distributed. Fourth, we argue that regardless of how robust the benefits of learning analytics turn out to be, students have important autonomy interests in how information about them is collected. Finally, we argue that it is an open question whether the goods that justify higher education are advanced by learning analytics, or whether the collection of information actually runs counter to those goods.

    Redescribing Health Privacy: The Importance of Health Policy

    Current conversations about health information policy often rest on three broad assumptions. First, many perceive a tension between regulation and innovation. We often hear that privacy regulations are keeping researchers, companies, and providers from aggregating the data they need to promote innovation. Second, aggregation of fragmented data is seen as a threat to its proper regulation, creating the risk of breaches and other misuse. Third, a prime directive for technicians and policymakers is to give patients ever more granular methods of control over data. This article questions and complicates those assumptions, which I deem (respectively) the Privacy Threat to Research, the Aggregation Threat to Privacy, and the Control Solution. This article is also intended to enrich our concepts of “fragmentation” and “integration” in health care. There is a good deal of sloganeering around “firewalls” and “vertical integration” as idealized implementations of “fragmentation” and “integration” (respectively). The problem, though, is that terms like these (as well as “disruption”) are insufficiently normative to guide large-scale health system change. They describe, but they do not adequately prescribe. By examining instances where a) regulation promotes innovation, and b) increasing (some kinds of) availability of data actually enhances security, confidentiality, and privacy protections, this article attempts to give a richer account of the ethics of fragmentation and integration in the U.S. health care system. But that account also has a darker side, highlighting the inevitable conflicts of values created in a “reputation society” driven by stigmatizing social sorting systems. Personal data control may exacerbate social inequalities. Data aggregation may increase both our powers of research and our vulnerability to breach.
    The health data policymaking landscape of the next decade will feature a series of intractable conflicts between these important social values.

    Practices, policies, and problems in the management of learning data: A survey of libraries’ use of digital learning objects and the data they create

    This study analyzed libraries’ management of the data generated by library digital learning objects (DLOs) such as forms, surveys, quizzes, and tutorials. A substantial proportion of respondents reported having a policy relevant to learning data, typically a campus-level policy, but most did not. Other problems included a lack of access to library learning data, concerns about student privacy, inadequate granularity or standardization, and a lack of knowledge about colleagues’ practices. We propose more dialogue on learning data within libraries, between libraries and administrators, and across the library profession.
