
    Big Data for All: Privacy and User Control in the Age of Analytics

    We live in an age of “big data.” Data have become the raw material of production, a new source of immense economic and social value. Advances in data mining and analytics and the massive increase in computing power and data storage capacity have expanded by orders of magnitude the scope of information available to businesses and government. Data are now available for analysis in raw form, escaping the confines of structured databases and enhancing researchers’ abilities to identify correlations and conceive of new, unanticipated uses for existing information. In addition, the increasing number of people, devices, and sensors that are now connected by digital networks has revolutionized the ability to generate, communicate, share, and access data. Data create enormous value for the world economy, driving innovation, productivity, efficiency, and growth. At the same time, the “data deluge” presents privacy concerns that could stir a regulatory backlash, dampening the data economy and stifling innovation. In order to strike a balance between beneficial uses of data and individual privacy, policymakers must address some of the most fundamental concepts of privacy law, including the definition of “personally identifiable information,” the role of individual control, and the principles of data minimization and purpose limitation. This article emphasizes the importance of providing individuals with access to their data in a usable format. This will let individuals share the wealth created by their information and incentivize developers to offer user-side features and applications harnessing the value of big data. Where individual access to data is impracticable, data are likely to be de-identified to an extent sufficient to diminish privacy concerns. In addition, since in a big data world it is often not the data but rather the inferences drawn from them that give cause for concern, organizations should be required to disclose their decisional criteria.

    Big Data's Other Privacy Problem

    Big Data has not one privacy problem, but two. We are accustomed to talking about surveillance of data subjects. But Big Data also enables disconcertingly close surveillance of its users. The questions we ask of Big Data can be intensely revealing, but, paradoxically, protecting subjects' privacy can require spying on users. Big Data is an ideology of technology, used to justify the centralization of information and power in data barons, pushing both subjects and users into a kind of feudal subordination. This short and polemical essay uses the Bloomberg Terminal scandal as a window to illuminate Big Data's other privacy problem.

    A realisation of ethical concerns with smartphone personal health monitoring apps

    The pervasiveness of smartphones has facilitated a new way in which device owners can monitor their health using applications (apps) installed on their smartphones. Smartphone personal health monitoring (SPHM) apps collect and store health-related data of the user either locally or in a third-party storage mechanism. They are also capable of giving feedback to the user in response to conditions that are provided to the app, thereby empowering the user to actively make decisions to adjust their lifestyle. Regardless of the benefits that this innovative technology offers, there are ethical concerns for users of SPHM apps. These ethical concerns are in some way connected to the features of SPHM apps. From a literature survey, this paper attempts to identify ethical issues with personal health monitoring apps on smartphones, viewed in light of the general ethics of ubiquitous computing. The paper argues that there are ethical concerns with the use of SPHM apps regardless of the benefits the technology offers, because the ubiquity of SPHM apps leaves them open to known and emerging ethical concerns. The paper then proposes further empirical research to validate this claim.

    The Complexities of Developing a Personal Code of Ethics for Learning Analytics Practitioners: Implications for Institutions and the Field

    In this paper we explore the potential role, value and utility of a personal code of ethics (COE) for learning analytics practitioners, and in particular we consider whether such a COE might usefully mediate individual actions and choices in relation to a more abstract institutional COE. While several institutional COEs now exist, little attention has been paid to detailing the ethical responsibilities of individual practitioners. To investigate the problems associated with developing and implementing a personal COE, we drafted an LA Practitioner COE based on other professional codes, and invited feedback from a range of learning analytics stakeholders and practitioners: ethicists, students, researchers and technology executives. Three main themes emerged from their reflections: (1) a need to balance real-world demands with abstract principles; (2) the limits to individual accountability within the learning analytics space; and (3) the continuing value of debate around an aspirational code of ethics within the field of learning analytics.

    Privacy Attitudes among Early Adopters of Emerging Health Technologies.

    Introduction: Advances in health technology such as genome sequencing and wearable sensors now allow for the collection of highly granular personal health data from individuals. It is unclear how people think about privacy in the context of these emerging health technologies. An open question is whether early adopters of these advances conceptualize privacy in different ways than non-early adopters. Purpose: This study sought to understand privacy attitudes of early adopters of emerging health technologies. Methods: Transcripts from in-depth, semi-structured interviews with early adopters of genome sequencing and health devices and apps were analyzed with a focus on participant attitudes and perceptions of privacy. Themes were extracted using inductive content analysis. Results: Although interviewees were willing to share personal data to support scientific advancements, they still expressed concerns, as well as uncertainty about who has access to their data, and for what purpose. In short, they were not dismissive of privacy risks. Key privacy-related findings are organized into four themes as follows: first, personal data privacy; second, control over personal information; third, concerns about discrimination; and fourth, contributing personal data to science. Conclusion: Early adopters of emerging health technologies appear to have more complex and nuanced conceptions of privacy than might be expected based on their adoption of personal health technologies and participation in open science. Early adopters also voiced uncertainty about the privacy implications of their decisions to use new technologies and share their data for research. Though not representative of the general public, studies of early adopters can provide important insights into evolving attitudes toward privacy in the context of emerging health technologies and personal health data research.

    New Business Models for Think Tanks


    Big Data Privacy Context: Literature Effects On Secure Informational Assets

    This article's objective is the identification of research opportunities in the current big data privacy domain, evaluating literature effects on secure informational assets. Until now, no study has analyzed such a relation. Its results can foster science, technologies and businesses. To achieve these objectives, a big data privacy Systematic Literature Review (SLR) is performed on the main scientific peer-reviewed journals in the Scopus database. Bibliometrics and text mining analysis complement the SLR. This study provides support to big data privacy researchers on: most and least researched themes, research novelty, most cited works and authors, theme evolution through time, and many others. In addition, TOPSIS and VIKOR ranks were developed to evaluate literature effects versus informational asset indicators. Secure Internet Servers (SIS) was chosen as the decision criterion. Results show that the big data privacy literature is strongly focused on computational aspects. However, individuals, societies, organizations and governments face a technological change that has just started to be investigated, with growing concerns over law and regulation. TOPSIS and VIKOR ranks differed in several positions, and the only country consistent between the literature and SIS adoption is the United States. Countries in the lowest ranking positions represent future research opportunities.
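The abstract above names TOPSIS as one of its two ranking methods. For readers unfamiliar with it, the following is a minimal, self-contained sketch of the standard TOPSIS procedure (vector normalization, weighting, distance to ideal best/worst, closeness coefficient). The decision matrix, weights, and criteria below are hypothetical illustrations, not the paper's actual data or indicators.

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS (closeness to the ideal solution).

    matrix  -- rows are alternatives, columns are criteria
    weights -- one weight per criterion, summing to 1
    benefit -- True if higher is better for that criterion, else False
    """
    n_crit = len(weights)
    # Vector-normalize each criterion column, then apply the weight.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    V = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    # Ideal best and worst values per criterion (direction depends on type).
    best = [max(r[j] for r in V) if benefit[j] else min(r[j] for r in V)
            for j in range(n_crit)]
    worst = [min(r[j] for r in V) if benefit[j] else max(r[j] for r in V)
             for j in range(n_crit)]

    def dist(row, ref):
        return math.sqrt(sum((row[j] - ref[j]) ** 2 for j in range(n_crit)))

    # Closeness coefficient: 1.0 = ideal best, 0.0 = ideal worst.
    return [dist(r, worst) / (dist(r, best) + dist(r, worst)) for r in V]

# Hypothetical example: four countries scored on three benefit criteria
# (e.g. publication counts and an adoption indicator), weighted 0.4/0.3/0.3.
scores = topsis(
    [[7, 9, 9], [8, 7, 8], [9, 6, 8], [6, 7, 8]],
    [0.4, 0.3, 0.3],
    [True, True, True],
)
```

Alternatives are then ranked by descending closeness score; VIKOR follows a similar weighting-and-distance scheme but uses a different aggregation, which is why the two rankings can disagree in several positions, as the abstract reports.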