    Who is Reading Whom Now: Privacy in Education from Books to MOOCs

    This Article is the most comprehensive study to date of the policy issues and privacy concerns arising from the surge of ed tech innovation. It surveys the burgeoning market of ed tech solutions, which range from free Android and iPhone apps to comprehensive learning management systems and digitized curricula delivered via the Internet. It discusses the deployment of big data analytics by education institutions to enhance student performance, evaluate teachers, improve education techniques, customize programs, and better leverage scarce resources to optimize education results. This Article seeks to untangle ed tech privacy concerns from the broader policy debates surrounding standardization, the Common Core, longitudinal data systems, and the role of business in education. It unpacks the meaning of commercial data uses in schools, distinguishing between behavioral advertising to children and providing comprehensive, optimized education solutions to students, teachers, and school systems. It addresses privacy problems related to small data (the individualization enabled by optimization solutions that read students even as they read their books) as well as concerns about big data analysis and measurement, including algorithmic biases, discreet discrimination, narrowcasting, and chilling effects. This Article proposes solutions ranging from deployment of traditional privacy tools, such as contractual and organizational governance mechanisms, to greater data literacy by teachers and parental involvement. It advocates innovative technological solutions, including converting student data to a parent-accessible feature and enhancing algorithmic transparency to shed light on the inner workings of the machine. For example, individually curated data backpacks would empower students and their parents by providing them with comprehensive portable profiles to facilitate personalized learning regardless of where they go. This Article builds on a methodology developed in the authors' previous work to balance big data rewards against privacy risks, while complying with several layers of federal and state regulation.
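
    The article does not prescribe a technical format for these "data backpacks"; as a rough illustration only, the Python sketch below models a portable, parent-accessible student profile that can be exported and carried between schools. All field names, identifiers, and the JSON serialization are hypothetical assumptions, not details drawn from the article.

```python
# Hypothetical sketch of an individually curated "data backpack": a portable,
# parent-accessible student profile. Field names and the export format are
# illustrative assumptions, not a specification from the article.
import json
from dataclasses import dataclass, field, asdict
from typing import List


@dataclass
class LearningRecord:
    course: str        # e.g. "Algebra I"
    skill: str         # e.g. "linear equations"
    mastery: float     # 0.0-1.0, however the source system scores it
    assessed_on: str   # ISO date string


@dataclass
class DataBackpack:
    student_id: str                                        # pseudonymous identifier
    guardians: List[str] = field(default_factory=list)     # parents/guardians with access
    records: List[LearningRecord] = field(default_factory=list)

    def export(self) -> str:
        """Serialize the profile so it can move with the student between schools."""
        return json.dumps(asdict(self), indent=2)


backpack = DataBackpack(student_id="stu-4821", guardians=["parent@example.org"])
backpack.records.append(
    LearningRecord(course="Algebra I", skill="linear equations",
                   mastery=0.82, assessed_on="2014-05-01")
)
print(backpack.export())
```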

    Big Data for All: Privacy and User Control in the Age of Analytics

    We live in an age of “big data.” Data have become the raw material of production, a new source of immense economic and social value. Advances in data mining and analytics and the massive increase in computing power and data storage capacity have expanded by orders of magnitude the scope of information available to businesses and government. Data are now available for analysis in raw form, escaping the confines of structured databases and enhancing researchers’ abilities to identify correlations and conceive of new, unanticipated uses for existing information. In addition, the increasing number of people, devices, and sensors that are now connected by digital networks has revolutionized the ability to generate, communicate, share, and access data. Data create enormous value for the world economy, driving innovation, productivity, efficiency, and growth. At the same time, the “data deluge” presents privacy concerns that could stir a regulatory backlash, dampening the data economy and stifling innovation. In order to strike a balance between beneficial uses of data and individual privacy, policymakers must address some of the most fundamental concepts of privacy law, including the definition of “personally identifiable information,” the role of individual control, and the principles of data minimization and purpose limitation. This article emphasizes the importance of providing individuals with access to their data in a usable format. This will let individuals share the wealth created by their information and incentivize developers to offer user-side features and applications harnessing the value of big data. Where individual access to data is impracticable, data are likely to be de-identified to an extent sufficient to diminish privacy concerns. In addition, since in a big data world it is often not the data but rather the inferences drawn from them that give cause for concern, organizations should be required to disclose their decisional criteria.
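
    As a rough illustration of the de-identification the abstract refers to, the minimal Python sketch below replaces a direct identifier with a salted hash and coarsens quasi-identifiers before release. The salted-hash and bucketing choices are assumptions made for the example, not a method proposed by the article.

```python
# Minimal sketch of de-identifying a record before release or analysis.
# The specific transformations are illustrative assumptions only.
import hashlib

SALT = "replace-with-a-secret-salt"   # kept separate from any released dataset


def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a salted hash."""
    return hashlib.sha256((SALT + identifier).encode("utf-8")).hexdigest()[:16]


def generalize_age(age: int) -> str:
    """Coarsen an exact age into a ten-year band."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"


record = {"email": "alice@example.com", "age": 34, "zip": "10027", "purchases": 17}
released = {
    "user": pseudonymize(record["email"]),     # no direct identifier leaves the organization
    "age_band": generalize_age(record["age"]),
    "zip3": record["zip"][:3],                 # truncated ZIP lowers re-identification risk
    "purchases": record["purchases"],          # the analytically useful value is retained
}
print(released)
```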

    Beyond IRBs: Ethical Guidelines for Data Research

    Taming The Golem: Challenges of Ethical Algorithmic Decision-Making

    The Ethics of Student Privacy: Building Trust for Ed Tech (IRIE International Review of Information Ethics, Vol. 21, 07/2014)

    Abstract: This article analyzes the opportunities and risks of data-driven education technologies (ed tech). It discusses the deployment of data technologies by education institutions to enhance student performance, evaluate teachers, improve education techniques, customize programs, devise financial assistance plans, and better leverage scarce resources to assess and optimize education results. Critics fear that ed tech could introduce new risks of privacy infringement, narrowcasting, and discrimination; fuel the stratification of society by channeling "winners" to a "Harvard track" and "losers" to a "bluer collar" track; and overly limit the right to fail, struggle, and learn through experimentation. The article argues that, together with teachers, parents, and students, schools and vendors must establish a trust framework to facilitate the adoption of data-driven ed tech. Enhanced transparency around institutions' data use philosophy and ethical guidelines, together with novel methods of data "featurization," will achieve far more than formalistic notices and contractual legalese.
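
    The article does not define data "featurization"; one plausible reading is the conversion of raw ed tech event logs into a small set of interpretable features that a parent or teacher could inspect. The Python sketch below assumes a hypothetical event schema and feature names purely for illustration.

```python
# Illustrative sketch of data "featurization": summarizing raw ed tech events
# as a few human-readable features. The event schema is invented for this example.
from collections import Counter
from typing import Dict, List

events = [  # raw clickstream a learning platform might log
    {"student": "stu-4821", "type": "quiz_attempt", "topic": "fractions", "correct": True},
    {"student": "stu-4821", "type": "quiz_attempt", "topic": "fractions", "correct": False},
    {"student": "stu-4821", "type": "video_view", "topic": "fractions", "minutes": 12},
]


def featurize(student: str, log: List[Dict]) -> Dict[str, object]:
    """Summarize one student's raw events as a few interpretable features."""
    mine = [e for e in log if e["student"] == student]
    attempts = [e for e in mine if e["type"] == "quiz_attempt"]
    accuracy = sum(e["correct"] for e in attempts) / len(attempts) if attempts else None
    minutes = sum(e.get("minutes", 0) for e in mine if e["type"] == "video_view")
    topics = Counter(e["topic"] for e in mine)
    return {
        "quiz_accuracy": accuracy,          # share of correct quiz attempts
        "video_minutes": minutes,           # total instructional video time
        "most_practiced_topic": topics.most_common(1)[0][0] if topics else None,
    }


print(featurize("stu-4821", events))
```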