    An Empirical Investigation of Internet Privacy: Customer Behaviour, Companies’ Privacy Policy Disclosures, and a Gap

    Privacy emerges as a critical issue in an e-commerce environment because of a fundamental tension among corporate, consumer, and government interests. By reviewing prior Internet-privacy research in the fields of information systems, business, and marketing published between 1995 and 2006, we consider the following research questions: 1) how an individual’s privacy behaviour is affected by privacy policy disclosures and by the level of the individual’s involvement regarding the sensitivity of personal information; 2) how companies’ privacy policies vary with respect to regulatory approaches and cultural values; and 3) whether there is a gap between the privacy practices valued by individuals and those emphasized by companies. A three-stage study is conducted to answer these questions. The first two stages, consisting of a Web-based survey and an online ordering experiment with 210 participants, found that individuals are more likely to read the privacy policy statements posted on Web sites, and less likely to provide personal information, in a high privacy-involvement situation than in a low privacy-involvement situation. However, the existence of a privacy seal did not affect individuals’ behaviour, regardless of involvement conditions. This study also found a gap between self-reported privacy behaviour and actual privacy behaviour. When individuals were asked to provide personal information, their reading of privacy policy statements was close to their self-reported behaviour; however, their actual provision of personal information differed from what they had reported. The third stage, which entailed the study of 420 privacy policies spanning six countries and two industries, showed that privacy policies vary across countries, as well as with varying governmental involvement and cultural values in those countries. Finally, the analysis of all three stages revealed a gap between individuals’ importance ratings of companies’ privacy practices and the policies that companies emphasize in their privacy disclosures.

    Protecting Information Privacy

    This report for the Equality and Human Rights Commission (the Commission) examines the threats to information privacy that have emerged in recent years, focusing on the activities of the state. It argues that current privacy laws and regulation do not adequately uphold human rights, and that fundamental reform is required. It identifies two principal areas of concern: the state’s handling of personal data, and the use of surveillance by public bodies. The central finding of this report is that the existing approach to the protection of information privacy in the UK is fundamentally flawed, and that there is a pressing need for widespread legislative reform to ensure that the rights contained in Article 8 of the European Convention on Human Rights are respected. The report argues for the establishment of a number of key ‘privacy principles’ that can be used to guide future legal reforms and the development of sector-specific regulation. The right to privacy is at risk of being eroded by the growing demand for information by government and the private sector. Unless we start to reform the law and build a regulatory system capable of protecting information privacy, we may soon find that it is a thing of the past.

    Online Personal Data Processing and EU Data Protection Reform. CEPS Task Force Report, April 2013

    This report sheds light on the fundamental questions and underlying tensions between current policy objectives, compliance strategies, and global trends in online personal data processing, assessing the existing and future framework in terms of effective regulation and public policy. Based on the discussions among the members of the CEPS Digital Forum and independent research carried out by the rapporteurs, policy conclusions are derived with the aim of making EU data protection policy more fit for purpose in today’s online technological context. This report constructively engages with the EU data protection framework, but does not provide a textual analysis of the EU data protection reform proposal as such.

    Privacy, Public Goods, and the Tragedy of the Trust Commons: A Response to Professors Fairfield and Engel

    User trust is an essential resource for the information economy. Without it, users would not provide their personal information and digital businesses could not operate. Digital companies do not protect this trust sufficiently. Instead, many take advantage of it for short-term gain. They act in ways that, over time, will undermine user trust. In so doing, they act against their own best interest. This Article shows that companies behave this way because they face a tragedy of the commons. When a company takes advantage of user trust for profit, it appropriates the full benefit of this action. However, it shares the cost with all other companies that rely on the wellspring of user trust. Each company, acting rationally, has an incentive to appropriate as much of the trust resource as it can. That is why such companies collect, analyze, and “monetize” our personal information in such an unrestrained way. This behavior poses a longer-term risk. User trust is like a fishery. It can withstand a certain level of exploitation and renew itself, but over-exploitation can cause it to collapse. Were digital companies collectively to undermine user trust, this would not only hurt the users but would also damage the companies themselves. This Article explores commons-management theory for potential solutions to this impending tragedy of the trust commons.

    PRIMA — Privacy research through the perspective of a multidisciplinary mash up

    Based on a summary description of privacy protection research within three fields of inquiry, viz. the social sciences, legal science, and computer and systems sciences, we discuss multidisciplinary approaches with regard to the difficulties and risks that they entail, as well as their possible advantages. The latter include the identification of relevant perspectives on privacy, increased expressiveness in the formulation of research goals, opportunities for improved research methods, and a boost in the utility of invested research efforts.

    Configuring the Networked Citizen

    Among legal scholars of technology, it has become commonplace to acknowledge that the design of networked information technologies has regulatory effects. For the most part, that discussion has been structured by the taxonomy developed by Lawrence Lessig, which classifies code as one of four principal regulatory modalities, alongside law, markets, and norms. As a result of that framing, questions about the applicability of constitutional protections to technical decisions have taken center stage in legal and policy debates. Some scholars have pondered whether digital architectures unacceptably constrain fundamental liberties, and what public design obligations might follow from such a conclusion. Others have argued that code belongs firmly on the private side of the public/private divide because it originates in the innovative activity of private actors. In a forthcoming book, the author argues that the project of situating code within one or another part of the familiar constitutional landscape too often distracts legal scholars from more important questions about the quality of the regulation that networked digital architectures produce. The gradual, inexorable embedding of networked information technologies has the potential to alter, in largely invisible ways, the interrelated processes of subject formation and culture formation. Within legal scholarship, the prevailing conceptions of subjectivity tend to be highly individualistic, oriented around the activities of speech and voluntary affiliation. Subjectivity also tends to be understood as definitionally independent of culture. Yet subjectivity is importantly collective, formed by the substrate within which individuality emerges. People form their conceptions of the good in part by reading, listening, and watching—by engaging with the products of a common culture—and by interacting with one another. Those activities are socially and culturally mediated, shaped by the preexisting communities into which individuals are born and within which they develop. They are also technically mediated, shaped by the artifacts that individuals encounter in common use. The social and cultural patterns that mediate the activities of self-constitution are being reconfigured by the pervasive adoption of technical protocols and services that manage the activities of content delivery, search, and social interaction. In developed countries, a broad cross-section of the population routinely uses networked information technologies and communications devices in hundreds of mundane, unremarkable ways. We search for information, communicate with each other, and gain access to networked resources and services. For the most part, as long as our devices and technologies work as expected, we give little thought to how they work; those questions are understood to be technical questions. Such questions are better characterized as sociotechnical. As networked digital architectures increasingly mediate the ordinary processes of everyday life, they catalyze gradual yet fundamental social and cultural change. This chapter—originally published in Imagining New Legalities: Privacy and Its Possibilities in the 21st Century, edited by Austin Sarat, Lawrence Douglas, and Martha Merrill Umphrey (2012)—considers two interrelated questions that flow from understanding sociotechnical change as (re)configuring networked subjects. First, it revisits the way that legal and policy debates locate networked information technologies with respect to the public/private divide. The design of networked information technologies and communications devices is conventionally treated as a private matter; indeed, that designation has been the principal stumbling block encountered by constitutional theorists of technology. The classification of code as presumptively private has effects that reach beyond debates about the scope of constitutional guarantees, shaping views about the extent to which regulation of technical design decisions is normatively desirable. This chapter reexamines that discursive process, using lenses supplied by literatures on third-party liability and governance. Second, this chapter considers the relationship between sociotechnical change and understandings of citizenship. The ways that people think, form beliefs, and interact with one another are centrally relevant to the sorts of citizens that they become. The gradual embedding of networked information technologies into the practice of everyday life therefore has important implications for both the meaning and the practice of citizenship in the emerging networked information society. If design decisions are neither merely technical nor presumptively private, then they should be subject to more careful scrutiny with regard to the kind of citizen they produce. In particular, policy-makers cannot avoid engaging with the particular values that are encoded.

    The promotion of data sharing in pharmacoepidemiology

    This article addresses the role of pharmacoepidemiology in patient safety and the crucial role of data sharing in ensuring that such activities occur. Against the backdrop of proposed reforms of European data protection legislation, it considers whether the current legislative landscape adequately facilitates this essential data sharing. It is argued that, rather than maximising and promoting the benefits of such activities by facilitating data sharing, the current and proposed legislative landscapes hamper these vital activities. The article posits that current and proposed data protection approaches to pharmacoepidemiology — and, more broadly, to re-uses of data — should be reoriented towards enabling these important safety-enhancing activities. Two potential solutions are offered: 1) a dedicated working party on data reuse for health research, and 2) the introduction of new, dedicated legislation.

    Lex Informatica: The Formulation of Information Policy Rules through Technology

    Historically, law and government regulation have established default rules for information policy, including constitutional rules on freedom of expression and statutory rights of ownership of information. This Article will show that for network environments and the Information Society, however, law and government regulation are not the only source of rule-making. Technological capabilities and system design choices impose rules on participants. The creation and implementation of information policy are embedded in network designs and standards as well as in system configurations. Even user preferences and technical choices create overarching, local default rules. This Article argues, in essence, that the set of rules for information flows imposed by technology and communication networks forms a “Lex Informatica” that policymakers must understand, consciously recognize, and encourage.