    Quantum surveillance and 'shared secrets'. A biometric step too far? CEPS Liberty and Security in Europe, July 2010

    It is no longer sensible to regard biometrics as having neutral socio-economic, legal and political impacts. Newer generation biometrics are fluid and include behavioural and emotional data that can be combined with other data. Therefore, a range of issues needs to be reviewed in light of the increasing privatisation of ‘security’ that escapes effective, democratic parliamentary and regulatory control and oversight at national, international and EU levels, argues Juliet Lodge, Professor and co-Director of the Jean Monnet European Centre of Excellence at the University of Leeds, UK.

    Online privacy: towards informational self-determination on the internet : report from Dagstuhl Perspectives Workshop 11061

    The Dagstuhl Perspectives Workshop "Online Privacy: Towards Informational Self-Determination on the Internet" (11061) was held on February 6-11, 2011 at Schloss Dagstuhl. Thirty participants from academia, the public sector, and industry identified the current state of the art of, and challenges for, online privacy, and derived recommendations for improving it. Whereas the Dagstuhl Manifesto of this workshop summarises the results of the working groups and panel discussions, this article presents the workshop's talks through their abstracts.

    Privacy & law enforcement

    Regulating Data as Property: A New Construct for Moving Forward

    The global community urgently needs precise, clear rules that define ownership of data and express the attendant rights to license, transfer, use, modify, and destroy digital information assets. In response, this article proposes a new approach for regulating data as an entirely new class of property. Recently, European and Asian public officials and industries have called for data ownership principles to be developed, above and beyond current privacy and data protection laws. In addition, official policy guidance and legal proposals have been published that offer to accelerate realization of a property rights structure for digital information. But how can ownership of digital information be achieved? How can those rights be transferred and enforced? Those calls for data ownership emphasize the impact of ownership on the automotive industry and the vast quantities of operational data which smart automobiles and self-driving vehicles will produce. We looked at how, if at all, the issue is being considered in consumer-facing statements addressing the data collected by those vehicles. To formulate our proposal, we also considered continued advances in scientific research, quantum mechanics, and quantum computing which confirm that information in any digital or electronic medium is, and always has been, physical, tangible matter. Yet, to date, data regulation has sought to adapt legal constructs for “intangible” intellectual property or to express a series of permissions and constraints tied to specific classifications of data (such as personally identifiable information). We examined legal reforms recently approved by the United Nations Commission on International Trade Law to enable transactions involving electronic transferable records, as well as prior reforms adopted in the United States Uniform Commercial Code and federal law to enable similar transactions involving digital records that were, historically, physical assets (such as promissory notes or chattel paper). Finally, we surveyed prior academic scholarship in the U.S. and Europe to determine whether the physical attributes of digital data had been previously considered in the vigorous debates on how to regulate personal information, or the extent, if at all, to which the solutions developed for transferable records had been considered for larger classes of digital assets. Based on the preceding, we propose that regulation of digital information assets, and clear concepts of ownership, can be built on existing legal constructs that have enabled electronic commercial practices. We propose a property rules construct that clearly establishes that a right to own digital information arises upon creation (whether by keystroke or machine), and suggest when and how that right attaches to specific data through the exercise of technological controls. This construct will enable faster, better adaptations of new rules for the ever-evolving portfolio of data assets being created around the world. This approach will also create more predictable, scalable, and extensible mechanisms for regulating data and is consistent with, and may improve the exercise and enforcement of, rights regarding personal information. We conclude by highlighting existing technologies and their potential to support this construct, and begin an inventory of the steps necessary to proceed further with this process.
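
    As a purely illustrative sketch of the kind of technological control the proposal alludes to (the function name, fields, and data below are assumptions, not drawn from the article), the following Python snippet records a content hash, a creator identifier, and a timestamp at the moment data is created, giving a concrete hook to which an ownership right could attach.

        # Illustrative sketch only; not the article's mechanism. A minimal "ownership
        # record" binds a creator identifier and creation time to the data's content
        # hash, so a later ownership claim can be checked against the data itself.
        import hashlib
        import json
        from datetime import datetime, timezone

        def register_asset(data: bytes, creator_id: str) -> dict:
            """Return a minimal ownership record bound to the data's content hash."""
            return {
                "content_sha256": hashlib.sha256(data).hexdigest(),
                "creator": creator_id,
                "created_at": datetime.now(timezone.utc).isoformat(),
            }

        # Hypothetical example: a record for one machine-generated sensor reading.
        record = register_asset(b"speed=42.7,ts=2021-01-01T00:00:00Z", creator_id="vehicle-123")
        print(json.dumps(record, indent=2))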

    Big data for monitoring educational systems

    This report considers “how advances in big data are likely to transform the context and methodology of monitoring educational systems within a long-term perspective (10-30 years) and impact the evidence based policy development in the sector”, where big data are understood as “large amounts of different types of data produced with high velocity from a high number of various types of sources.” Five independent experts were commissioned by Ecorys, responding to the themes of students' privacy, educational equity and efficiency, student tracking, assessment and skills. The experts were asked to consider the “macro perspective on governance on educational systems at all levels from primary, secondary education and tertiary – the latter covering all aspects of tertiary from further, to higher, and to VET”, prioritising the primary and secondary levels of education.

    Lex Informatica: The Formulation of Information Policy Rules through Technology

    Historically, law and government regulation have established default rules for information policy, including constitutional rules on freedom of expression and statutory rights of ownership of information. This Article will show that for network environments and the Information Society, however, law and government regulation are not the only source of rule-making. Technological capabilities and system design choices impose rules on participants. The creation and implementation of information policy are embedded in network designs and standards as well as in system configurations. Even user preferences and technical choices create overarching, local default rules. This Article argues, in essence, that the set of rules for information flows imposed by technology and communication networks forms a “Lex Informatica” that policymakers must understand, consciously recognize, and encourage.
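
    To make the idea of technically embedded policy concrete, here is a small, assumed example (not taken from the Article): a web server whose response headers unilaterally set information-flow rules that browsers, rather than courts, enforce.

        # Assumed illustration of "Lex Informatica": HTTP response headers act as
        # policy defaults that constrain information flows for every visitor, set by
        # system design rather than by statute.
        from http.server import BaseHTTPRequestHandler, HTTPServer

        class PolicyHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                self.send_response(200)
                # Each header is a rule enforced by the browser, not by law.
                self.send_header("Referrer-Policy", "no-referrer")        # do not leak the referring URL
                self.send_header("Permissions-Policy", "geolocation=()")  # deny location access to scripts
                self.send_header("Set-Cookie", "session=abc; SameSite=Strict; Secure; HttpOnly")
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(b"information policy set by design\n")

        if __name__ == "__main__":
            HTTPServer(("localhost", 8000), PolicyHandler).serve_forever()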

    Algorithmic Jim Crow

    This Article contends that current immigration- and security-related vetting protocols risk promulgating an algorithmically driven form of Jim Crow. Under the “separate but equal” discrimination of a historic Jim Crow regime, state laws required mandatory separation and discrimination on the front end, while purportedly establishing equality on the back end. In contrast, an Algorithmic Jim Crow regime allows for “equal but separate” discrimination. Under Algorithmic Jim Crow, equal vetting and database screening of all citizens and noncitizens will make it appear that fairness and equality principles are preserved on the front end. Algorithmic Jim Crow, however, will enable discrimination on the back end in the form of designing, interpreting, and acting upon vetting and screening systems in ways that result in a disparate impact.
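
    The back-end effect the Article describes can be quantified. The sketch below is a hypothetical illustration (the screening counts and the use of the four-fifths rule are assumptions, not the Article's method) showing how a rule applied equally to everyone can still yield a disparate impact.

        # Hypothetical illustration: a vetting rule applied identically to all groups
        # can still produce a disparate impact, here flagged with the common
        # four-fifths (80%) rule. The counts are invented for the example.
        outcomes = {
            # group: (cleared by screening, total screened)
            "group_a": (90, 100),
            "group_b": (60, 100),
        }

        rates = {group: cleared / screened for group, (cleared, screened) in outcomes.items()}
        highest = max(rates.values())

        for group, rate in rates.items():
            ratio = rate / highest
            flag = "potential disparate impact" if ratio < 0.8 else "within the four-fifths rule"
            print(f"{group}: clearance rate {rate:.2f}, ratio {ratio:.2f} -> {flag}")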

    Privacy as a Public Good

    Privacy is commonly studied as a private good: my personal data is mine to protect and control, and yours is yours. This conception of privacy misses an important component of the policy problem. An individual who is careless with data exposes not only extensive information about herself, but about others as well. The negative externalities imposed on nonconsenting outsiders by such carelessness can be productively studied in terms of welfare economics. If all relevant individuals maximize private benefit, and expect all other relevant individuals to do the same, neoclassical economic theory predicts that society will achieve a suboptimal level of privacy. This prediction holds even if all individuals cherish privacy with the same intensity. As the theoretical literature would have it, the struggle for privacy is destined to become a tragedy. But according to the experimental public-goods literature, there is hope. Like in real life, people in experiments cooperate in groups at rates well above those predicted by neoclassical theory. Groups can be aided in their struggle to produce public goods by institutions, such as communication, framing, or sanction. With these institutions, communities can manage public goods without heavy-handed government intervention. Legal scholarship has not fully engaged this problem in these terms. In this Article, we explain why privacy has aspects of a public good, and we draw lessons from both the theoretical and the empirical literature on public goods to inform the policy discourse on privacy
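
    The free-rider prediction can be made concrete with a linear public-goods game. The sketch below uses assumed parameters (group size, endowment, and marginal per-capita return are illustrative choices, not taken from the Article) to show why self-interested players are predicted to under-provide the public good.

        # Illustrative linear public-goods game (parameters are assumptions). With
        # 1/n < mpcr < 1, contributing nothing is each player's dominant strategy even
        # though full contribution maximizes group welfare -- the "tragedy" above.
        def payoff(own, others, endowment=10.0, mpcr=0.4):
            """Keep what you don't contribute, plus your share of the public good."""
            return (endowment - own) + mpcr * (own + sum(others))

        n, endowment = 4, 10.0

        all_free_ride = payoff(0.0, [0.0] * (n - 1))               # 10.0: the predicted equilibrium
        all_contribute = payoff(endowment, [endowment] * (n - 1))  # 16.0: the social optimum
        lone_free_rider = payoff(0.0, [endowment] * (n - 1))       # 22.0: why defection pays

        print(all_free_ride, all_contribute, lone_free_rider)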