
    The European Approach to Privacy

    This paper critically assesses the character of European (Union's) privacy law and policy in the field of online media and electronic communications. Contrary to current understanding, this field of law is more fragmented and ill-developed than is often assumed, in particular by those discussing privacy law and policy in an international and transatlantic context. In fact, some of the most challenging regulatory issues in the field of online media and electronic communications still lack a well-developed common European approach and remain the subject of regulation at the level of the different member states of the European Union. Drawing on historical insights, the paper shows how EU policy making in the field of privacy and data protection is and remains strongly influenced by the EU institutional setting. In particular, the paper shows that the specific substantive outcome of European privacy law and policy is strongly influenced by and can only be understood properly through the lens of the ongoing project of European integration more generally. The paper will develop its main thesis by focusing on three important and current privacy issues and their treatment by EU lawmakers and the EU legal system. These are: (I.) the question of retention of communications meta-data (e.g. traffic and location data) in the field of electronic communications; (II.) the legal framework for liability of search engines for privacy and reputational harms in the online environment, including a 'right to be forgotten'; and (III.) the question of the security of and the potential lawful access by foreign governments to data in the cloud. After discussing these substantive privacy policy issues and the legal frameworks that have developed (and are developing) to address them at the EU level, the paper will analyze these frameworks in view of the apparent interplay between the substance of privacy law and policy at the EU level on the one hand and the broader constitutional and institutional dynamics related to EU competence and integration on the other. The paper starts with a discussion of the basic underlying motivations, rationales and competences for addressing privacy issues at the European level, which until recently were predominantly economic in nature. The implication of this is that some of the most pressing data privacy issues, which are primarily non-economic in character, have been addressed at the fringes of what could be called the European approach to data privacy, in which the establishment of a functioning European internal market and the free flow of personal data under sufficient safeguards relating to data privacy are the dominant concerns. More recently, with the adoption of the Lisbon Treaty, the establishment of a binding right to data protection and privacy in the EU Charter and a new legal basis for the establishment of data protection rules at the EU level, EU privacy law and policy has become increasingly connected to the furtherance of the protection of privacy and data protection as fundamental rights more generally. Through the case studies in the paper, this dynamic of how policy rationales end up playing out at the EU level and inform the substance of the privacy policies adopted is illustrated in detail. In particular, the analysis shows how EU policy making tends to strive towards a common and comprehensive European approach, but typically fails to take account of some of the leading concerns, and is often simply not equipped or even allowed to include them in the process.
For instance, there is significant disagreement about the weight that should be attributed to freedom of expression concerns in the online environment, and the EU's role with respect to the media and the proper balancing of freedom and privacy in the media remains limited. With respect to national security concerns, there is no European harmonization of national approaches at all. The result is that important policy concerns from the perspective of privacy in electronic communications end up being addressed indirectly, inefficiently and incompletely, through European data privacy frameworks that may aspire to be comprehensive but would need significant reforms to achieve this aim. The article will discuss possible reforms but will warn against aspirations of further harmonization and unification of European privacy law. In the absence of fundamental institutional reform of the EU, further harmonization could end up being detrimental to other important policy goals currently addressed largely outside of the EU legal framework, including the issues of media freedom, criminal procedural justice and the protection of privacy and information security in relation to foreign intelligence agencies specifically discussed in this paper.

    Intergenerational Community-Based Research and Creative Practice: Promoting Environmental Sustainability in Jinja, Uganda

    This article critically reflects on the methodological approach developed for a recent project based in Jinja, Uganda, that sought to generate new forms of environmental knowledge and action utilizing diverse forms of creative intergenerational practice embedded within a broader framework of community-based participatory research. This approach provided new opportunities for intergenerational dialogue in Jinja, generated increased civic environmental engagement, and resulted in a participant-led campaign to share knowledge regarding sustainable biomass consumption. We term this approach intergenerational community-based research and creative practice. We discuss the advantages of this model while also reflecting throughout on the challenges of the approach.

    Pre-Clinical Evaluation of a Replication-Competent Recombinant Adenovirus Serotype 4 Vaccine Expressing Influenza H5 Hemagglutinin

    Influenza virus remains a significant health and social concern in part because of newly emerging strains, such as avian H5N1 virus. We have developed a prototype H5N1 vaccine using a recombinant, replication-competent Adenovirus serotype 4 (Ad4) vector, derived from the U.S. military Ad4 vaccine strain, to express the hemagglutinin (HA) gene from A/Vietnam/1194/2004 influenza virus (Ad4-H5-Vtn). Our hypothesis is that a mucosally-delivered replicating Ad4-H5-Vtn recombinant vector will be safe and induce protective immunity against H5N1 influenza virus infection and disease pathogenesis. The Ad4-H5-Vtn vaccine was designed with a partial deletion of the E3 region of Ad4 to accommodate the influenza HA gene. Replication and growth kinetics of the vaccine virus in multiple human cell lines indicated that the vaccine virus is attenuated relative to the wild type virus. Expression of the HA transgene in infected cells was documented by flow cytometry, western blot analysis and induction of HA-specific antibody and cellular immune responses in mice. Of particular note, mice immunized intranasally with the Ad4-H5-Vtn vaccine were protected against lethal H5N1 reassortant viral challenge even in the presence of pre-existing immunity to the Ad4 wild type virus. Several non-clinical attributes of this vaccine, including safety, induction of HA-specific humoral and cellular immunity, and efficacy, were demonstrated using an animal model to support Phase 1 clinical trial evaluation of this new vaccine.

    Data from a pre-publication independent replication initiative examining ten moral judgement effects

    We present the data from a crowdsourced project seeking to replicate findings in independent laboratories before (rather than after) they are published. In this Pre-Publication Independent Replication (PPIR) initiative, 25 research groups attempted to replicate 10 moral judgment effects from a single laboratory's research pipeline of unpublished findings. The 10 effects were investigated using online/lab surveys containing psychological manipulations (vignettes) followed by questionnaires. Results revealed a mix of reliable, unreliable, and culturally moderated findings. Unlike any previous replication project, this dataset includes data not only from the replications but also from the original studies, creating a unique corpus that researchers can use to better understand reproducibility and irreproducibility in science.

    Overcoming leakage in scalable quantum error correction

    Leakage of quantum information out of computational states into higher energy states represents a major challenge in the pursuit of quantum error correction (QEC). In a QEC circuit, leakage builds over time and spreads through multi-qubit interactions. This leads to correlated errors that degrade the exponential suppression of logical error with scale, challenging the feasibility of QEC as a path towards fault-tolerant quantum computation. Here, we demonstrate the execution of a distance-3 surface code and distance-21 bit-flip code on a Sycamore quantum processor where leakage is removed from all qubits in each cycle. This shortens the lifetime of leakage and curtails its ability to spread and induce correlated errors. We report a ten-fold reduction in steady-state leakage population on the data qubits encoding the logical state and an average leakage population of less than $1 \times 10^{-3}$ throughout the entire device. The leakage removal process itself efficiently returns leakage population back to the computational basis, and adding it to a code circuit prevents leakage from inducing correlated error across cycles, restoring a fundamental assumption of QEC. With this demonstration that leakage can be contained, we resolve a key challenge for practical QEC at scale.
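
    The effect of per-cycle removal can be made concrete with a toy rate model, sketched below. The per-cycle leakage rate and the removal efficiencies are invented for illustration; they are not parameters reported in the paper.

```python
# Toy rate model of leakage build-up over QEC cycles. `gamma` and the
# removal efficiencies are hypothetical, chosen only to illustrate how
# the steady state is set by accumulation competing with removal.

def leakage_trajectory(gamma, removal, cycles=300):
    """Iterate p' = (1 - removal) * p + gamma * (1 - p); return final population."""
    p = 0.0
    for _ in range(cycles):
        p = (1.0 - removal) * p + gamma * (1.0 - p)
    return p

gamma = 2e-4  # hypothetical probability of leaking per qubit per cycle

# Without dedicated removal, only slow natural decay drains leaked states;
# with per-cycle removal the steady state drops to roughly gamma / removal.
p_slow = leakage_trajectory(gamma, removal=0.02)
p_fast = leakage_trajectory(gamma, removal=0.20)
print(f"steady-state leakage, natural decay only: {p_slow:.1e}")
print(f"steady-state leakage, per-cycle removal:  {p_fast:.1e}")
print(f"reduction factor: {p_slow / p_fast:.1f}x")  # ~10x for these choices
```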

    The pipeline project: Pre-publication independent replications of a single laboratory's research pipeline

    This crowdsourced project introduces a collaborative approach to improving the reproducibility of scientific research, in which findings are replicated in qualified independent laboratories before (rather than after) they are published. Our goal is to establish a non-adversarial replication process with highly informative final results. To illustrate the Pre-Publication Independent Replication (PPIR) approach, 25 research groups conducted replications of all ten moral judgment effects which the last author and his collaborators had “in the pipeline” as of August 2014. Six findings replicated according to all replication criteria, one finding replicated but with a significantly smaller effect size than the original, one finding replicated consistently in the original culture but not outside of it, and two findings failed to receive support. In total, 40% of the original findings failed at least one major replication criterion. Potential ways to implement and incentivize pre-publication independent replication on a large scale are discussed.
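
    As a quick check of the headline figure, the outcome counts stated in the abstract can be tallied directly (a trivial sketch; the category labels are paraphrased):

```python
# Tally of the replication outcomes stated in the abstract (labels paraphrased).
outcomes = {
    "replicated on all criteria": 6,
    "replicated, but smaller effect size": 1,
    "replicated in original culture only": 1,
    "failed to receive support": 2,
}
total = sum(outcomes.values())  # the ten pipeline effects
failed_any = total - outcomes["replicated on all criteria"]
print(f"{failed_any}/{total} = {failed_any / total:.0%} failed >= 1 major criterion")
# -> 4/10 = 40%, matching the figure in the abstract
```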

    Suppressing quantum errors by scaling a surface code logical qubit

    Practical quantum computing will require error rates that are well below what is achievable with physical qubits. Quantum error correction offers a path to algorithmically-relevant error rates by encoding logical qubits within many physical qubits, where increasing the number of physical qubits enhances protection against physical errors. However, introducing more qubits also increases the number of error sources, so the density of errors must be sufficiently low in order for logical performance to improve with increasing code size. Here, we report the measurement of logical qubit performance scaling across multiple code sizes, and demonstrate that our system of superconducting qubits has sufficient performance to overcome the additional errors from increasing qubit number. We find our distance-5 surface code logical qubit modestly outperforms an ensemble of distance-3 logical qubits on average, both in terms of logical error probability over 25 cycles and logical error per cycle ($2.914\% \pm 0.016\%$ compared to $3.028\% \pm 0.023\%$). To investigate damaging, low-probability error sources, we run a distance-25 repetition code and observe a $1.7\times10^{-6}$ logical error per round floor set by a single high-energy event ($1.6\times10^{-7}$ when excluding this event). We are able to accurately model our experiment, and from this model we can extract error budgets that highlight the biggest challenges for future systems. These results mark the first experimental demonstration where quantum error correction begins to improve performance with increasing qubit number, illuminating the path to reaching the logical error rates required for computation.
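
    The two reported per-cycle error rates imply an error-suppression factor Lambda only slightly above one, which is what "modestly outperforms" means quantitatively. A back-of-the-envelope sketch using the standard scaling ansatz; the extrapolation to larger distances is naive, assumes Lambda stays constant with scale, and is not a claim made by the paper:

```python
# Error-suppression factor implied by the reported logical error per cycle.
# Standard scaling ansatz: eps_d ~ A * Lambda ** (-(d + 1) / 2), so
# Lambda = eps_3 / eps_5 for this pair of code distances.
eps_3 = 0.03028  # distance-3 ensemble, logical error per cycle (reported)
eps_5 = 0.02914  # distance-5 code, logical error per cycle (reported)

lam = eps_3 / eps_5
print(f"Lambda ~ {lam:.3f}")  # ~1.04: just past break-even

# Naive projection to larger distances, only to show how weakly errors
# are suppressed when Lambda is barely above 1.
for d in (7, 9, 11):
    print(f"projected eps_{d} ~ {eps_5 * lam ** (-(d - 5) / 2):.4f}")
```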

    Measurement-induced entanglement and teleportation on a noisy quantum processor

    Measurement has a special role in quantum theory: by collapsing the wavefunction it can enable phenomena such as teleportation and thereby alter the "arrow of time" that constrains unitary evolution. When integrated into many-body dynamics, measurements can lead to emergent patterns of quantum information in space-time that go beyond established paradigms for characterizing phases, either in or out of equilibrium. On present-day NISQ processors, the experimental realization of this physics is challenging due to noise, hardware limitations, and the stochastic nature of quantum measurement. Here we address each of these experimental challenges and investigate measurement-induced quantum information phases on up to 70 superconducting qubits. By leveraging the interchangeability of space and time, we use a duality mapping to avoid mid-circuit measurement and access different manifestations of the underlying phases -- from entanglement scaling to measurement-induced teleportation -- in a unified way. We obtain finite-size signatures of a phase transition with a decoding protocol that correlates the experimental measurement record with classical simulation data. The phases display sharply different sensitivity to noise, which we exploit to turn an inherent hardware limitation into a useful diagnostic. Our work demonstrates an approach to realize measurement-induced physics at scales that are at the limits of current NISQ processors.
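
    The role measurement plays here can be illustrated with the textbook one-qubit teleportation protocol, written below as a small statevector simulation. This is illustrative only: the experiments in the paper use space-time-dual circuits on up to 70 qubits, not this three-qubit circuit.

```python
import numpy as np

# Textbook teleportation: a projective measurement on two qubits plus
# classical feed-forward moves an unknown state from qubit 0 to qubit 2.
rng = np.random.default_rng(0)

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.diag([1.0, -1.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])

def op(gate, site, n=3):
    """Embed a single-qubit gate on `site` into an n-qubit operator."""
    mats = [gate if k == site else I2 for k in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# Random state to teleport on qubit 0; Bell pair shared on qubits 1 and 2.
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell)

state = np.kron(CNOT, I2) @ state  # CNOT: qubit 0 controls qubit 1
state = op(H, 0) @ state           # Hadamard on qubit 0

# Projectively measure qubits 0 and 1 (the wavefunction-collapsing step).
amps = state.reshape(2, 2, 2)
probs = (np.abs(amps) ** 2).sum(axis=2).ravel()
m0, m1 = divmod(int(rng.choice(4, p=probs)), 2)
qubit2 = amps[m0, m1] / np.sqrt(probs[2 * m0 + m1])

# Feed-forward corrections conditioned on the classical outcomes.
if m1:
    qubit2 = X @ qubit2
if m0:
    qubit2 = Z @ qubit2

print("outcome:", (m0, m1), "fidelity:", abs(np.vdot(psi, qubit2)) ** 2)  # -> 1.0
```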

    Non-Abelian braiding of graph vertices in a superconducting processor

    Indistinguishability of particles is a fundamental principle of quantum mechanics. For all elementary and quasiparticles observed to date - including fermions, bosons, and Abelian anyons - this principle guarantees that the braiding of identical particles leaves the system unchanged. However, in two spatial dimensions, an intriguing possibility exists: braiding of non-Abelian anyons causes rotations in a space of topologically degenerate wavefunctions. Hence, it can change the observables of the system without violating the principle of indistinguishability. Despite the well-developed mathematical description of non-Abelian anyons and numerous theoretical proposals, the experimental observation of their exchange statistics has remained elusive for decades. Controllable many-body quantum states generated on quantum processors offer another path for exploring these fundamental phenomena. While efforts on conventional solid-state platforms typically involve Hamiltonian dynamics of quasiparticles, superconducting quantum processors allow for directly manipulating the many-body wavefunction via unitary gates. Building on predictions that stabilizer codes can host projective non-Abelian Ising anyons, we implement a generalized stabilizer code and unitary protocol to create and braid them. This allows us to experimentally verify the fusion rules of the anyons and braid them to realize their statistics. We then study the prospect of employing the anyons for quantum computation and utilize braiding to create an entangled state of anyons encoding three logical qubits. Our work provides new insights into non-Abelian braiding and - through the future inclusion of error correction to achieve topological protection - could open a path toward fault-tolerant quantum computing.
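
    The fusion rules being verified are the standard ones of the Ising anyon theory (vacuum 1, fermion psi, and the non-Abelian anyon sigma). Below is a minimal lookup-table sketch of those rules, separate from the actual device-level protocol:

```python
# Fusion rules of the Ising anyon model, encoded as a lookup table.
FUSION = {
    ("1", "1"): {"1"},
    ("1", "psi"): {"psi"},
    ("1", "sigma"): {"sigma"},
    ("psi", "psi"): {"1"},
    ("psi", "sigma"): {"sigma"},
    ("sigma", "sigma"): {"1", "psi"},  # two channels: the non-Abelian feature
}

def fuse(a, b):
    """Return the set of possible fusion outcomes of anyons a and b."""
    return FUSION.get((a, b)) or FUSION[(b, a)]

# Two sigmas have two possible fusion outcomes, giving the topologically
# degenerate wavefunctions that braiding rotates between.
print(fuse("sigma", "sigma"))  # {'1', 'psi'}
print(fuse("psi", "sigma"))    # {'sigma'}
```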