
    Grover's Quantum Search Algorithm for an Arbitrary Initial Mixed State

    The Grover quantum search algorithm is generalized to deal with an arbitrary mixed initial state. The probability of measuring a marked state as a function of time is calculated and found to depend strongly on the specific initial state; the form of the function, though, remains the same as in the case of an initial pure state. We study the role of the von Neumann entropy of the initial state and show that the entropy cannot serve as a measure of the usefulness of the algorithm. We give a few examples and show that for some extremely mixed initial states carrying high entropy, the generalized Grover algorithm is considerably faster than any classical algorithm. (Comment: 4 pages. See http://www.cs.technion.ac.il/~danken/MSc-thesis.pdf for extended discussion.)
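
    The generalized algorithm still applies the standard Grover iteration (oracle followed by inversion about the mean); only the initial density matrix changes. Below is a minimal numerical sketch, assuming plain NumPy and an illustrative 3-qubit search space: it evolves an arbitrary mixed initial state under Grover iterations and tracks the probability of measuring the marked state. The function names and the example states are ours, not the paper's.

```python
import numpy as np

def grover_success_probability(rho, marked, iterations):
    """Probability of measuring the marked state after each Grover iteration,
    starting from an arbitrary density matrix rho."""
    n = rho.shape[0]
    oracle = np.eye(n)
    oracle[marked, marked] = -1.0                     # phase-flip the marked state
    s = np.full((n, 1), 1.0 / np.sqrt(n))             # uniform superposition |s>
    diffusion = 2.0 * (s @ s.T) - np.eye(n)           # inversion about the mean
    step = diffusion @ oracle
    probs = []
    for _ in range(iterations):
        rho = step @ rho @ step.conj().T
        probs.append(float(rho[marked, marked].real))
    return probs

N, marked = 8, 5
pure = np.full((N, 1), 1.0 / np.sqrt(N)); pure = pure @ pure.T   # pure |s><s|
mixed = 0.5 * pure + 0.5 * np.eye(N) / N                         # an example mixed initial state
print(grover_success_probability(pure, marked, 2))    # approaches ~1 near the optimal iteration count
print(grover_success_probability(mixed, marked, 2))   # lower, state-dependent success probability
```

    The maximally mixed component is invariant under the Grover unitary, so its contribution to the success probability never rises above 1/N; this is one simple way to see the strong dependence on the initial state described above.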

    Private quantum decoupling and secure disposal of information

    Given a bipartite system, correlations between its subsystems can be understood as information that each one carries about the other. In order to give a model-independent description of secure information disposal, we propose the paradigm of private quantum decoupling, corresponding to locally reducing correlations in a given bipartite quantum state without transferring them to the environment. In this framework, the concept of private local randomness naturally arises as a resource, and total correlations get divided into eliminable and ineliminable ones. We prove upper and lower bounds on the amount of ineliminable correlations present in an arbitrary bipartite state, and show that, in tripartite pure states, ineliminable correlations satisfy a monogamy constraint, making apparent their quantum nature. A relation with entanglement theory is provided by showing that ineliminable correlations constitute an entanglement parameter. In the limit where infinitely many copies of the initial state are provided, we compute the regularized ineliminable correlations to be measured by the coherent information, which is thus equipped with a new operational interpretation. In particular, our results imply that two subsystems can be privately decoupled if their joint state is separable. (Comment: child of arXiv:0807.3594; v2: minor changes; v3: presentation improved, one figure added; v4: extended version with extensive discussion and examples; v5: published version.)
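
    For readers unfamiliar with the quantity mentioned above, the coherent information of a bipartite state rho_AB is I(A>B) = S(rho_B) - S(rho_AB), where S is the von Neumann entropy. The snippet below is a small illustrative sketch (NumPy only; dimensions and example states are chosen by us) that computes it: for a maximally entangled pair it equals 1 bit, while for any separable state it is non-positive, consistent with the separability statement above.

```python
import numpy as np

def von_neumann_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                      # drop numerical zeros
    return float(-np.sum(w * np.log2(w)))

def partial_trace_A(rho_ab, dA, dB):
    """Trace out subsystem A of a (dA*dB) x (dA*dB) density matrix."""
    r = rho_ab.reshape(dA, dB, dA, dB)
    return np.einsum('ijik->jk', r)

def coherent_information(rho_ab, dA, dB):
    """I(A>B) = S(rho_B) - S(rho_AB)."""
    return von_neumann_entropy(partial_trace_A(rho_ab, dA, dB)) - von_neumann_entropy(rho_ab)

# Maximally entangled two-qubit state: I(A>B) = 1 bit.
phi = np.zeros(4); phi[0] = phi[3] = 1.0 / np.sqrt(2)
print(coherent_information(np.outer(phi, phi), 2, 2))

# A separable (product) state: I(A>B) <= 0.
rho_sep = np.kron(np.diag([0.5, 0.5]), np.diag([0.5, 0.5]))
print(coherent_information(rho_sep, 2, 2))
```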

    Experimental investigation of classical and quantum correlations under decoherence

    It is well known that many operations in quantum information processing depend largely on a special kind of quantum correlation, namely entanglement. However, there are also quantum tasks that display a quantum advantage without entanglement. Distinguishing classical and quantum correlations in quantum systems is therefore of both fundamental and practical importance. Given the unavoidable interaction between correlated systems and the environment, understanding the dynamics of correlations under decoherence is of great interest. In this study, we investigate the dynamics of different kinds of bipartite correlations in an all-optical experimental setup. We show a sudden change in the decay rates of the correlations and their immunity to certain types of decoherence. Moreover, the quantum correlation is observed to be larger than the classical correlation, which disproves the earlier conjecture that classical correlation is always greater than quantum correlation. Our observations may be important for quantum information processing. (Comment: 7 pages, 4 figures, to appear in Nature Communications.)
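
    As a purely illustrative counterpart to the experiment (not a model of the optical setup), the sketch below dephases a two-qubit Bell state and tracks the total quantum mutual information I(A:B) = S(A) + S(B) - S(AB). Under full local dephasing the mutual information drops from 2 bits to 1 bit: the quantum part of the correlations is destroyed while a classical part survives, which is the kind of distinction the experiment probes. All names and parameters here are ours.

```python
import numpy as np

def entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def ptrace_B(rho):                        # keep qubit A
    return np.einsum('ijkj->ik', rho.reshape(2, 2, 2, 2))

def ptrace_A(rho):                        # keep qubit B
    return np.einsum('ijil->jl', rho.reshape(2, 2, 2, 2))

def dephase_both(rho, p):
    """Apply a phase-damping channel of strength p to each qubit."""
    K0, K1 = np.diag([1.0, np.sqrt(1 - p)]), np.diag([0.0, np.sqrt(p)])
    out = np.zeros_like(rho)
    for Ka in (K0, K1):
        for Kb in (K0, K1):
            K = np.kron(Ka, Kb)
            out += K @ rho @ K.conj().T
    return out

phi = np.zeros(4); phi[0] = phi[3] = 1.0 / np.sqrt(2)
rho0 = np.outer(phi, phi)                 # Bell state, I(A:B) = 2 bits
for p in (0.0, 0.5, 1.0):
    rho = dephase_both(rho0, p)
    mi = entropy(ptrace_B(rho)) + entropy(ptrace_A(rho)) - entropy(rho)
    print(f"p = {p:.1f}: I(A:B) = {mi:.3f} bits")
```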

    Childhood exposure due to the Chernobyl accident and thyroid cancer risk in contaminated areas of Belarus and Russia

    The thyroid dose due to ¹³¹I releases during the Chernobyl accident was reconstructed for children and adolescents in two cities and 2122 settlements in Belarus, and in one city and 607 settlements in the Bryansk district of the Russian Federation. In this area, which covers the two high contamination spots in the two countries following the accident, data on thyroid cancer incidence during the period 1991-1995 were analysed in the light of possible increased thyroid surveillance. Two methods of risk analysis were applied: Poisson regression with results for the single settlements, and Monte Carlo (MC) calculations for results in larger areas or sub-populations. Best estimates of both methods agreed well. Poisson regression estimates of 95% confidence intervals (CIs) were considerably smaller than the MC results, which allow for extra-Poisson uncertainties due to the reconstructed doses and the background thyroid cancer incidence. The excess absolute risk per unit thyroid dose (EARPD) for the birth cohort 1971-1985 by the MC analysis was 2.1 (95% CI 1.0-4.5) cases per 10^4 person-year Gy. The point estimate is lower by a factor of two than that observed in a pooled study of thyroid cancer risk after external exposures. The excess relative risk per unit thyroid dose was 23 (95% CI 8.6-82) Gy^-1. No significant differences between countries or between cities and rural areas were found. In the lowest dose group of the settlements, with an average thyroid dose of 0.05 Gy, the risk was statistically significantly elevated. Dependencies of risks on age at exposure and on gender are consistent with findings after external exposures
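
    To make the two risk coefficients above concrete, the sketch below shows how an excess absolute risk (EAR) of 2.1 cases per 10^4 person-year Gy and an excess relative risk (ERR) of 23 Gy^-1 translate into expected excess case counts. The cohort size, follow-up, mean dose and background rate are hypothetical placeholders chosen only to illustrate the arithmetic; they are not values from the study.

```python
# Point estimates quoted in the abstract.
earpd = 2.1e-4        # excess cases per person-year per Gy (2.1 per 10^4 PY Gy)
err_per_gy = 23.0     # excess relative risk per Gy

# Hypothetical inputs, purely illustrative.
persons = 10_000
years = 5
mean_dose_gy = 0.5
background_rate = 1.0e-5   # background thyroid cancer cases per person-year

person_years = persons * years
excess_abs = earpd * person_years * mean_dose_gy                           # EAR model
excess_rel = background_rate * person_years * err_per_gy * mean_dose_gy    # ERR model

print(f"Excess cases, absolute-risk model: {excess_abs:.1f}")
print(f"Excess cases, relative-risk model: {excess_rel:.1f}")
```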

    The role of input materials in shallow seismogenic slip and forearc plateau development: International Ocean Discovery Program Expedition 362 Preliminary Report Sumatra Seismogenic Zone

    Drilling the input materials of the north Sumatran subduction zone, part of the 5000 km long Sunda subduction zone system and the origin of the Mw ∼9.2 earthquake and tsunami that devastated coastal communities around the Indian Ocean in 2004, was designed to ground-truth the material properties causing unexpectedly shallow seismogenic slip and a distinctive forearc prism structure. The intriguing seismogenic behavior and forearc structure are not well explained by existing models or by relationships observed at margins where seismogenic slip typically occurs farther landward. The input materials of the north Sumatran subduction zone are a distinctively thick (as thick as 4-5 km) succession of primarily Bengal-Nicobar Fan-related sediments. The correspondence between the 2004 rupture location and the overlying prism plateau, as well as evidence for a strengthened input section, suggests that the input materials are key to driving the distinctive slip behavior and long-term forearc structure. During Expedition 362, two sites on the Indian oceanic plate ∼250 km southwest of the subduction zone, Sites U1480 and U1481, were drilled, cored, and logged to a maximum depth of 1500 meters below seafloor. The succession of sediments and rocks that will develop into the plate boundary detachment and drive growth of the forearc was sampled, and its progressive mechanical, frictional, and hydrogeological property evolution will be analyzed through post-cruise experimental and modeling studies. Large penetration depths with good core recovery and successful wireline logging in the challenging submarine fan materials will enable evaluation of the role of thick sedimentary subduction zone input sections in driving shallow slip and amplifying earthquake and tsunami magnitudes, at the Sunda subduction zone and globally at other subduction zones where submarine fan-influenced sections are being subducted

    Assistive technologies to address capabilities of people with dementia: from research to practice

    Assistive technologies (AT) have become pervasive and are now present in virtually all domains of our lives. They can be either an enabler or an obstacle leading to social exclusion. The Fondation Médéric Alzheimer gathered international experts in dementia care, with backgrounds in the biomedical, human and social sciences, to analyse how AT can address the capabilities of people with dementia on the basis of their needs. The discussion covered the unmet needs of people with dementia, the domains of daily life activities where AT can provide help, the enabling and empowering impact of technology on their safety and wellbeing, barriers and limits of use, technology assessment, and ethical and legal issues. The capability approach (possible freedom) appears particularly relevant in person-centered dementia care and technology development. The focus is not on the solution but rather on what the person can do with it: seeing dementia as a disability, with technology as an enabler to promote the capabilities of the person, provides a useful framework for both research and practice. This article summarizes how these concepts gained momentum in professional practice and public policies over the past fifteen years (2000-2015), discusses current issues in the design, development and economic model of AT for people with dementia, and covers how these technologies are being used and assessed

    Large scale variation in the rate of germ-line de novo mutation, base composition, divergence and diversity in humans

    It has long been suspected that the rate of mutation varies across the human genome at a large scale, based on the divergence between humans and other species. However, it is now possible to investigate this question directly using the large number of de novo mutations (DNMs) that have been discovered in humans through the sequencing of trios. We investigate a number of questions pertaining to the distribution of mutations using more than 130,000 DNMs from three large datasets. We demonstrate that the amount and pattern of variation differ between datasets at the 1 Mb and 100 kb scales, probably as a consequence of differences in sequencing technology and processing. In particular, datasets show different patterns of correlation to genomic variables such as replication time. Nevertheless, there are many commonalities between datasets, which likely represent true patterns. We show that there is variation in the mutation rate at the 100 kb, 1 Mb and 10 Mb scales that cannot be explained by variation at smaller scales; however, the level of this variation is modest at large scales: at the 1 Mb scale we infer that ~90% of regions have a mutation rate within 50% of the mean. Different types of mutation show similar levels of variation and appear to vary in concert, which suggests the pattern of mutation is relatively constant across the genome. We demonstrate that variation in the mutation rate does not generate large-scale variation in GC-content, and hence that mutation bias does not maintain the isochore structure of the human genome. We find that genomic features explain less than 40% of the explainable variance in the rate of DNM. As expected, the rate of divergence between species is correlated to the rate of DNM. However, the correlations are weaker than expected if all the variation in divergence were due to variation in the mutation rate. We provide evidence that this is due to the effect of biased gene conversion on the probability that a mutation will become fixed. In contrast to divergence, we find that most of the variation in diversity can be explained by variation in the mutation rate. Finally, we show that the correlation between divergence and DNM density declines as increasingly divergent species are considered
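
    The 1 Mb-scale analysis described above amounts to binning DNMs into windows and comparing each window's rate to the genome-wide mean. The sketch below reproduces that logic on simulated counts (Gamma-Poisson overdispersion with arbitrary parameters) rather than real trio data, and also shows the kind of window-level correlation with a divergence track that the abstract discusses.

```python
import numpy as np

rng = np.random.default_rng(0)
n_windows = 1000                               # stand-in for 1 Mb windows

# Simulated per-window relative mutation rates with modest overdispersion.
true_rates = rng.gamma(shape=20.0, scale=1.0 / 20.0, size=n_windows)   # mean ~1
dnm_counts = rng.poisson(lam=100 * true_rates)                          # ~100 DNMs per window

# Fraction of windows whose observed rate lies within 50% of the mean.
rel_rates = dnm_counts / dnm_counts.mean()
within_50pct = float(np.mean(np.abs(rel_rates - 1.0) <= 0.5))
print(f"Windows within 50% of the mean rate: {within_50pct:.2f}")

# Correlation of DNM density with a (simulated) divergence track.
divergence = 0.01 * true_rates + rng.normal(0.0, 0.002, n_windows)
r = np.corrcoef(dnm_counts, divergence)[0, 1]
print(f"Pearson r(DNM count, divergence): {r:.2f}")
```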

    Distinct Functional Constraints Partition Sequence Conservation in a cis-Regulatory Element

    Different functional constraints contribute to different evolutionary rates across genomes. To understand why some sequences evolve faster than others in a single cis-regulatory locus, we investigated function and evolutionary dynamics of the promoter of the Caenorhabditis elegans unc-47 gene. We found that this promoter consists of two distinct domains. The proximal promoter is conserved and is largely sufficient to direct appropriate spatial expression. The distal promoter displays little if any conservation between several closely related nematodes. Despite this divergence, sequences from all species confer robustness of expression, arguing that this function does not require substantial sequence conservation. We showed that even unrelated sequences have the ability to promote robust expression. A prominent feature shared by all of these robustness-promoting sequences is an AT-enriched nucleotide composition consistent with nucleosome depletion. Because general sequence composition can be maintained despite sequence turnover, our results explain how different functional constraints can lead to vastly disparate rates of sequence divergence within a promoter
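
    The compositional signal described above is straightforward to quantify as a sliding-window AT fraction along a promoter sequence. The sketch below shows the calculation; the sequence, window size and step are placeholders, not the actual unc-47 promoter.

```python
def at_content(seq, window=50, step=10):
    """Sliding-window AT fraction along a DNA sequence."""
    seq = seq.upper()
    fractions = []
    for start in range(0, max(len(seq) - window, 0) + 1, step):
        win = seq[start:start + window]
        fractions.append((start, (win.count('A') + win.count('T')) / len(win)))
    return fractions

# Placeholder sequence, not the unc-47 promoter.
promoter = "ATATTTAAATTTGCGCATATATTTTAAATATGCCGATATATTTATTTAAAT" * 4
for start, frac in at_content(promoter)[:5]:
    print(f"window starting at {start}: AT fraction = {frac:.2f}")
```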

    MR fluoroscopy in vascular and cardiac interventions (review)

    Vascular and cardiac disease remains a leading cause of morbidity and mortality in developed and emerging countries. Vascular and cardiac interventions require extensive fluoroscopic guidance to navigate endovascular catheters. X-ray fluoroscopy is the current standard modality for real-time imaging. It provides excellent spatial and temporal resolution, but is limited by the exposure of patients and staff to ionizing radiation, poor soft-tissue characterization and a lack of quantitative physiologic information. MR fluoroscopy has been introduced and has made substantial progress during the last decade. Clinical and experimental studies performed under MR fluoroscopy have indicated the suitability of this modality for the delivery of ASD closure devices, aortic valves, and endovascular stents (aortic, carotid, iliac and renal arteries, inferior vena cava). It also aids in performing ablations, creating hepatic shunts and delivering local therapies. The development of more MR-compatible equipment and devices will widen the applications of MR-guided procedures. After the intervention, MR imaging aids in assessing the efficacy of therapies and the success of interventions. It also provides information on vascular flow and on cardiac morphology, function, perfusion and viability. MR fluoroscopy has the potential to form the basis for minimally invasive image-guided surgeries that offer improved patient management and cost effectiveness