
    Grover's Quantum Search Algorithm for an Arbitrary Initial Mixed State

    The Grover quantum search algorithm is generalized to deal with an arbitrary mixed initial state. The probability of measuring a marked state as a function of time is calculated and found to depend strongly on the specific initial state. The form of this function, though, remains as in the case of an initial pure state. We study the role of the von Neumann entropy of the initial state and show that the entropy cannot serve as a measure of the usefulness of the algorithm. We give a few examples and show that for some extremely mixed initial states carrying high entropy, the generalized Grover algorithm is considerably faster than any classical algorithm. Comment: 4 pages. See http://www.cs.technion.ac.il/~danken/MSc-thesis.pdf for an extended discussion.
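    As a rough numerical illustration of the setup described above (a sketch of my own, not code from the paper), the snippet below applies Grover iterations to a density matrix and compares a pure uniform initial state with a maximally mixed one; the search-space size and the marked index are arbitrary choices.

```python
import numpy as np

# Grover iterations applied to a density matrix (mixed initial state).
# N and the marked index are illustrative, not taken from the paper.
N = 64                                   # size of the search space
marked = 7                               # index of the marked item

m = np.zeros(N)
m[marked] = 1.0                          # |m>, the marked basis state
s = np.ones(N) / np.sqrt(N)              # |s>, the uniform superposition
oracle = np.eye(N) - 2.0 * np.outer(m, m)      # O = I - 2|m><m|
diffusion = 2.0 * np.outer(s, s) - np.eye(N)   # D = 2|s><s| - I
G = diffusion @ oracle                         # one Grover iteration

def success_probabilities(rho, steps):
    """P(measuring |m>) after 0, 1, ..., steps Grover iterations on rho."""
    probs = []
    for _ in range(steps + 1):
        probs.append(rho[marked, marked].real)
        rho = G @ rho @ G.conj().T
    return probs

rho_pure = np.outer(s, s)                   # pure uniform initial state
rho_mixed = np.eye(N) / N                   # maximally mixed initial state
k = int(np.round(np.pi / 4 * np.sqrt(N)))   # usual optimal iteration count

print(success_probabilities(rho_pure, k)[-1])    # ~1 for the pure state
print(success_probabilities(rho_mixed, k)[-1])   # stays at 1/N for the maximally mixed state
```

    The maximally mixed state is invariant under the (unitary) Grover operator, so its success probability never rises above 1/N, whereas the pure uniform superposition reaches a near-unit probability after about (pi/4)*sqrt(N) iterations; this contrast is one simple instance of the strong dependence on the initial state noted in the abstract.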

    Experimental investigation of classical and quantum correlations under decoherence

    It is well known that many operations in quantum information processing depend largely on a special kind of quantum correlation, namely entanglement. However, there are also quantum tasks that display a quantum advantage without entanglement. Distinguishing classical and quantum correlations in quantum systems is therefore of both fundamental and practical importance. Given the unavoidable interaction between correlated systems and their environment, the dynamics of correlations is of great interest. In this study, we investigate the dynamics of different kinds of bipartite correlations in an all-optical experimental setup. A sudden change in the decay rates of the correlations and their immunity against certain kinds of decoherence are shown. Moreover, the quantum correlation is observed to be larger than the classical correlation, which disproves the earlier conjecture that classical correlation is always greater than quantum correlation. Our observations may be important for quantum information processing. Comment: 7 pages, 4 figures, to appear in Nature Communications.
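    To make the split between the classical and quantum parts of the correlations concrete, here is a small sketch (my illustration, not the experimental analysis of the paper) based on the closed-form expressions known for Bell-diagonal two-qubit states, the family usually prepared in this kind of experiment; the correlation parameters c1, c2, c3 used in the example are arbitrary.

```python
import numpy as np

def _plogp(p):
    """p * log2(p) with the convention 0 * log 0 = 0."""
    return p * np.log2(p) if p > 0 else 0.0

def bell_diagonal_correlations(c1, c2, c3):
    """Total (mutual information), classical and quantum (discord) correlations
    of the Bell-diagonal state rho = (I + c1 XX + c2 YY + c3 ZZ) / 4,
    using the closed-form results known for this family of states."""
    # Eigenvalues of rho, i.e. its weights on the four Bell states
    lam = [(1 + c1 - c2 + c3) / 4, (1 - c1 + c2 + c3) / 4,
           (1 + c1 + c2 - c3) / 4, (1 - c1 - c2 - c3) / 4]
    mutual_info = 2.0 + sum(_plogp(p) for p in lam)   # I(rho) = 2 - S(rho)
    c = max(abs(c1), abs(c2), abs(c3))                # optimal measurement axis
    classical = 0.5 * (_plogp(1 - c) + _plogp(1 + c))
    return mutual_info, classical, mutual_info - classical

# Example state (arbitrary but physical parameters)
total, classical, discord = bell_diagonal_correlations(0.9, -0.9, 0.9)
print(total, classical, discord)   # for this state the quantum part exceeds the classical part
```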

    The elusive MAESTRO gene: Its human reproductive tissue-specific expression pattern


    The role of input materials in shallow seismogenic slip and forearc plateau development: International Ocean Discovery Program Expedition 362 Preliminary Report Sumatra Seismogenic Zone

    Drilling the input materials of the north Sumatran subduction zone, part of the 5000 km long Sunda subduction zone system and the origin of the Mw ∼9.2 earthquake and tsunami that devastated coastal communities around the Indian Ocean in 2004, was designed to ground-truth the material properties causing unexpectedly shallow seismogenic slip and a distinctive forearc prism structure. The intriguing seismogenic behavior and forearc structure are not well explained by existing models or by relationships observed at margins where seismogenic slip typically occurs farther landward. The input materials of the north Sumatran subduction zone are a distinctively thick (as thick as 4-5 km) succession of primarily Bengal-Nicobar Fan-related sediments. The correspondence between the 2004 rupture location and the overlying prism plateau, as well as evidence for a strengthened input section, suggests the input materials are key to driving the distinctive slip behavior and long-term forearc structure. During Expedition 362, two sites on the Indian oceanic plate ∼250 km southwest of the subduction zone, Sites U1480 and U1481, were drilled, cored, and logged to a maximum depth of 1500 meters below seafloor. The succession of sediments and rocks that will develop into the plate boundary detachment and drive growth of the forearc was sampled, and its progressive mechanical, frictional, and hydrogeological property evolution will be analyzed through postcruise experimental and modeling studies. Large penetration depths with good core recovery and successful wireline logging in the challenging submarine fan materials will enable evaluation of the role of thick sedimentary subduction zone input sections in driving shallow slip and amplifying earthquake and tsunami magnitudes, at the Sunda subduction zone and globally at other subduction zones where submarine fan-influenced sections are being subducted.

    Recommendations for the use of Serious Games in people with Alzheimer's Disease, related disorders and frailty.

    Alzheimer's disease and related disorders (ADRD) represent a major challenge for health care systems within the aging population. It is therefore important to develop better instruments to assess disease severity and progression, as well as to improve its treatment, stimulation, and rehabilitation. This is the underlying idea behind the development of Serious Games (SG): digital applications specifically adapted for purposes other than entertainment, such as rehabilitation, training, and education. Recently, there has been increasing interest in the use of SG targeting patients with ADRD. However, this field is largely uncharted, and the clinical, ethical, economic, and research impact of employing SG in these target populations has never been systematically addressed. The aim of this paper is to systematically analyze the Strengths, Weaknesses, Opportunities, and Threats (SWOT) of employing SG with patients with ADRD in order to provide practical recommendations for the development and use of SG in these populations. These analyses and recommendations were gathered, commented on, and validated during a 2-round workshop at the 2013 Clinical Trials on Alzheimer's Disease (CTAD) conference, and were endorsed by stakeholders in the field. The results revealed that SG may offer very useful tools for professionals involved in the care of patients with ADRD. However, more interdisciplinary work is needed in order to create SG specifically targeting these populations. Furthermore, in order to gain more academic and professional credibility and acceptance, it will be necessary to invest more in research on efficacy and feasibility. Finally, the emerging ethical challenges should be considered a priority.

    Distinct Functional Constraints Partition Sequence Conservation in a cis-Regulatory Element

    Different functional constraints contribute to different evolutionary rates across genomes. To understand why some sequences evolve faster than others within a single cis-regulatory locus, we investigated the function and evolutionary dynamics of the promoter of the Caenorhabditis elegans unc-47 gene. We found that this promoter consists of two distinct domains. The proximal promoter is conserved and is largely sufficient to direct appropriate spatial expression. The distal promoter displays little, if any, conservation between several closely related nematodes. Despite this divergence, sequences from all species confer robustness of expression, arguing that this function does not require substantial sequence conservation. We showed that even unrelated sequences can promote robust expression. A prominent feature shared by all of these robustness-promoting sequences is an AT-enriched nucleotide composition consistent with nucleosome depletion. Because general sequence composition can be maintained despite sequence turnover, our results explain how different functional constraints can lead to vastly disparate rates of sequence divergence within a promoter.
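    As a small illustration of the compositional signature mentioned above (my sketch, not the authors' analysis), AT enrichment along a promoter can be profiled with a sliding window; the window and step sizes here are arbitrary.

```python
def at_content_windows(seq, window=100, step=10):
    """AT fraction in sliding windows along a DNA sequence."""
    seq = seq.upper()
    return [sum(base in "AT" for base in seq[i:i + window]) / window
            for i in range(0, len(seq) - window + 1, step)]

# Synthetic example: an AT-rich stretch followed by a GC-rich stretch
print(at_content_windows("AT" * 200 + "GC" * 200, window=100, step=100))
```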

    Large scale variation in the rate of germ-line de novo mutation, base composition, divergence and diversity in humans

    It has long been suspected that the rate of mutation varies across the human genome at a large scale, based on the divergence between humans and other species. However, it is now possible to investigate this question directly using the large number of de novo mutations (DNMs) that have been discovered in humans through the sequencing of trios. We investigate a number of questions pertaining to the distribution of mutations using more than 130,000 DNMs from three large datasets. We demonstrate that the amount and pattern of variation differs between datasets at the 1MB and 100KB scales, probably as a consequence of differences in sequencing technology and processing. In particular, datasets show different patterns of correlation to genomic variables such as replication time. Nevertheless, there are many commonalities between datasets, which likely represent true patterns. We show that there is variation in the mutation rate at the 100KB, 1MB and 10MB scales that cannot be explained by variation at smaller scales; however, the level of this variation is modest at large scales: at the 1MB scale we infer that ~90% of regions have a mutation rate within 50% of the mean. Different types of mutation show similar levels of variation and appear to vary in concert, which suggests the pattern of mutation is relatively constant across the genome. We demonstrate that variation in the mutation rate does not generate large-scale variation in GC-content, and hence that mutation bias does not maintain the isochore structure of the human genome. We find that genomic features explain less than 40% of the explainable variance in the rate of DNM. As expected, the rate of divergence between species is correlated to the rate of DNM. However, the correlations are weaker than expected if all the variation in divergence were due to variation in the mutation rate. We provide evidence that this is due to the effect of biased gene conversion on the probability that a mutation will become fixed. In contrast to divergence, we find that most of the variation in diversity can be explained by variation in the mutation rate. Finally, we show that the correlation between divergence and DNM density declines as increasingly divergent species are considered.
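    To make the windowed-rate statement concrete, here is a sketch (illustrative only; the study additionally models sampling noise and genomic covariates) of binning DNM positions into 1 Mb windows and asking what fraction of windows have a rate within 50% of the mean; the chromosome length and simulated positions are made up.

```python
import numpy as np

def window_rates(dnm_positions, chrom_length, window=1_000_000):
    """Crude per-window mutation rate: DNM count per base in fixed-size windows."""
    edges = np.arange(0, chrom_length + window, window)
    counts, _ = np.histogram(dnm_positions, bins=edges)
    return counts / window

def fraction_within(rates, tolerance=0.5):
    """Fraction of windows whose rate lies within `tolerance` of the mean rate."""
    mean = rates.mean()
    return float(np.mean(np.abs(rates - mean) <= tolerance * mean))

# Toy example: simulated DNM positions on a 50 Mb chromosome
rng = np.random.default_rng(0)
positions = rng.integers(0, 50_000_000, size=2_000)
print(fraction_within(window_rates(positions, 50_000_000)))
```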

    Childhood exposure due to the Chernobyl accident and thyroid cancer risk in contaminated areas of Belarus and Russia

    The thyroid dose due to ¹³¹I releases during the Chernobyl accident was reconstructed for children and adolescents in two cities and 2122 settlements in Belarus, and in one city and 607 settlements in the Bryansk district of the Russian Federation. In this area, which covers the two high-contamination spots in the two countries following the accident, data on thyroid cancer incidence during the period 1991-1995 were analysed in the light of possible increased thyroid surveillance. Two methods of risk analysis were applied: Poisson regression with results for the single settlements, and Monte Carlo (MC) calculations for results in larger areas or sub-populations. Best estimates of both methods agreed well. Poisson regression estimates of 95% confidence intervals (CIs) were considerably smaller than the MC results, which allow for extra-Poisson uncertainties due to the reconstructed doses and the background thyroid cancer incidence. The excess absolute risk per unit thyroid dose (EARPD) for the birth cohort 1971-1985 by the MC analysis was 2.1 (95% CI 1.0-4.5) cases per 10⁴ person-year Gy. The point estimate is lower by a factor of two than that observed in a pooled study of thyroid cancer risk after external exposures. The excess relative risk per unit thyroid dose was 23 (95% CI 8.6-82) Gy⁻¹. No significant differences between countries, or between cities and rural areas, were found. In the lowest dose group of settlements, with an average thyroid dose of 0.05 Gy, the risk was statistically significantly elevated. Dependencies of risk on age at exposure and on gender are consistent with findings after external exposures.
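    As a back-of-envelope illustration of what the reported excess absolute risk per unit dose means (only the EARPD point estimate is taken from the abstract; the person-years, mean dose and background rate below are invented):

```python
# Simple additive excess-absolute-risk model:
#   expected cases = person-years * (background rate + EARPD * dose)
EARPD = 2.1e-4   # cases per person-year per Gy (2.1 per 10^4 person-year Gy, the point estimate above)

def expected_cases(person_years, mean_dose_gy, background_rate_per_py):
    return person_years * (background_rate_per_py + EARPD * mean_dose_gy)

print(expected_cases(100_000, 0.0, 1e-5))   # ~1 case expected without exposure
print(expected_cases(100_000, 0.5, 1e-5))   # ~11.5 cases, i.e. ~10.5 excess cases at 0.5 Gy
```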