119 research outputs found

    Grover's Quantum Search Algorithm for an Arbitrary Initial Mixed State

    The Grover quantum search algorithm is generalized to deal with an arbitrary mixed initial state. The probability of measuring a marked state as a function of time is calculated and found to depend strongly on the specific initial state; the form of the function, though, remains the same as in the case of an initial pure state. We study the role of the von Neumann entropy of the initial state, and show that the entropy cannot serve as a measure of the usefulness of the algorithm. We give a few examples and show that for some extremely mixed initial states carrying high entropy, the generalized Grover algorithm is considerably faster than any classical algorithm. (Comment: 4 pages. See http://www.cs.technion.ac.il/~danken/MSc-thesis.pdf for an extended discussion.)
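    As an illustration of the setting (not the authors' construction), the sketch below runs the standard Grover iteration on a density matrix rather than a state vector and reads off the probability of the marked state after each step. The particular mixed initial state is a hypothetical mixture chosen only for demonstration.

```python
import numpy as np

def grover_operator(n_qubits, marked):
    """Build the Grover iterate G = D * O for a single marked basis state."""
    N = 2 ** n_qubits
    # Oracle: flips the phase of the marked state.
    O = np.eye(N)
    O[marked, marked] = -1
    # Diffusion: inversion about the uniform superposition |s>.
    s = np.full((N, 1), 1 / np.sqrt(N))
    D = 2 * (s @ s.T) - np.eye(N)
    return D @ O

def success_probability(rho, marked, steps, G):
    """P(measure the marked state) after each Grover iteration of density matrix rho."""
    probs = []
    for _ in range(steps):
        rho = G @ rho @ G.conj().T
        probs.append(rho[marked, marked].real)
    return probs

n, marked = 4, 7
N = 2 ** n
G = grover_operator(n, marked)

# Pure uniform superposition vs. an illustrative (hypothetical) mixed initial state.
psi = np.full((N, 1), 1 / np.sqrt(N))
rho_pure = psi @ psi.T
rho_mixed = 0.9 * np.eye(N) / N + 0.1 * rho_pure  # illustrative mixture only

print(success_probability(rho_pure, marked, 3, G))
print(success_probability(rho_mixed, marked, 3, G))
```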

    Experimental investigation of classical and quantum correlations under decoherence

    It is well known that many operations in quantum information processing depend largely on a special kind of quantum correlation, namely entanglement. However, there are also quantum tasks that display a quantum advantage without entanglement. Distinguishing classical and quantum correlations in quantum systems is therefore of both fundamental and practical importance. Given the unavoidable interaction between correlated systems and the environment, understanding the dynamics of correlations is of great interest. In this study, we investigate the dynamics of different kinds of bipartite correlations in an all-optical experimental setup. A sudden change in the decay rates of the correlations and their immunity to certain kinds of decoherence are shown. Moreover, quantum correlation is observed to be larger than classical correlation, which disproves the earlier conjecture that classical correlation is always greater than quantum correlation. Our observations may be important for quantum information processing. (Comment: 7 pages, 4 figures, to appear in Nature Communications.)
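    For a rough picture of how correlations decay under decoherence, the sketch below evolves a Bell state under a local phase-flip channel and tracks the quantum mutual information (total correlations). It does not reproduce the experiment, and it does not split correlations into classical and quantum parts as the paper does; the channel and initial state are illustrative choices only.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Entropy in bits of a density matrix."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

def partial_trace(rho, keep):
    """Reduced density matrix of one qubit of a two-qubit state (keep = 0 or 1)."""
    rho4 = rho.reshape(2, 2, 2, 2)
    return np.trace(rho4, axis1=1, axis2=3) if keep == 0 else np.trace(rho4, axis1=0, axis2=2)

def mutual_information(rho):
    """Total correlations I(A:B) = S(A) + S(B) - S(AB)."""
    return (von_neumann_entropy(partial_trace(rho, 0))
            + von_neumann_entropy(partial_trace(rho, 1))
            - von_neumann_entropy(rho))

def dephase_both(rho, p):
    """Apply a local phase-flip (dephasing) channel of strength p to each qubit."""
    Z = np.diag([1.0, -1.0])
    I = np.eye(2)
    K0 = np.sqrt(1 - p / 2) * I
    K1 = np.sqrt(p / 2) * Z
    out = np.zeros_like(rho)
    for A in (K0, K1):
        for B in (K0, K1):
            K = np.kron(A, B)
            out = out + K @ rho @ K.conj().T
    return out

# Start from the Bell state |Phi+> and watch total correlations decay with p.
phi = np.zeros((4, 1)); phi[0] = phi[3] = 1 / np.sqrt(2)
rho0 = phi @ phi.T
for p in (0.0, 0.3, 0.6, 0.9):
    print(p, round(mutual_information(dephase_both(rho0, p)), 4))
```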

    Large scale variation in the rate of germ-line de novo mutation, base composition, divergence and diversity in humans

    It has long been suspected, based on the divergence between humans and other species, that the rate of mutation varies across the human genome at a large scale. However, it is now possible to investigate this question directly using the large number of de novo mutations (DNMs) that have been discovered in humans through the sequencing of trios. We investigate a number of questions pertaining to the distribution of mutations using more than 130,000 DNMs from three large datasets. We demonstrate that the amount and pattern of variation differs between datasets at the 1 Mb and 100 kb scales, probably as a consequence of differences in sequencing technology and processing. In particular, datasets show different patterns of correlation with genomic variables such as replication time. Nevertheless, there are many commonalities between datasets, which likely represent true patterns. We show that there is variation in the mutation rate at the 100 kb, 1 Mb and 10 Mb scales that cannot be explained by variation at smaller scales; however, the level of this variation is modest at large scales: at the 1 Mb scale we infer that ~90% of regions have a mutation rate within 50% of the mean. Different types of mutation show similar levels of variation and appear to vary in concert, which suggests the pattern of mutation is relatively constant across the genome. We demonstrate that variation in the mutation rate does not generate large-scale variation in GC-content, and hence that mutation bias does not maintain the isochore structure of the human genome. We find that genomic features explain less than 40% of the explainable variance in the rate of DNM. As expected, the rate of divergence between species is correlated with the rate of DNM. However, the correlations are weaker than would be expected if all the variation in divergence were due to variation in the mutation rate. We provide evidence that this is due to the effect of biased gene conversion on the probability that a mutation will become fixed. In contrast to divergence, we find that most of the variation in diversity can be explained by variation in the mutation rate. Finally, we show that the correlation between divergence and DNM density declines as increasingly divergent species are considered.
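    One simple way to detect rate variation beyond sampling noise, in the spirit of (but not identical to) the analysis above, is to bin mutations into fixed-size windows and compare the variance of the window counts with the Poisson expectation. The sketch below does this on synthetic positions; the window size, chromosome length and per-window rate model are illustrative assumptions.

```python
import numpy as np

def window_counts(positions, chrom_length, window=1_000_000):
    """Count de novo mutations per fixed-size window along one chromosome."""
    edges = np.arange(0, chrom_length + window, window)
    counts, _ = np.histogram(positions, bins=edges)
    return counts

def overdispersion(counts):
    """Ratio of observed variance to the Poisson expectation (the mean).

    A ratio well above 1 indicates rate variation among windows beyond
    what sampling noise alone would produce."""
    return counts.var(ddof=1) / counts.mean()

# Illustrative data only: a uniform rate vs. window-to-window variable rates.
rng = np.random.default_rng(0)
L = 50_000_000
uniform_dnms = rng.integers(0, L, size=5_000)
rates = rng.gamma(shape=4, scale=1.0, size=L // 1_000_000)  # hypothetical per-window rates
variable_dnms = np.concatenate([
    rng.integers(i * 1_000_000, (i + 1) * 1_000_000,
                 size=rng.poisson(100 * r / rates.mean()))
    for i, r in enumerate(rates)
])

print(overdispersion(window_counts(uniform_dnms, L)))   # close to 1
print(overdispersion(window_counts(variable_dnms, L)))  # well above 1
```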

    Gut CD4+ T cell phenotypes are a continuum molded by microbes, not by TH archetypes

    CD4+ effector lymphocytes (Teff) are traditionally classified by the cytokines they produce. To determine the states that Teff cells actually adopt in frontline tissues in vivo, we applied single-cell transcriptome and chromatin analyses to colonic Teff cells in germ-free or conventional mice or in mice after challenge with a range of phenotypically biasing microbes. Unexpected subsets were marked by the expression of the interferon (IFN) signature or myeloid-specific transcripts, but transcriptome or chromatin structure could not resolve discrete clusters fitting classic helper T cell (TH) subsets. At baseline or at different times of infection, transcripts encoding cytokines or proteins commonly used as TH markers were distributed in a polarized continuum, which was functionally validated. Clones derived from single progenitors gave rise to both IFN-γ- and interleukin (IL)-17-producing cells. Most of the transcriptional variance was tied to the infecting agent, independent of the cytokines produced, and chromatin variance primarily reflected activities of activator protein (AP)-1 and IFN-regulatory factor (IRF) transcription factor (TF) families, not the canonical subset master regulators T-bet, GATA3 or RORγ.

    Distinct Functional Constraints Partition Sequence Conservation in a cis-Regulatory Element

    Different functional constraints contribute to different evolutionary rates across genomes. To understand why some sequences evolve faster than others in a single cis-regulatory locus, we investigated the function and evolutionary dynamics of the promoter of the Caenorhabditis elegans unc-47 gene. We found that this promoter consists of two distinct domains. The proximal promoter is conserved and is largely sufficient to direct appropriate spatial expression. The distal promoter displays little if any conservation between several closely related nematodes. Despite this divergence, sequences from all species confer robustness of expression, arguing that this function does not require substantial sequence conservation. We showed that even unrelated sequences have the ability to promote robust expression. A prominent feature shared by all of these robustness-promoting sequences is an AT-enriched nucleotide composition consistent with nucleosome depletion. Because general sequence composition can be maintained despite sequence turnover, our results explain how different functional constraints can lead to vastly disparate rates of sequence divergence within a promoter.
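    The AT-enrichment signal described above can be visualized with a simple sliding-window composition profile. The sketch below is an illustrative calculation on a toy sequence, not the analysis used in the study.

```python
def at_fraction(seq, window=100, step=10):
    """AT fraction in sliding windows across a DNA sequence (A/C/G/T)."""
    seq = seq.upper()
    fractions = []
    for start in range(0, len(seq) - window + 1, step):
        win = seq[start:start + window]
        fractions.append((win.count('A') + win.count('T')) / window)
    return fractions

# Toy sequence only: an AT-rich stretch embedded in GC-richer flanks.
promoter = "GC" * 100 + "AT" * 150 + "GC" * 100
profile = at_fraction(promoter)
print(max(profile), min(profile))  # the AT-rich block stands out in the profile
```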

    MR fluoroscopy in vascular and cardiac interventions (review)

    Vascular and cardiac disease remains a leading cause of morbidity and mortality in developed and emerging countries. Vascular and cardiac interventions require extensive fluoroscopic guidance to navigate endovascular catheters. X-ray fluoroscopy is considered the current modality for real-time imaging. It provides excellent spatial and temporal resolution, but is limited by the exposure of patients and staff to ionizing radiation, poor soft-tissue characterization and a lack of quantitative physiologic information. MR fluoroscopy has been introduced and has made substantial progress during the last decade. Clinical and experimental studies performed under MR fluoroscopy have indicated the suitability of this modality for the delivery of ASD closure devices, aortic valves, and endovascular stents (aortic, carotid, iliac and renal arteries, inferior vena cava). It also aids in performing ablation, creating hepatic shunts and delivering local therapies. The development of more MR-compatible equipment and devices will widen the applications of MR-guided procedures. After intervention, MR imaging aids in assessing the efficacy of therapies and the success of interventions. It also provides information on vascular flow and on cardiac morphology, function, perfusion and viability. MR fluoroscopy has the potential to form the basis for minimally invasive image-guided surgeries that offer improved patient management and cost effectiveness.

    The Fukushima Daiichi Accident

    The Fukushima Daiichi Accident consists of a Report by the IAEA Director General and five technical volumes. It is the result of an extensive international collaborative effort involving five working groups with about 180 experts from 42 Member States (with and without nuclear power programmes) and several international bodies. It provides a description of the accident and its causes, evolution and consequences, based on the evaluation of data and information from a large number of sources available at the time of writing. The set contains six printed parts and five supplementary CD-ROMs. Contents: Report by the Director General; Technical Volume 1/5, Description and Context of the Accident; Technical Volume 2/5, Safety Assessment; Technical Volume 3/5, Emergency Preparedness and Response; Technical Volume 4/5, Radiological Consequences; Technical Volume 5/5, Post-accident Recovery; Annexes. The JRC contributed to Volumes 1, 2 and 3, which are attached. (JRC.F.5 - Nuclear Reactor Safety Assessment)

    Data Descriptor: A global multiproxy database for temperature reconstructions of the Common Era

    Reproducible climate reconstructions of the Common Era (1 CE to present) are key to placing industrial-era warming into the context of natural climatic variability. Here we present a community-sourced database of temperature-sensitive proxy records from the PAGES2k initiative. The database gathers 692 records from 648 locations, including all continental regions and major ocean basins. The records are from trees, ice, sediment, corals, speleothems, documentary evidence, and other archives. They range in length from 50 to 2000 years, with a median of 547 years, while temporal resolution ranges from biweekly to centennial. Nearly half of the proxy time series are significantly correlated with HadCRUT4.2 surface temperature over the period 1850-2014. Global temperature composites show a remarkable degree of coherence between high- and low-resolution archives, with broadly similar patterns across archive types, terrestrial versus marine locations, and screening criteria. The database is suited to investigations of global and regional temperature variability over the Common Era, and is shared in the Linked Paleo Data (LiPD) format, including serializations in Matlab, R and Python.

    Since the pioneering work of D'Arrigo and Jacoby [1-3], as well as Mann et al. [4,5], temperature reconstructions of the Common Era have become a key component of climate assessments [6-9]. Such reconstructions depend strongly on the composition of the underlying network of climate proxies [10], and it is therefore critical for the climate community to have access to a community-vetted, quality-controlled database of temperature-sensitive records stored in a self-describing format. The Past Global Changes (PAGES) 2k consortium, a self-organized, international group of experts, recently assembled such a database and used it to reconstruct surface temperature over continental-scale regions [11] (hereafter, 'PAGES2k-2013').

    This data descriptor presents version 2.0.0 of the PAGES2k proxy temperature database (Data Citation 1). It augments the PAGES2k-2013 collection of terrestrial records with marine records assembled by the Ocean2k working group at centennial [12] and annual [13] time scales. In addition to these previously published data compilations, this version includes substantially more records, extensive new metadata, and validation. Furthermore, the selection criteria for records included in this version are applied more uniformly and transparently across regions, resulting in a more cohesive data product.

    This data descriptor describes the contents of the database and the criteria for inclusion, and quantifies the relation of each record to instrumental temperature. In addition, the paleotemperature time series are summarized as composites to highlight the most salient decadal- to centennial-scale behaviour of the dataset and to check mutual consistency between paleoclimate archives. We provide extensive Matlab code to probe the database, processing, filtering and aggregating it in various ways to investigate temperature variability over the Common Era. The unique approach to data stewardship and code-sharing employed here is designed to enable an unprecedented scale of investigation of the temperature history of the Common Era, by the scientific community and citizen-scientists alike.
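    The proxy screening mentioned above (correlating each record with instrumental temperature over 1850-2014) can be illustrated with a short calculation. The sketch below uses synthetic year-indexed series standing in for a proxy record and a HadCRUT grid-cell series; it is not the PAGES2k screening code.

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

def screen_proxy(proxy: pd.Series, instrumental: pd.Series, start=1850, end=2014):
    """Pearson correlation between an annually resolved proxy record and an
    instrumental temperature series over their overlapping years."""
    common = (proxy.loc[start:end].dropna().index
              .intersection(instrumental.loc[start:end].dropna().index))
    r, p = pearsonr(proxy.loc[common], instrumental.loc[common])
    return r, p, len(common)

# Synthetic, year-indexed series standing in for a real proxy and a grid-cell temperature.
rng = np.random.default_rng(1)
years = pd.Index(range(1850, 2015), name="year")
temperature = pd.Series(np.linspace(-0.3, 0.8, len(years)) + rng.normal(0, 0.1, len(years)),
                        index=years)
proxy = pd.Series(2.0 * temperature + rng.normal(0, 0.3, len(years)), index=years)

r, p, n = screen_proxy(proxy, temperature)
print(f"r={r:.2f}, p={p:.1e}, n={n}")
```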
