
    Single-copy entanglement in a gapped quantum spin chain

    The single-copy entanglement of a given many-body system is defined [J. Eisert and M. Cramer, Phys. Rev. A 72, 042112 (2005)] as the maximal entanglement deterministically distillable from a bipartition of a single specimen of that system. For critical (gapless) spin chains, it was recently shown that this is exactly half the von Neumann entropy [R. Orus, J. I. Latorre, J. Eisert, and M. Cramer, Phys. Rev. A 73, 060303(R) (2006)], itself defined as the entanglement distillable in the asymptotic limit, i.e. given an infinite number of copies of the system. The equivalent behaviour for gapped systems has remained an open question. In this paper, I show that for the paradigmatic spin-S Affleck-Kennedy-Lieb-Tasaki chain (the archetypal gapped chain), the single-copy entanglement is equal to the von Neumann entropy: all the entanglement present may be distilled from a single specimen.
    Comment: Typos corrected; accepted for publication in Phys. Rev. Lett.; comments welcome
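    Both quantities compared in this abstract follow directly from the Schmidt spectrum of the bipartition: the von Neumann entropy is S = -sum(p log2 p), while the single-copy entanglement is E1 = -log2(max p). A minimal numerical sketch, assuming the flat two-fold-degenerate spectrum {1/2, 1/2} that the spin-1 AKLT chain's half-chain reduced density matrix approaches in the thermodynamic limit (illustrative numbers, not the paper's calculation):

```python
import numpy as np

def von_neumann_entropy(p):
    """S = -sum p log2 p over the reduced-density-matrix spectrum."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def single_copy_entanglement(p):
    """E1 = -log2(largest eigenvalue): distillable from one specimen."""
    return float(-np.log2(np.max(p)))

# Flat spectrum {1/2, 1/2}: E1 equals S, as in the gapped AKLT case.
flat = [0.5, 0.5]
print(von_neumann_entropy(flat), single_copy_entanglement(flat))  # 1.0 1.0

# Any non-flat spectrum gives E1 < S, so the AKLT equality is special.
skewed = [0.7, 0.3]
print(single_copy_entanglement(skewed) < von_neumann_entropy(skewed))  # True
```

The equality holds precisely because a flat spectrum makes the largest eigenvalue carry all the spectral information; for critical chains the spectrum decays slowly and only half the entropy survives in a single copy.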

    Simulation of many-qubit quantum computation with matrix product states

    Matrix product states provide a natural entanglement basis in which to represent a quantum register and operate quantum gates on it. This scheme can be used to simulate a quantum adiabatic algorithm solving hard instances of an NP-complete problem. Errors inherent to truncations of the exact action of interacting gates are controlled by the size of the matrices in the representation. The property of finding the right solution for an instance, and the expected value of the energy, are found to be remarkably robust against these errors. As a symbolic example, we simulate the algorithm solving a 100-qubit hard instance, that is, finding the correct product state out of ~10^30 possibilities. Accumulated statistics for up to 60 qubits point at a slow growth of the average minimum time to solve hard instances with highly-truncated simulations of adiabatic quantum evolution.
    Comment: 5 pages, 4 figures, final version
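    The truncation error this abstract refers to arises when a two-site gate raises the Schmidt rank of a matrix product state beyond the allowed bond dimension. A generic sketch of that step, assuming a standard SVD-based two-site update (function name and tensor shapes are my own, not the authors' code):

```python
import numpy as np

def apply_gate_and_truncate(theta, gate, chi_max):
    """Apply a two-site gate to the two-site wavefunction tensor `theta`
    (shape: chiL, 2, 2, chiR), then split it back into two MPS tensors,
    keeping at most chi_max Schmidt values -- the controlled truncation."""
    chiL, d1, d2, chiR = theta.shape
    # Contract the (d1*d2 x d1*d2) gate into the two physical legs.
    theta = np.tensordot(gate.reshape(d1, d2, d1, d2), theta,
                         axes=([2, 3], [1, 2]))
    theta = theta.transpose(2, 0, 1, 3).reshape(chiL * d1, d2 * chiR)
    U, s, Vh = np.linalg.svd(theta, full_matrices=False)
    chi = min(chi_max, int(np.sum(s > 1e-14)))
    s = s[:chi] / np.linalg.norm(s[:chi])     # renormalize after truncation
    A = U[:, :chi].reshape(chiL, d1, chi)
    B = (np.diag(s) @ Vh[:chi]).reshape(chi, d2, chiR)
    return A, s, B

# Example: a CNOT on (|0>+|1>)|0> raises the bond dimension to 2 (a Bell pair).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
theta = np.zeros((1, 2, 2, 1))
theta[0, 0, 0, 0] = theta[0, 1, 0, 0] = 2 ** -0.5
A, s, B = apply_gate_and_truncate(theta, CNOT, chi_max=2)
print(s)  # two equal Schmidt values, 1/sqrt(2) each
```

Setting `chi_max` smaller than the true Schmidt rank discards the smallest Schmidt values, which is exactly the error source whose effect on the algorithm's success probability the paper studies.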

    Comparison of GPS analysis strategies for high-accuracy vertical land motion

    Tide gauges measure sea level changes relative to land. To separate absolute changes in sea level from vertical land movements, tide gauges are often co-located with Continuous GPS (CGPS). In order to achieve an accuracy of better than 1 mm/yr, as required for sea level studies in the global change context, vertical land motion needs to be determined with the same accuracy. This is an ambitious goal for CGPS and needs a carefully designed analysis strategy. We have compared the independent results from six different analysis centres, using three different GPS processing software packages and a number of different analysis strategies. Based on the comparison, we discuss the achieved accuracy and the quality of the different strategies. The data analysed are from the CGPS network of the European Sea Level Service and cover the time window from the beginning of 2000 until the end of 2003. The comparison reveals large differences in the day-to-day variations of the coordinate time series and also in the seasonal cycle contained in them. The trends show systematic differences depending on the software and strategy used. To a large extent, the latter deviations can be explained by differences in the realisation of the reference frame, while some part may be due to other, as yet unidentified, contributions. The results suggest that the reference frame and its relation to the center of mass of the Earth system may be the main limitation in achieving the accuracy goal for the secular velocity of vertical land motion.
    Peer Reviewed. Postprint (published version).
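    The quantities the comparison examines (secular trend and seasonal cycle of a coordinate time series) are commonly estimated by least squares. A hedged sketch on synthetic data, assuming a trend-plus-annual-cycle model; the six centres' actual strategies differ in reference-frame realisation and much else:

```python
import numpy as np

def fit_trend_seasonal(t_years, h_mm):
    """Least-squares fit of offset + linear trend + annual cycle to a
    height time series.  Returns (trend mm/yr, cos amp, sin amp, offset)."""
    A = np.column_stack([
        np.ones_like(t_years),        # offset
        t_years,                      # secular trend (the target quantity)
        np.cos(2 * np.pi * t_years),  # annual cycle, cosine term
        np.sin(2 * np.pi * t_years),  # annual cycle, sine term
    ])
    coef, *_ = np.linalg.lstsq(A, h_mm, rcond=None)
    offset, trend, c, s = coef
    return trend, c, s, offset

# Synthetic daily series over 2000-2003: 2 mm/yr uplift, 3 mm annual
# cycle, 2 mm white noise (illustrative numbers only).
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / 365.25)
h = 2.0 * t + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0, 2, t.size)
trend, c, s, _ = fit_trend_seasonal(t, h)
print(round(trend, 2))  # close to 2 mm/yr
```

Real series add complications the abstract points to: the white-noise assumption understates trend uncertainty, and a reference-frame offset or drift maps directly into the estimated secular velocity.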

    A short review on entanglement in quantum spin systems

    We review some of the recent progress on the study of the entropy of entanglement in many-body quantum systems. Emphasis is placed on the scaling properties of entropy for one-dimensional multi-partite models at quantum phase transitions and, more generally, on the concept of the area law. We also briefly describe the relation between entanglement and the presence of impurities, the idea of particle entanglement, the evolution of entanglement along renormalization group trajectories, the dynamical evolution of entanglement, and the fate of entanglement along a quantum computation.
    Comment: 20 pages and 6 figures. Review article for the special issue "Entanglement entropy in extended systems" in J. Phys. A, edited by P. Calabrese, J. Cardy and B. Doyon.

    Comparative testing of four ionospheric models driven with GPS measurements

    In the context of the European Space Agency/European Space Operations Centre funded study “GNSS Contribution to Next Generation Global Ionospheric Monitoring,” four ionospheric models based on GNSS data (the Electron Density Assimilative Model, EDAM; the Ionosphere Monitoring Facility, IONMON v2; the Tomographic Ionosphere model, TOMION; and the Neustrelitz TEC Models, NTCM) have been run using a controlled set of input data. Each model's output has been tested against differential slant TEC (dSTEC) truth data for high (May 2002) and low (December 2006) sunspot periods. Three of the models (EDAM, TOMION, and NTCM) produce dSTEC standard deviation results that are broadly consistent with each other, with standard deviation spreads of ∼1 TECu for December 2006 and ∼1.5 TECu for May 2002. The lowest reported standard deviation across all models and all stations was 0.99 TECu (EDAM, TLSE station, December 2006, night). However, the model with the best overall dSTEC performance was TOMION, which had the lowest standard deviation in 28 out of 52 test cases (13 stations, two test periods, day and night). This is probably related to the interpolation techniques used in TOMION exploiting the spatial stationarity of vertical TEC error decorrelation.
    Peer Reviewed.
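    The dSTEC statistic used here differences each slant TEC value in a satellite pass against the value at a reference epoch (commonly the highest-elevation point of the pass), which cancels unknown instrumental biases. A minimal sketch of the standard-deviation metric, assuming that standard definition rather than the study's exact processing chain:

```python
import numpy as np

def dstec_std(model_stec, truth_stec, ref_idx):
    """Standard deviation of model-minus-truth differential slant TEC
    (TECu) over one satellite pass.  Differencing against the reference
    epoch `ref_idx` removes any constant bias in either series."""
    d_model = model_stec - model_stec[ref_idx]
    d_truth = truth_stec - truth_stec[ref_idx]
    resid = d_model - d_truth
    return float(np.std(resid))

# A constant bias (e.g. an uncalibrated receiver DCB) drops out entirely.
truth = np.array([10.0, 12.0, 15.0, 13.0])   # illustrative TECu values
biased = truth + 5.0
print(dstec_std(biased, truth, ref_idx=2))   # 0.0: the bias cancels
```

Because biases cancel, the statistic isolates how well each model reproduces the shape of the slant TEC variation along the pass, which is why it serves as a common yardstick across the four models.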

    Diseño y tecnología en comunicación para dispositivos móviles [Design and technology in communication for mobile devices]

    This work aims to open a debate about how technology is changing production processes in the media, as outlets adapt their content to a set of new broadcast formats (tablets, phones and computers) characterized by what is called the ubiquity of the medium. This duality of technological change encourages a debate on the dual role of technology: as a way to produce messages and as the way the reader assumes them as their own. This is where design and interactivity are of crucial importance, for their ability to display custom messages adapted to the personal characteristics of the reader, multiplying the capacity of the individual to assume them through a new design framework. New designs, with properties such as ubiquity and accessibility, aim to adapt content to the habits of its users. We therefore propose the urgent need to discuss the most appropriate models for each type of format in the field of new devices for the consumption of reports.