
    A simple family of nonadditive quantum codes

    Most known quantum codes are additive, meaning the codespace can be described as the simultaneous eigenspace of an abelian subgroup of the Pauli group. While in some scenarios such codes are strictly suboptimal, very little is understood about how to construct nonadditive codes with good performance. Here we present a family of nonadditive quantum codes for all odd blocklengths n that has a particularly simple form. Our codes correct single-qubit erasures while encoding a higher-dimensional space than is possible with an additive code or, for n of 11 or greater, any previous code. Comment: 3 pages, new version with slight clarifications, no results are changed

    State Discrimination with Post-Measurement Information

    We introduce a new state discrimination problem in which we are given additional information about the state after the measurement, or more generally, after a quantum memory bound applies. In particular, the following special case plays an important role in quantum cryptographic protocols in the bounded-storage model: given a string x encoded in an unknown basis chosen from a set of mutually unbiased bases, you may perform any measurement, but then store at most q qubits of quantum information. Later on, you learn which basis was used. How well can you compute a function f(x) of x, given the initial measurement outcome, the q qubits and the additional basis information? We first show a lower bound on the success probability for any balanced function and any number of mutually unbiased bases, beating the naive strategy of simply guessing the basis. We then show that for two bases, any Boolean function f(x) can be computed perfectly if you are allowed to store just a single qubit, independent of the number of possible input strings x. However, we show how to construct three bases such that you need to store all qubits in order to compute f(x) perfectly. We then investigate how much advantage the additional basis information can give for a Boolean function. To this end, we prove optimal bounds on the success probability for the AND and the XOR function for up to three mutually unbiased bases. Our result shows that the gap in success probability can be maximal: without the basis information, you can never do better than guessing the basis, but with this information, you can compute f(x) perfectly. We also exhibit an example where the extra information does not give any advantage at all. Comment: 29 pages, no figures, equations galore. v2: 31 pages, one new result w.r.t. v

    Quantifying statistical uncertainty in the attribution of human influence on severe weather

    Event attribution in the context of climate change seeks to understand the role of anthropogenic greenhouse gas emissions in extreme weather events, either specific events or classes of events. A common approach to event attribution uses climate model output under factual (real-world) and counterfactual (a world that might have been without anthropogenic greenhouse gas emissions) scenarios to estimate the probabilities of the event of interest under the two scenarios. Event attribution is then quantified by the ratio of the two probabilities. While this approach has been applied many times in the last 15 years, the statistical techniques used to estimate the risk ratio based on climate model ensembles have not drawn on the full set of methods available in the statistical literature and have in some cases used and interpreted the bootstrap method in non-standard ways. We present a precise frequentist statistical framework for quantifying the effect of sampling uncertainty on estimation of the risk ratio, propose the use of statistical methods that are new to event attribution, and evaluate a variety of methods using statistical simulations. We conclude that existing statistical methods not yet in use for event attribution have several advantages over the widely-used bootstrap, including better statistical performance in repeated samples and robustness to small estimated probabilities. Software for using the methods is available through the climextRemes package for R and Python. While we focus on frequentist statistical methods, Bayesian methods are likely to be particularly useful when considering sources of uncertainty beyond sampling uncertainty. Comment: 41 pages, 11 figures, 1 table
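    The risk-ratio estimate described above can be sketched in a few lines. This is a minimal illustration, not the climextRemes implementation (the function name and parameters are invented for the sketch): it assumes the event probabilities are estimated as simple counts over ensemble members under each scenario, and it uses the percentile bootstrap whose weaknesses for small estimated probabilities the abstract discusses.

```python
import random

def bootstrap_risk_ratio(k_factual, n_factual, k_counter, n_counter,
                         n_boot=2000, seed=0):
    """Risk ratio p_factual / p_counterfactual with a percentile-bootstrap
    confidence interval, from event counts k out of n ensemble members."""
    rng = random.Random(seed)
    p1 = k_factual / n_factual
    p0 = k_counter / n_counter
    ratios = []
    for _ in range(n_boot):
        # Parametric bootstrap: redraw the event counts under each scenario.
        b1 = sum(rng.random() < p1 for _ in range(n_factual))
        b0 = sum(rng.random() < p0 for _ in range(n_counter))
        # A zero counterfactual count leaves the ratio undefined -- one of
        # the small-probability weaknesses of the bootstrap noted above.
        if b0 > 0:
            ratios.append((b1 / n_factual) / (b0 / n_counter))
    ratios.sort()
    lo = ratios[int(0.025 * len(ratios))]
    hi = ratios[int(0.975 * len(ratios)) - 1]
    return p1 / p0, (lo, hi)

# 40 events in 400 factual members vs. 10 events in 400 counterfactual
# members gives a point estimate of 4 with a wide bootstrap interval.
rr, (lo, hi) = bootstrap_risk_ratio(40, 400, 10, 400)
```

The interval here quantifies sampling uncertainty only; the methods the paper proposes replace this percentile bootstrap with estimators that behave better when the counterfactual probability is small.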

    A time-dependent Tsirelson's bound from limits on the rate of information gain in quantum systems

    We consider the problem of distinguishing between a set of arbitrary quantum states in a setting in which the time available to perform the measurement is limited. We provide simple upper bounds on how well we can perform state discrimination in a given time as a function of either the average energy or the range of energies available during the measurement. We exhibit a specific strategy that nearly attains this bound. Finally, we consider several applications of our result. First, we obtain a time-dependent Tsirelson's bound that limits the extent of the Bell inequality violation that can in principle be demonstrated in a given time t. Second, we obtain a Margolus-Levitin-type bound when considering the special case of distinguishing orthogonal pure states. Comment: 15 pages, revtex, 1 figure
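    For context, the standard Margolus-Levitin theorem referenced above bounds the minimum time for a quantum state to evolve into an orthogonal one in terms of its mean energy above the ground state (stated here for orientation; the paper's contribution is a discrimination-time analogue of this bound):

```latex
t_{\perp} \;\ge\; \frac{\pi \hbar}{2 E}, \qquad E = \langle H \rangle - E_0 .
```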

    Long term measurements of submicrometer urban aerosols: statistical analysis for correlations with meteorological conditions and trace gases

    Long-term measurements (over 4 years) of particle number size distributions (submicrometer particles, 3-800 nm in diameter), trace gases (NO, NO<sub>2</sub>, and O<sub>3</sub>), and meteorological parameters (global radiation, wind speed and direction, atmospheric pressure, etc.) were taken at a moderately polluted site in the city of Leipzig (Germany). The resulting complex data set was analyzed with respect to seasonal, weekly, and diurnal variation of the submicrometer aerosol. Car traffic produced a peak in the number size distribution at around 20 nm particle diameter during morning rush hour on weekdays. A second peak at 10-15 nm particle diameter occurred around noon during summer; this new-particle formation was confirmed by a high correlation between the concentration of particles smaller than 20 nm and global radiation. A high concentration of accumulation mode particles (between 100 and 800 nm), which are associated with a large particle-surface area, might prevent this formation. The high particle concentration in the ultrafine region (particles smaller than 20 nm in diameter) was not detected in the particle mass, and thus particle mass concentration is not suitable for determining the diurnal patterns of these particles. In summer, statistical time series analysis showed a cyclic pattern of ultrafine particles with a period of one day and confirmed the correlation with global radiation. Principal component analysis (PCA) revealed a strong correlation between the particle concentration for 20-800 nm particles and the NO and NO<sub>2</sub> concentrations, indicating the influence of combustion processes on this broad size range, in particular during winter. In addition, PCA revealed that particle concentration depended on meteorological conditions such as wind speed and wind direction, although the dependence differed with particle size class.
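    The PCA step described above can be illustrated with a small self-contained sketch on synthetic data (the series and their correlation structure are invented stand-ins for the Leipzig measurements, not the actual data): a shared combustion signal drives the particle-number and nitrogen-oxide series, and the leading principal component of the correlation matrix recovers it while loading weakly on an independent wind-speed series.

```python
import math
import random

random.seed(0)
n = 500

# Shared "combustion" signal plus independent noise: an invented stand-in
# for the correlated particle-number (20-800 nm), NO, and NO2 series.
combustion = [random.gauss(0, 1) for _ in range(n)]
def noisy_copy():
    return [c + 0.3 * random.gauss(0, 1) for c in combustion]

series = [noisy_copy(),                            # particle number 20-800 nm
          noisy_copy(),                            # NO concentration
          noisy_copy(),                            # NO2 concentration
          [random.gauss(0, 1) for _ in range(n)]]  # wind speed (independent)

def standardize(xs):
    m = sum(xs) / len(xs)
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
    return [(x - m) / s for x in xs]

series = [standardize(xs) for xs in series]
p = len(series)

# Correlation matrix of the standardized series.
corr = [[sum(a * b for a, b in zip(series[i], series[j])) / n
         for j in range(p)] for i in range(p)]

# Leading principal component by power iteration on the correlation matrix.
v = [1.0] * p
for _ in range(100):
    w = [sum(corr[i][j] * v[j] for j in range(p)) for i in range(p)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

# v now loads strongly on the three combustion-driven series (first three
# components) and weakly on wind speed (last component).
```

In practice one would use numpy.linalg.eigh or a PCA routine from a statistics library; power iteration is used here only to keep the sketch dependency-free.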

    Entropic uncertainty relations and locking: tight bounds for mutually unbiased bases

    We prove tight entropic uncertainty relations for a large number of mutually unbiased measurements. In particular, we show that a bound derived from the result by Maassen and Uffink for 2 such measurements can in fact be tight for up to sqrt(d) measurements in mutually unbiased bases. We then show that using more mutually unbiased bases does not always lead to a better locking effect. We prove that the optimal bound for the accessible information using up to sqrt(d) specific mutually unbiased bases is (log d)/2, which is the same as can be achieved by using only two bases. Our result indicates that merely using mutually unbiased bases is not sufficient to achieve a strong locking effect, and we need to look for additional properties. Comment: 9 pages, RevTeX, v3: complete rewrite, new title, many new results, v4: minor changes, published version
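    The two-measurement bound of Maassen and Uffink referred to above has the standard form (stated here for context, with c the largest overlap between the two measurement bases):

```latex
H(A) + H(B) \;\ge\; -2 \log c, \qquad c = \max_{i,j} \bigl| \langle a_i | b_j \rangle \bigr| .
```

    For a pair of mutually unbiased bases in dimension d, c = 1/sqrt(d), so the right-hand side becomes log d; the paper's result concerns how far this tightness extends as the number of mutually unbiased measurements grows.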

    A strong converse for classical channel coding using entangled inputs

    A fully general strong converse for channel coding states that when the rate of sending classical information exceeds the capacity of a quantum channel, the probability of correctly decoding goes to zero exponentially in the number of channel uses, even when we allow code states which are entangled across several uses of the channel. Such a statement was previously only known for classical channels and the quantum identity channel. By relating the problem to the additivity of minimum output entropies, we show that a strong converse holds for a large class of channels, including all unital qubit channels, the d-dimensional depolarizing channel and the Werner-Holevo channel. This further justifies the interpretation of the classical capacity as a sharp threshold for information transmission. Comment: 9 pages, revtex

    Quantifying the effect of interannual ocean variability on the attribution of extreme climate events to human influence

    In recent years, the climate change research community has become highly interested in describing the anthropogenic influence on extreme weather events, commonly termed "event attribution." Limitations in the observational record and in computational resources motivate the use of uncoupled, atmosphere/land-only climate models with prescribed ocean conditions, run over a short period leading up to and including an event of interest. In this approach, large ensembles of high-resolution simulations can be generated under factual observed conditions and counterfactual conditions that might have been observed in the absence of human interference; these can be used to estimate the change in probability of the given event due to anthropogenic influence. However, using a prescribed ocean state ignores the possibility that estimates of attributable risk might be a function of the ocean state. Thus, the uncertainty in attributable risk is likely underestimated, implying an over-confidence in anthropogenic influence. In this work, we estimate the year-to-year variability in calculations of the anthropogenic contribution to extreme weather based on large ensembles of atmospheric model simulations. Our results both quantify the magnitude of year-to-year variability and categorize the degree to which conclusions of attributable risk are qualitatively affected. The methodology is illustrated by exploring extreme temperature and precipitation events for the northwest coast of South America and northern-central Siberia; we also provide results for regions around the globe. While it remains preferable to perform a full multi-year analysis, the results presented here can serve as an indication of where and when attribution researchers should be concerned about the use of atmosphere-only simulations.

    Chronic kidney disease and acute myocardial infarction: the story after 1 year

    When chronic kidney disease (CKD) is part of the clinical history for a patient with acute myocardial infarction, the interventional cardiologist experiences an increased anxiety level. An acute myocardial infarction with renal disease requires more attention to dye load and fluid status, and a general opinion exists regarding the negative outcomes of these “sicker” patients. Mark Navarro and colleagues completed a thorough study from a different angle. We know the interventionalists’ concerns are justified acutely, as these CKD patients have a higher level of inpatient complications, but these authors chose to look at the patients 1 year from their event to determine if a relationship existed between CKD and the patients’ health status. In the current climate of patient-centered care and outcomes, the study is very timely.