Quantum Correlations in Nonlocal BosonSampling
Determination of the quantum nature of correlations between two spatially
separated systems plays a crucial role in quantum information science. Of
particular interest are the questions of whether and how these correlations enable
quantum information protocols to be more powerful. Here, we report on a
distributed quantum computation protocol in which the input and output quantum
states are regarded as classically correlated by the standard criteria of quantum information theory.
Nevertheless, we show that the correlations between the outcomes of the
measurements on the output state cannot be efficiently simulated using
classical algorithms. Crucially, at the same time, local measurement outcomes
can be efficiently simulated on classical computers. We show that the only
known classicality criterion violated by the input and output states in our
protocol is the one used in quantum optics, namely, phase-space
nonclassicality. As a result, we argue that the global phase-space
nonclassicality inherent within the output state of our protocol represents
true quantum correlations.
Comment: 5 pages, 1 figure; comments are very welcome
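As a toy illustration of why classically correlated photon-count statistics can be easy to sample locally (this is not the paper's nonlocal BosonSampling protocol; the mean photon number and beamsplitter transmissivity are assumed values), consider a single thermal mode split on a beamsplitter: the joint counts at the two output ports are correlated yet follow a manifestly classical distribution that a classical computer samples trivially.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_thermal_beamsplitter_counts(nbar, transmissivity=0.5, shots=100_000):
    """Toy model: a single-mode thermal state with mean photon number `nbar`
    split on a beamsplitter.  The joint photon counts at the two output ports
    follow a classical distribution: draw the total photon number from the
    Bose-Einstein (geometric) law, then split it binomially."""
    p = 1.0 / (nbar + 1.0)                       # P(n) = p * (1 - p)^n
    n_total = rng.geometric(p, size=shots) - 1   # numpy's geometric counts from 1
    n_a = rng.binomial(n_total, transmissivity)  # photons reaching port A
    n_b = n_total - n_a                          # the rest exit port B
    return n_a, n_b

n_a, n_b = sample_thermal_beamsplitter_counts(nbar=2.0)
print("mean counts:", n_a.mean(), n_b.mean())
print("count correlation:", np.corrcoef(n_a, n_b)[0, 1])  # positive: correlated outputs
```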
Quantum Correlations and Global Coherence in Distributed Quantum Computing
Deviations from classical physics when distant quantum systems become
correlated are interesting both fundamentally and operationally. There exist
situations where the correlations enable collaborative tasks that are
impossible within the classical formalism. Here, we consider the efficiency of
quantum computation protocols compared to classical ones as a benchmark for
separating quantum and classical resources and argue that the computational
advantage of collaborative quantum protocols in the discrete variable domain
implies the nonclassicality of correlations. By analysing a toy model, we show
that this argument implies the existence of quantum correlations distinct
from entanglement and discord. We characterize such quantum correlations in
terms of the net global coherence resources inherent within quantum states and
show that entanglement and discord can be understood as special cases of our
general framework. Finally, we provide an operational interpretation of such
correlations as those allowing two distant parties to increase their respective
local quantum computational resources only using locally incoherent operations
and classical communication.
Comment: minor modifications and corrections
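The net global coherence measure introduced in the paper is not reproduced here; as a minimal, hedged illustration of what a basis-dependent coherence quantifier looks like, the sketch below evaluates the standard l1-norm of coherence for two single-qubit states.

```python
import numpy as np

def l1_coherence(rho):
    """l1-norm of coherence: sum of absolute values of the off-diagonal
    elements of the density matrix in the fixed (incoherent) reference basis."""
    return np.sum(np.abs(rho)) - np.sum(np.abs(np.diag(rho)))

# |+><+| : maximally coherent single-qubit state in the computational basis
plus = np.array([[0.5, 0.5],
                 [0.5, 0.5]])
# maximally mixed state: no coherence in any basis
mixed = np.eye(2) / 2

print(l1_coherence(plus))   # 1.0
print(l1_coherence(mixed))  # 0.0
```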
What can quantum optics say about computational complexity theory?
Considering the problem of sampling from the output photon-counting
probability distribution of a linear-optical network for input Gaussian states,
we obtain results that are of interest from the points of view of both quantum
theory and computational complexity theory. We derive a general formula for
calculating the output probabilities, and by considering input thermal states,
we show that the output probabilities are proportional to permanents of
positive-semidefinite Hermitian matrices. It is believed that approximating
permanents of complex matrices in general is a #P-hard problem. However, we
show that these permanents can be approximated by an algorithm in the BPP^NP
complexity class, as there exists an efficient classical algorithm for sampling
from the output probability distribution. We further consider input
squeezed-vacuum states and discuss the complexity of sampling from the
probability distribution at the output.
Comment: 5 pages, 1 figure
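For concreteness, a minimal sketch of evaluating such a permanent exactly via Ryser's formula is given below; the positive-semidefinite Hermitian matrix A = B†B is a random example, not one derived from a particular linear-optical network.

```python
import itertools
import numpy as np

def permanent(a):
    """Permanent of an n x n matrix via Ryser's formula, O(2^n * n^2).
    Exact (up to floating point), but exponential in n."""
    n = a.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            row_sums = a[:, list(cols)].sum(axis=1)
            total += (-1) ** r * np.prod(row_sums)
    return (-1) ** n * total

# Example: a positive-semidefinite Hermitian matrix A = B^dagger B,
# the class of matrices that appears for thermal input states.
rng = np.random.default_rng(1)
b = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
a = b.conj().T @ b
print(permanent(a))   # real up to numerical noise for PSD Hermitian input
```

Ryser's formula is exact but exponential, which is consistent with the #P-hardness of general permanent computation; the abstract's point is that, for thermal inputs, approximation within BPP^NP suffices because an efficient classical sampling algorithm exists.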
Computation of full-coverage film-cooled airfoil temperatures by two methods and comparison with high heat flux data
Two methods were used to calculate the heat flux to full-coverage film-cooled airfoils and, subsequently, the airfoil wall temperatures. The calculated wall temperatures were compared to measured temperatures obtained in the Hot Section Facility operating at real engine conditions. Gas temperatures and pressures up to 1900 K and 18 atm, with Reynolds numbers up to 1.9 million, were investigated. Heat flux was calculated by the convective heat transfer coefficient adiabatic wall method and by the superposition method, which incorporates the film injection effects in the heat transfer coefficient. The comparison indicates that the first method predicts the experimental data reasonably well. However, the superposition method overpredicted the heat flux to the airfoil unless the turbulent Prandtl number was significantly modified. The results suggest that additional research is required to model the physics of full-coverage film cooling where there are significant temperature and density differences between the gas and the coolant.
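For readers unfamiliar with the first method, a minimal sketch of its basic relation follows; the heat transfer coefficient, temperatures, and film effectiveness are assumed illustrative values, not data from the Hot Section Facility tests.

```python
def heat_flux_adiabatic_wall(h, t_gas, t_coolant, t_wall, eta_film):
    """Convective-coefficient / adiabatic-wall method (illustrative form):
    the film effectiveness eta = (T_gas - T_aw) / (T_gas - T_coolant)
    sets the adiabatic wall temperature, and q = h * (T_aw - T_wall)."""
    t_aw = t_gas - eta_film * (t_gas - t_coolant)   # adiabatic wall temperature, K
    return h * (t_aw - t_wall)                      # heat flux, W/m^2

# Hypothetical values, roughly in the range quoted in the abstract (gas at 1900 K)
q = heat_flux_adiabatic_wall(h=2500.0,         # W/(m^2 K), assumed
                             t_gas=1900.0,     # K
                             t_coolant=800.0,  # K, assumed
                             t_wall=1100.0,    # K, assumed
                             eta_film=0.35)    # assumed film effectiveness
print(f"q = {q:.0f} W/m^2")   # about 1.0 MW/m^2 with these assumed values
```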
Towards practical classical processing for the surface code: timing analysis
Topological quantum error correction codes have high thresholds and are well
suited to physical implementation. The minimum weight perfect matching
algorithm can be used to efficiently handle errors in such codes. We perform a
timing analysis of our current implementation of the minimum weight perfect
matching algorithm. Our implementation performs the classical processing
associated with an n×n lattice of qubits realizing a square surface code
storing a single logical qubit of information in a fault-tolerant manner. We
empirically demonstrate that our implementation requires only O(n^2) average
time per round of error correction for code distances ranging from 4 to 512 and
a range of depolarizing error rates. We also describe tests we have performed
to verify that it always obtains a true minimum weight perfect matching.
Comment: 13 pages, 13 figures; version accepted for publication
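The authors' own implementation is not reproduced here; the sketch below only illustrates the minimum weight perfect matching step on a toy syndrome, using NetworkX's blossom-based matching with negated weights (the defect coordinates and Manhattan-distance weights are assumptions for illustration, and an even number of defects is assumed).

```python
import itertools
import networkx as nx

def min_weight_perfect_matching(defects, distance):
    """Toy MWPM step for a surface-code decoder: pair up syndrome defects so
    that the total pairing weight (e.g. lattice distance) is minimised.
    Uses max_weight_matching on negated weights; assumes an even number of
    defects and builds the complete graph between them."""
    g = nx.Graph()
    for u, v in itertools.combinations(defects, 2):
        g.add_edge(u, v, weight=-distance(u, v))    # negate: max -> min
    return nx.max_weight_matching(g, maxcardinality=True)

# Four hypothetical defects on an integer lattice, matched by Manhattan distance
defects = [(0, 0), (0, 3), (5, 0), (5, 2)]
manhattan = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
pairs = min_weight_perfect_matching(defects, manhattan)
print(pairs)   # e.g. {((0, 0), (0, 3)), ((5, 0), (5, 2))}
```

In a real decoder the edge weights would come from the code's error model and lattice geometry, and the matching would be recomputed every round, which is where the O(n^2) average time per round reported above matters.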
Practical recommendations for reporting Fine-Gray model analyses for competing risk data
In survival analysis, a competing risk is an event whose occurrence precludes the occurrence of the primary event of interest. Outcomes in medical research are frequently subject to competing risks. In survival analysis, there are 2 key questions that can be addressed using competing risk regression models: first, which covariates affect the rate at which events occur, and second, which covariates affect the probability of an event occurring over time. The cause-specific hazard model estimates the effect of covariates on the rate at which events occur in subjects who are currently event-free. Subdistribution hazard ratios obtained from the Fine-Gray model describe the relative effect of covariates on the subdistribution hazard function. Hence, the covariates in this model can also be interpreted as having an effect on the cumulative incidence function or on the probability of events occurring over time. We conducted a review of the use and interpretation of the Fine-Gray subdistribution hazard model in articles published in the medical literature in 2015. We found that many authors provided an unclear or incorrect interpretation of the regression coefficients associated with this model. An incorrect and inconsistent interpretation of regression coefficients may lead to confusion when comparing results across different studies. Furthermore, an incorrect interpretation of estimated regression coefficients can result in an incorrect understanding about the magnitude of the association between exposure and the incidence of the outcome. The objective of this article is to clarify how these regression coefficients should be reported and to propose suggestions for interpreting these coefficients.
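As a small numeric illustration of the interpretation the article advocates (using a hypothetical baseline cumulative incidence and a hypothetical subdistribution hazard ratio), the Fine-Gray model links the CIF under a covariate shift to the baseline CIF through 1 - F1(t|x) = (1 - F1,0(t))^HR, where HR = exp(beta) is the subdistribution hazard ratio.

```python
import numpy as np

def cif_under_fine_gray(baseline_cif, subdist_hr):
    """Fine-Gray relation: 1 - F1(t|x) = (1 - F1_baseline(t)) ** HR,
    where HR = exp(beta) is the subdistribution hazard ratio.  A subdistribution
    HR > 1 therefore means a higher cumulative incidence at every t, even though
    it is not a rate ratio among event-free subjects."""
    baseline_cif = np.asarray(baseline_cif)
    return 1.0 - (1.0 - baseline_cif) ** subdist_hr

# Hypothetical baseline cumulative incidence at 1, 3 and 5 years
baseline = [0.05, 0.12, 0.20]
print(cif_under_fine_gray(baseline, subdist_hr=1.5))
# -> about [0.074, 0.175, 0.284]: uniformly higher incidence, but not 1.5 times higher
```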
Accounting for competing risks in randomized controlled trials: a review and recommendations for improvement
In studies with survival or time-to-event outcomes, a competing risk is an event whose occurrence precludes the occurrence of the primary event of interest. Specialized statistical methods must be used to analyze survival data in the presence of competing risks. We conducted a review of randomized controlled trials with survival outcomes that were published in high-impact general medical journals. Of 40 studies that we identified, 31 (77.5%) were potentially susceptible to competing risks. However, in the majority of these studies, the potential presence of competing risks was not accounted for in the statistical analyses that were described. Of the 31 studies potentially susceptible to competing risks, 24 (77.4%) reported the results of a Kaplan-Meier survival analysis, while only five (16.1%) reported using cumulative incidence functions to estimate the incidence of the outcome over time in the presence of competing risks. The former approach will tend to result in an overestimate of the incidence of the outcome over time, while the latter approach will result in unbiased estimation of the incidence of the primary outcome over time. We provide recommendations on the analysis and reporting of randomized controlled trials with survival outcomes in the presence of competing risks. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
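A short simulation can make the contrast concrete. The sketch below is a minimal toy (assumed exponential cause-specific hazards, no censoring, rates chosen only for illustration) in which the naive 1 - Kaplan-Meier estimate that treats competing events as censoring overshoots the cumulative incidence of the primary event.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated competing-risks data (no censoring, for clarity): two latent
# exponential event times; whichever comes first is the observed event type.
n = 20_000
t1 = rng.exponential(scale=1 / 0.10, size=n)   # primary event,   rate 0.10 / year
t2 = rng.exponential(scale=1 / 0.20, size=n)   # competing event, rate 0.20 / year
time = np.minimum(t1, t2)
event = np.where(t1 <= t2, 1, 2)               # 1 = primary, 2 = competing

horizon = 5.0

# Cumulative incidence function: with no censoring this is simply the
# proportion of subjects who had the *primary* event by the horizon.
cif = np.mean((time <= horizon) & (event == 1))

# Naive "1 - Kaplan-Meier" estimate that treats competing events as censoring.
order = np.argsort(time)
t_sorted, e_sorted = time[order], event[order]
at_risk = n - np.arange(n)                      # subjects still at risk
km_factors = np.where((e_sorted == 1) & (t_sorted <= horizon),
                      1 - 1 / at_risk, 1.0)
one_minus_km = 1 - np.prod(km_factors)

print(f"CIF at {horizon} years:    {cif:.3f}")           # ~0.26
print(f"1 - KM at {horizon} years: {one_minus_km:.3f}")  # ~0.39: overestimates
```

With these assumed rates the true cumulative incidence of the primary event at 5 years is about 0.26, while the naive estimate converges to about 0.39, illustrating the overestimation described above.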
Reply to "Comment on 'Insulating Behavior of λ-DNA on the Micron Scale'"
In our experiment, we found that the resistance of vacuum-dried λ-DNA
exceeds at 295 K. Bechhoefer and Sen have raised a number of
objections to our conclusion. We provide counter-arguments to support our
original conclusion.
Comment: 1 page reply to comment, 1 figure