
    NP-hardness of decoding quantum error-correction codes

    Although the theory of quantum error correction is intimately related to classical coding theory (in particular, quantum error-correction codes, QECCs, can be constructed from classical codes with the dual-containing property), this does not imply that decoding QECCs has the same computational complexity as decoding their classical counterparts. On the contrary, decoding QECCs can differ substantially from decoding classical codes because of degeneracy. Intuitively, one might expect degeneracy to simplify decoding, since two distinct errors need not be distinguished in order to be corrected. However, we show that the general quantum decoding problem is NP-hard whether or not the quantum code is degenerate. This finding implies that no efficient (polynomial-time) algorithm exists for the general quantum decoding problem unless P = NP, and it suggests the existence of a quantum cryptosystem based on the hardness of decoding QECCs. Comment: 5 pages, no figures. Final version for publication.
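
    For intuition about the search space involved, here is a minimal sketch of brute-force syndrome decoding for a classical linear code (classical maximum-likelihood decoding of linear codes is itself known to be NP-hard). The [7,4] Hamming parity-check matrix and function names below are illustrative only, not from the paper.

```python
# Brute-force minimum-weight syndrome decoding over GF(2). Sketch only:
# the exhaustive search visits 2^n error patterns, which is exactly the
# kind of scaling that NP-hardness results rule out improving on in general.
import itertools
import numpy as np

# Parity-check matrix of the [7,4] Hamming code.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

def brute_force_decode(H, syndrome):
    """Return a minimum-weight error pattern e with H e = syndrome (mod 2)."""
    _, n = H.shape
    best = None
    for bits in itertools.product((0, 1), repeat=n):  # all 2^n patterns
        e = np.array(bits, dtype=np.uint8)
        if np.array_equal((H @ e) % 2, syndrome):
            if best is None or e.sum() < best.sum():
                best = e
    return best

e_true = np.array([0, 0, 0, 0, 1, 0, 0], dtype=np.uint8)
syndrome = (H @ e_true) % 2
print(brute_force_decode(H, syndrome))  # recovers e_true: [0 0 0 0 1 0 0]
```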

    Highly Efficient Midinfrared On-Chip Electrical Generation of Graphene Plasmons by Inelastic Electron Tunneling Excitation

    Inelastic electron tunneling provides a low-energy pathway for the excitation of surface plasmons and light emission. We theoretically investigate tunnel junctions based on metals and graphene. We show that graphene is potentially a highly efficient material for tunneling excitation of plasmons because of its narrow plasmon linewidths, strong emission, and large tunability in the midinfrared wavelength regime. Compared with gold and silver, the enhancement can be up to 10 times at similar wavelengths and up to five orders of magnitude at the respective plasmon operating wavelengths. Tunneling excitation of graphene plasmons promises an efficient technology for on-chip electrical generation and manipulation of plasmons for graphene-based optoelectronics and nanophotonic integrated circuits. Comment: 12 pages, 7 figures.

    Scientific basis for safely shutting in the Macondo Well after the April 20, 2010 Deepwater Horizon blowout

    As part of the government response to the Deepwater Horizon blowout, a Well Integrity Team evaluated the geologic hazards of shutting in the Macondo Well at the seafloor and determined the conditions under which it could safely be undertaken. Of particular concern was the possibility that, under the anticipated high shut-in pressures, oil could leak out of the well casing below the seafloor. Such a leak could lead to new geologic pathways for hydrocarbon release to the Gulf of Mexico. Evaluating this hazard required analyses of 2D and 3D seismic surveys, seafloor bathymetry, sediment properties, geophysical well logs, and drilling data to assess the geological, hydrological, and geomechanical conditions around the Macondo Well. After the well was successfully capped and shut in on July 15, 2010, a variety of monitoring activities were used to assess subsurface well integrity. These activities included acquisition of wellhead pressure data, marine multichannel seismic profiles, seafloor and water-column sonar surveys, and wellhead visual/acoustic monitoring. These data showed that the Macondo Well was not leaking after shut-in and therefore could remain safely shut until reservoir pressures were suppressed (killed) with heavy drilling mud and the well was sealed with cement.

    Validation of nonlinear PCA

    Linear principal component analysis (PCA) can be extended to nonlinear PCA by using artificial neural networks, but the benefit of curved components requires careful control of model complexity. Moreover, standard techniques for model selection, including cross-validation and, more generally, the use of an independent test set, fail when applied to nonlinear PCA because of its inherently unsupervised character. This paper presents a new approach for validating the complexity of nonlinear PCA models that uses the error in missing-data estimation as the model-selection criterion. It is motivated by the idea that only the model of optimal complexity can predict missing values with the highest accuracy. While standard test-set validation usually favours over-fitted nonlinear PCA models, the proposed approach correctly selects the optimal model complexity. Comment: 12 pages, 5 figures.
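
    A minimal sketch of that criterion, assuming an autoassociative neural network (autoencoder) stands in for the nonlinear PCA model: entries are artificially held out, each candidate complexity is fitted, and the model is scored by how well it recovers the held-out entries. The toy data, bottleneck sizes, and crude mean imputation are illustrative assumptions, not the paper's procedure.

```python
# Model selection for nonlinear PCA via missing-data estimation error.
# Sketch only: a bottleneck autoencoder stands in for nonlinear PCA, and
# mean imputation stands in for a proper missing-data treatment.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy data: a curved one-dimensional manifold embedded in 3-D, plus noise.
t = rng.uniform(-1.0, 1.0, size=(500, 1))
X = np.hstack([t, t**2, t**3]) + 0.05 * rng.normal(size=(500, 3))

# Hold out 10% of the entries as "missing" and mean-impute them.
mask = rng.random(X.shape) < 0.10
X_obs = np.where(mask, X.mean(axis=0), X)

for hidden in (1, 2, 5, 20):  # candidate complexities (bottleneck size)
    ae = MLPRegressor(hidden_layer_sizes=(hidden,), activation="tanh",
                      max_iter=3000, random_state=0)
    ae.fit(X_obs, X_obs)                  # autoassociative: input -> input
    X_hat = ae.predict(X_obs)
    rmse = np.sqrt(np.mean((X_hat[mask] - X[mask]) ** 2))
    print(f"bottleneck={hidden:2d}  held-out RMSE={rmse:.4f}")
```

    The model whose reconstruction error on the held-out entries is lowest is selected; an over-complex model fits the observed entries more closely but recovers the deliberately removed ones less accurately.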

    Unsteady flow in a supercritical supersonic diffuser

    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/77051/1/AIAA-10045-786.pd

    Electronic and magnetic properties of the kagome systems YBaCo4O7 and YBaCo3MO7 (M=Al, Fe)

    We present a combined experimental and theoretical x-ray absorption spectroscopy (XAS) study of the new class of cobaltates YBaCo4O7 and YBaCo3MO7 (M = Al, Fe). The focus is on the local electronic and magnetic properties of the transition-metal ions in these geometrically frustrated kagome compounds. For the mixed-valence cobaltate YBaCo4O7, both Co2+ and Co3+ are found to be in the high-spin state. The stability of these high-spin states in tetrahedral coordination is compared with that in the more studied case of octahedral coordination. For the new compound YBaCo3FeO7, we find exclusively Co2+ and Fe3+ charge states.

    Don't bleach chaotic data

    A common first step in time-series signal analysis is to digitally filter the data to remove linear correlations. The residual data is spectrally white (it is "bleached"), but in principle retains the nonlinear structure of the original time series. It is well known that simple linear autocorrelation can give rise to spurious results in algorithms for estimating nonlinear invariants such as fractal dimension and Lyapunov exponents; in theory, bleached data avoids these pitfalls. But in practice, bleaching obscures the underlying deterministic structure of a low-dimensional chaotic process. This appears to be a property of the chaos itself, since nonchaotic data are not similarly affected. The adverse effects of bleaching are demonstrated in a series of numerical experiments on known chaotic data, and some theoretical aspects are also discussed. Comment: 12 dense pages (82K) of ordinary LaTeX; uses the macro psfig.tex to include figures in the text; the figures are uufile'd into a single file of about 306K; the final dvips'd PostScript file is about 1.3 MB. Replaced 9/30/93 to incorporate final changes in the proofs and to make the LaTeX more portable; the paper will appear in CHAOS 4 (Dec. 1993).
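
    A minimal sketch of the bleaching step itself, assuming a least-squares AR (autoregressive) pre-whitening filter applied to the x component of the Lorenz system; the model order, integration scheme, and parameters are illustrative, not the paper's setup.

```python
# AR pre-whitening ("bleaching") of a chaotic signal. Sketch only.
import numpy as np

# Generate a chaotic series: crude Euler integration of the Lorenz system.
dt, n = 0.01, 5000
state = np.array([1.0, 1.0, 1.0])
x = np.empty(n)
for i in range(n):
    dx = 10.0 * (state[1] - state[0])
    dy = state[0] * (28.0 - state[2]) - state[1]
    dz = state[0] * state[1] - (8.0 / 3.0) * state[2]
    state = state + dt * np.array([dx, dy, dz])
    x[i] = state[0]
x -= x.mean()

# Fit an order-p AR model by least squares; the residuals form the
# "bleached" series: nearly white, but with the deterministic structure
# that dimension and Lyapunov estimators rely on degraded.
p = 5
lags = np.column_stack([x[p - k - 1 : n - k - 1] for k in range(p)])
coef, *_ = np.linalg.lstsq(lags, x[p:], rcond=None)
resid = x[p:] - lags @ coef

print("lag-1 autocorrelation, raw:     ",
      round(float(np.corrcoef(x[:-1], x[1:])[0, 1]), 3))
print("lag-1 autocorrelation, bleached:",
      round(float(np.corrcoef(resid[:-1], resid[1:])[0, 1]), 3))
```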

    Controlling orbital moment and spin orientation in CoO layers by strain

    We have observed that CoO films grown on different substrates show dramatic differences in their magnetic properties. Using polarization-dependent x-ray absorption spectroscopy at the Co L2,3 edges, we reveal that the magnitude and orientation of the magnetic moments depend strongly on the strain induced in the films by the substrate. We present a quantitative model to explain how strain, together with the spin-orbit interaction, determines the 3d orbital occupation, the magnetic anisotropy, and the spin and orbital contributions to the magnetic moments. Control over the sign and direction of the strain may therefore open new opportunities for applications in the field of exchange bias in multilayered magnetic films.