169 research outputs found
Efficient Simulation of Leakage Errors in Quantum Error Correcting Codes Using Tensor Network Methods
Leakage errors, in which a qubit is excited to a level outside the qubit
subspace, represent a significant obstacle in the development of robust quantum
computers. We present a computationally efficient simulation methodology for
studying leakage errors in quantum error correcting codes (QECCs) using tensor
network methods, specifically Matrix Product States (MPS). Our approach enables
the simulation of various leakage processes, including thermal noise and
coherent errors, without approximations (such as the Pauli twirling
approximation) that can lead to errors in the estimation of the logical error
rate. We apply our method to two QECCs: the one-dimensional (1D) repetition
code and a thin surface code. By leveraging the small amount of
entanglement generated during the error correction process, we are able to
study large systems, up to a few hundred qudits, over many code cycles. We
consider a realistic noise model of leakage relevant to superconducting qubits
to evaluate code performance and a variety of leakage removal strategies. Our
numerical results suggest that appropriate leakage removal is crucial,
especially when the code distance is large. Comment: 14 pages, 12 figures
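As a minimal illustration of the kind of leakage process discussed above (not the authors' implementation), the Python sketch below applies an incoherent leakage/seepage channel to a single three-level system and tracks the population that escapes the qubit subspace; the rates p_up and p_down are hypothetical placeholders.

    import numpy as np

    d = 3                                   # qutrit: |0>, |1> qubit subspace, |2> leaked
    ket = lambda i: np.eye(d)[:, [i]]       # column vector for basis state |i>

    p_up, p_down = 0.01, 0.02               # hypothetical leakage/seepage rates per cycle

    # Kraus operators of a simple incoherent leakage channel
    K_leak = np.sqrt(p_up) * ket(2) @ ket(1).T     # |1> -> |2> (leakage)
    K_seep = np.sqrt(p_down) * ket(1) @ ket(2).T   # |2> -> |1> (seepage)
    K_none = np.diag(np.sqrt([1.0, 1.0 - p_up, 1.0 - p_down]))  # no-jump term

    def apply_channel(rho):
        return sum(K @ rho @ K.conj().T for K in (K_none, K_leak, K_seep))

    rho = ket(1) @ ket(1).T                 # start in |1>
    for cycle in range(1, 11):
        rho = apply_channel(rho)
        print(f"cycle {cycle:2d}: leaked population = {rho[2, 2].real:.4f}")

In an MPS simulation of the kind described above, a channel of this form would act on each physical qudit site per code cycle; here it is applied to a single site only to show the leakage dynamics.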
Seasonal variation of air transport in the Antarctic and Atmospheric Circulation in 1997
To better understand how present and past climates at Syowa Station, Antarctica relate to climate elsewhere, we analyzed the tropospheric air transport to Syowa Station for the year 1997 using a dataset from the European Centre for Medium-Range Weather Forecasts (ECMWF). Five-day trajectories of air parcels were estimated and analyzed. In the middle troposphere in winter, air parcels usually came from the lower troposphere over the Atlantic. In January, however, most air parcels came from latitudes higher than 60°S. These trajectories had little vertical motion and were associated with a low pressure system that forms along the coastal region of Antarctica only in summer. In the lower troposphere, trajectories could be classified as originating in one of three regions: the Southern Ocean, the continental interior, and the east coast. In contrast to the middle troposphere, air parcels from the Southern Ocean had the lowest frequency, irrespective of the time of year. This is partially due to a low pressure system that blocks air parcels from outside the continent. Most trajectories were affected by the drainage flow. An amplified quasi-stationary planetary wave from September to November and a blocking circulation in June caused trajectories to pass over Antarctica.
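As a rough sketch of how such back-trajectories can be estimated (the study itself uses ECMWF analysis fields; the wind function below is a synthetic placeholder), the following Python snippet integrates an air parcel backward in time for five days from Syowa Station with simple Euler steps.

    import numpy as np

    R_EARTH = 6.371e6  # mean Earth radius in metres

    def wind(lat, lon, t):
        """Synthetic stand-in for gridded u/v winds (m/s) at a fixed pressure level."""
        u = 10.0 * np.cos(np.radians(lat))                 # zonal component
        v = 2.0 * np.sin(np.radians(lon) + t / 86400.0)    # meridional component
        return u, v

    def back_trajectory(lat, lon, days=5, dt=3600.0):
        """Trace an air parcel backward in time with simple Euler steps."""
        path = [(lat, lon)]
        t = 0.0
        for _ in range(int(days * 86400 / dt)):
            u, v = wind(lat, lon, t)
            lat -= np.degrees(v * dt / R_EARTH)                               # step backward
            lon -= np.degrees(u * dt / (R_EARTH * np.cos(np.radians(lat))))
            t -= dt
            path.append((lat, lon))
        return path

    # Five-day back-trajectory arriving at Syowa Station (approx. 69.0 S, 39.6 E)
    trajectory = back_trajectory(-69.0, 39.6)
    print("arrival:", trajectory[0], " position 5 days earlier:", trajectory[-1])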
Noise propagation in hybrid tensor networks
The hybrid tensor network (HTN) method is a general framework that allows an effective wavefunction to be constructed from a combination of classical tensors and quantum tensors, i.e., amplitudes of quantum states. In particular, hybrid tree tensor networks (HTTNs) are very useful for simulating systems larger than the available quantum hardware. However, whereas realistic quantum states prepared on NISQ hardware are inevitably noisy, this framework is formulated for pure states. In this work, in addition to discussing related methods, i.e., Deep VQE and entanglement forging within the HTTN framework, we investigate noisy HTN states by introducing an expansion operator that describes both the enlargement of the simulated quantum system and the propagation of noise. This framework enables general tree HTN states to be represented explicitly and their physicality to be assessed. We also show that the expectation value of a measured observable vanishes exponentially with the number of contracted quantum tensors. Our work paves the way toward noise-resilient constructions of HTN states. Comment: 20 pages, 8 figures
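A toy numerical check of the exponential suppression noted above, assuming each quantum tensor is affected by an independent depolarizing channel of strength p (a deliberate simplification of the general HTN setting; the numbers are hypothetical):

    import numpy as np

    X = np.array([[0, 1], [1, 0]], dtype=complex)
    plus = np.array([[1], [1]], dtype=complex) / np.sqrt(2)

    def depolarize(rho, p):
        """Single-qubit depolarizing channel of strength p."""
        return (1 - p) * rho + p * np.eye(2) / 2

    p = 0.05                                         # hypothetical per-tensor noise strength
    rho_noisy = depolarize(plus @ plus.conj().T, p)
    x_single = np.trace(X @ rho_noisy).real          # equals 1 - p for the |+> state

    # For the tensor product of X contracted over k independently noisy tensors,
    # the expectation factorizes and decays exponentially in k.
    for k in (1, 2, 4, 8, 16):
        print(f"k = {k:2d}: <X...X> = {x_single ** k:.4f}   (1-p)^k = {(1 - p) ** k:.4f}")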
Virtual quantum error detection
Quantum error correction and quantum error detection necessitate syndrome measurements to detect errors. Performing syndrome measurements for each stabilizer generator can incur a significant overhead, given that readout fidelity in current quantum hardware is generally lower than gate fidelity. Here, by generalizing a quantum error mitigation method known as symmetry expansion, we propose a protocol called virtual quantum error detection (VQED). This method allows us to virtually evaluate the computation results corresponding to the post-selected quantum states obtained through quantum error detection during circuit execution, without implementing syndrome measurements. Unlike conventional quantum error detection, which requires the implementation of Hadamard test circuits for each stabilizer generator, our VQED protocol can be performed with a constant-depth shallow quantum circuit with an ancilla qubit, irrespective of the number of stabilizer generators. Furthermore, for some simple error models, the computation results obtained using VQED are robust against noise occurring during the operation of VQED, and our method is fully compatible with other error mitigation schemes, enabling further improvements in computation accuracy and facilitating high-fidelity quantum computing. Comment: 10 pages, 8 figures, 1 table
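A classical toy calculation of the projection formula <O> = Tr[P rho P O] / Tr[P rho] that error detection effectively evaluates (this is only the target quantity; the VQED circuit construction itself is not reproduced here, and the state, noise model, and probabilities are hypothetical):

    import numpy as np

    I2 = np.eye(2)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]])
    Z = np.diag([1.0, -1.0]).astype(complex)

    # Ideal two-qubit state: Bell pair |Phi+>, stabilized by ZZ and XX.
    psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
    rho = np.outer(psi, psi.conj())

    # Hypothetical noise: single-qubit depolarizing error on qubit 1 with probability p.
    p = 0.1
    rho_noisy = (1 - p) * rho
    for P in (X, Y, Z):
        P1 = np.kron(P, I2)
        rho_noisy += (p / 3) * P1 @ rho @ P1.conj().T

    ZZ = np.kron(Z, Z)                   # stabilizer generator used for detection
    XX = np.kron(X, X)                   # observable of interest
    proj = (np.eye(4) + ZZ) / 2          # projector onto the ZZ = +1 eigenspace

    raw = np.trace(XX @ rho_noisy).real
    detected = (np.trace(proj @ rho_noisy @ proj @ XX).real
                / np.trace(proj @ rho_noisy).real)
    print(f"raw <XX> = {raw:.3f}, error-detected <XX> = {detected:.3f}")  # ~0.867 vs ~0.929

Projecting onto the ZZ = +1 eigenspace removes the X- and Y-type error branches but not the Z-type one, so a single stabilizer generator only partially restores <XX>; including the XX generator in the projector would catch the remaining branch.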
Hunting for quantum-classical crossover in condensed matter problems
The intensive pursuit of quantum advantage in terms of computational complexity has led to a modernized, crucial question: {\it When and how will quantum computers outperform classical computers?} The next milestone is undoubtedly the realization of quantum acceleration in practical problems. Here we provide clear evidence and arguments that the primary target is likely to be condensed matter physics. Our primary contributions are summarized as follows: 1) a proposal of systematic error/runtime analysis of state-of-the-art classical algorithms based on tensor networks; 2) a dedicated, high-resolution analysis of quantum resources performed at the level of executable logical instructions; 3) clarification of the quantum-classical crossover point for ground-state simulation, which lies within a runtime of hours using only a few hundred thousand physical qubits for the 2D Heisenberg and 2D Fermi-Hubbard models, assuming that logical qubits are encoded via the surface code at a realistic physical error rate. We argue that, to our knowledge, condensed matter problems offer the earliest platform for demonstrating practical quantum advantage, being orders of magnitude more feasible than previously known candidates in terms of both qubit counts and total runtime. Comment: 14+41 pages, 8+24 figures
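For readers unfamiliar with this style of resource analysis, the back-of-the-envelope sketch below shows how a logical-instruction count translates into wall-clock runtime and physical-qubit count under standard surface-code assumptions; every number is a hypothetical placeholder, not a value taken from the paper.

    # Back-of-the-envelope surface-code resource estimate in the spirit of the
    # logical-instruction-level analysis described above. All numbers below are
    # hypothetical placeholders.
    t_count = 1e9            # hypothetical number of T gates in the algorithm
    code_distance = 25       # hypothetical surface-code distance
    cycle_time_s = 1e-6      # hypothetical duration of one code cycle (1 us)
    logical_qubits = 200     # hypothetical number of logical data qubits

    # A lattice-surgery T-gate step is commonly budgeted as roughly d code cycles.
    seconds_per_t = code_distance * cycle_time_s
    runtime_hours = t_count * seconds_per_t / 3600

    # Each surface-code logical qubit uses roughly 2*d^2 physical qubits
    # (data plus measurement qubits), ignoring magic-state factories.
    physical_qubits = logical_qubits * 2 * code_distance ** 2

    print(f"estimated runtime: {runtime_hours:.1f} hours")
    print(f"physical qubits  : {physical_qubits:,} (excluding factories)")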
Fine Particulate Matter and Diabetes Prevalence in Okayama, Japan
Many studies have shown an association between long-term exposure to particulate matter with an aerodynamic diameter of 2.5 μm or less (PM2.5) and diabetes mellitus (DM), but few have focused on Asian populations. We therefore examined the association between long-term exposure to PM2.5 and DM prevalence in Okayama City, Japan. We included 76,591 participants who had received basic health checkups in 2006 and 2007. We assigned census-level modeled PM2.5 data from 2006 and 2007 to each participant and defined DM using treatment status and blood test results. PM2.5 was associated with DM prevalence, with a prevalence ratio (95% confidence interval) of 1.10 (1.00-1.20) per interquartile-range increase (2.1 μg/m3) in PM2.5. This finding is consistent with previous results and suggests that long-term exposure to PM2.5 is associated with an increased prevalence of DM in Okayama City, Japan, where the PM2.5 level is lower than in other Asian cities.
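The abstract does not state which regression model was used; as one common way to obtain a prevalence ratio per interquartile-range increase, the sketch below fits a modified Poisson regression (Poisson GLM with robust standard errors) to synthetic data with hypothetical exposure and covariate values.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 10_000                                  # synthetic cohort (not the study data)

    pm25 = rng.normal(14.0, 1.6, n)             # hypothetical PM2.5 exposure (ug/m3)
    age = rng.normal(60, 10, n)
    iqr = 2.1                                   # IQR used for scaling, as in the abstract

    # Simulate a binary diabetes indicator with a modest exposure effect.
    logit = -3.0 + 0.05 * (pm25 - 14.0) + 0.03 * (age - 60)
    dm = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    # Modified Poisson regression (Poisson GLM + robust errors) for prevalence ratios.
    X = sm.add_constant(np.column_stack([pm25 / iqr, age]))
    fit = sm.GLM(dm, X, family=sm.families.Poisson()).fit(cov_type="HC0")

    pr = np.exp(fit.params[1])                  # prevalence ratio per IQR increase
    ci = np.exp(fit.conf_int()[1])              # 95% CI for that coefficient
    print(f"PR per {iqr} ug/m3 increase: {pr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")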