6,563 research outputs found
Telecommunications Division
Spacecraft telecommunication systems - coding methods for phase-locked loops, absolute time determination by pulsar, communications elements research
The Feynman problem and Fermionic entanglement: Fermionic theory versus qubit theory
The present paper is both a review of the Feynman problem and an original
research presentation on the relations between Fermionic theories and qubit
theories, both regarded in the novel framework of operational probabilistic
theories. The most relevant results on the Feynman problem of simulating
Fermions with qubits are reviewed, and in the light of the new original results
the problem is solved. The answer is twofold. On the computational side the two
theories are equivalent, as shown by Bravyi and Kitaev (Ann. Phys. 298.1
(2002): 210-226). On the operational side the quantum theory of qubits and the
quantum theory of Fermions are different, mostly in the notion of locality,
with striking consequences on entanglement. Thus the emulation does not respect
locality, as Feynman suspected (Int. J. Theor. Phys. 21.6 (1982): 467-488).
Comment: 46 pages, review of the "Feynman problem". Fixed many typos
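The locality issue the abstract points to can be seen in the classic Jordan-Wigner mapping (a simpler relative of the cited Bravyi-Kitaev construction): a fermionic mode operator becomes a qubit operator dressed with a string of Pauli-Z factors, so a "local" fermionic operation acts on many qubits. A minimal sketch, assuming nothing beyond the standard mapping:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
SM = np.array([[0, 1], [0, 0]], dtype=complex)  # sigma^-: lowers |1> to |0>

def kron_all(mats):
    out = np.array([[1.0 + 0j]])
    for m in mats:
        out = np.kron(out, m)
    return out

def jw_annihilation(j, n):
    """Jordan-Wigner image of the fermionic annihilation operator a_j on
    n modes: Z strings on modes 0..j-1, sigma^- on mode j, identity after."""
    return kron_all([Z] * j + [SM] + [I2] * (n - j - 1))

n = 3
a = [jw_annihilation(j, n) for j in range(n)]

# Verify the canonical anticommutation relations {a_i, a_j^dag} = delta_ij I.
for i in range(n):
    for j in range(n):
        acom = a[i] @ a[j].conj().T + a[j].conj().T @ a[i]
        expected = np.eye(2**n) if i == j else np.zeros((2**n, 2**n))
        assert np.allclose(acom, expected)

# Note the nonlocality: a[2] carries Z factors on qubits 0 and 1, so an
# operation local to one fermionic mode is spread over three qubits.
```

The Z strings are exactly why the computational equivalence does not carry locality over: single-mode fermionic operators map to multi-qubit operators.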
Synthesis and Optimization of Reversible Circuits - A Survey
Reversible logic circuits have been historically motivated by theoretical
research in low-power electronics as well as practical improvement of
bit-manipulation transforms in cryptography and computer graphics. Recently,
reversible circuits have attracted interest as components of quantum
algorithms, as well as in photonic and nano-computing technologies where some
switching devices offer no signal gain. Research in generating reversible logic
distinguishes between circuit synthesis, post-synthesis optimization, and
technology mapping. In this survey, we review algorithmic paradigms ---
search-based, cycle-based, transformation-based, and BDD-based --- as well as
specific algorithms for reversible synthesis, both exact and heuristic. We
conclude the survey by outlining key open challenges in synthesis of reversible
and quantum logic, as well as the most common misconceptions. Comment: 34 pages, 15 figures, 2 tables
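The defining property of a reversible circuit is that it computes a permutation of the bit-strings on its wires, so every gate (and every cascade of gates) can be undone. A minimal sketch with hypothetical gate models, not taken from the survey:

```python
from itertools import product

# Model NOT/CNOT/Toffoli gates on bit-string states. Each of these gates is
# self-inverse, so replaying a cascade in reverse order undoes it.

def cnot(state, c, t):
    bits = list(state)
    if bits[c] == 1:
        bits[t] ^= 1
    return tuple(bits)

def toffoli(state, c1, c2, t):
    bits = list(state)
    if bits[c1] == 1 and bits[c2] == 1:
        bits[t] ^= 1
    return tuple(bits)

def run(circuit, state):
    for gate in circuit:
        state = gate(state)
    return state

# An arbitrary 3-wire cascade (illustrative, not a synthesized circuit).
circuit = [
    lambda s: cnot(s, 0, 2),
    lambda s: toffoli(s, 0, 1, 2),
    lambda s: cnot(s, 1, 0),
]

states = list(product([0, 1], repeat=3))
images = [run(circuit, s) for s in states]
assert sorted(images) == states  # bijective: the circuit is a permutation

# Self-inverse gates => the reversed cascade is the inverse circuit.
for s in states:
    assert run(list(reversed(circuit)), run(circuit, s)) == s
```

Synthesis algorithms in the survey's taxonomy (search-, cycle-, transformation-, and BDD-based) all amount to decomposing a target permutation into such a gate cascade.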
Formal Verification of Probabilistic SystemC Models with Statistical Model Checking
Transaction-level modeling with SystemC has been very successful in
describing the behavior of embedded systems by providing high-level executable
models, many of which have inherent probabilistic behaviors, e.g.,
random data and unreliable components. It is thus crucial to have both
quantitative and qualitative analysis of the probabilities of system
properties. Such analysis can be conducted by constructing a formal model of
the system under verification and using Probabilistic Model Checking (PMC).
However, this method is infeasible for large systems, due to the state space
explosion. In this article, we demonstrate the successful use of Statistical
Model Checking (SMC) to carry out such analysis directly from large SystemC
models and allow designers to express a wide range of useful properties. The
first contribution of this work is a framework to verify properties expressed
in Bounded Linear Temporal Logic (BLTL) for SystemC models with both timed and
probabilistic characteristics. Second, the framework allows users to expose a
rich set of user-code primitives as atomic propositions in BLTL. Moreover,
users can define their own fine-grained time resolution rather than the
boundary of clock cycles in the SystemC simulation. The third contribution is
an implementation of a statistical model checker. It contains an automatic
monitor generation for producing execution traces of the
model-under-verification (MUV), the mechanism for automatically instrumenting
the MUV, and the interaction with statistical model checking algorithms.
Comment: Journal of Software: Evolution and Process. Wiley, 2017. arXiv admin
note: substantial text overlap with arXiv:1507.0818
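Why SMC sidesteps state-space explosion: instead of exploring all states, it samples bounded execution traces, checks the property on each, and bounds the estimation error statistically. A minimal sketch with a hypothetical toy model (the model, parameter names, and property are illustrative, not from the article):

```python
import math
import random

def chernoff_samples(eps, delta):
    """Chernoff-Hoeffding (Okamoto) bound: number of trace samples needed so
    the estimated probability is within +/-eps of the truth with confidence
    1 - delta, independent of the model's state-space size."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps * eps))

def simulate_trace(rng, p_fail=0.05, horizon=20):
    """Hypothetical model: a component that fails each step with probability
    p_fail. The bounded property 'no failure within the horizon' holds iff
    every step succeeds -- a stand-in for a BLTL formula over a trace."""
    return all(rng.random() >= p_fail for _ in range(horizon))

def smc_estimate(eps=0.01, delta=0.05, seed=0):
    rng = random.Random(seed)
    n = chernoff_samples(eps, delta)
    hits = sum(simulate_trace(rng) for _ in range(n))
    return hits / n, n

p_hat, n = smc_estimate()
# True probability is 0.95**20 ~ 0.358; p_hat is within eps with
# confidence 1 - delta, using only n simulations of the model.
```

The monitor generation and instrumentation the abstract describes serve exactly the `simulate_trace` role here: producing trace verdicts that the statistical algorithm consumes.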
The price of certainty: "waterslide curves" and the gap to capacity
The classical problem of reliable point-to-point digital communication is to
achieve a low probability of error while keeping the rate high and the total
power consumption small. Traditional information-theoretic analysis uses
'waterfall' curves to convey the revolutionary idea that unboundedly low
probabilities of bit-error are attainable using only finite transmit power.
However, practitioners have long observed that the decoder complexity, and
hence the total power consumption, goes up when attempting to use sophisticated
codes that operate close to the waterfall curve.
This paper gives an explicit model for power consumption at an idealized
decoder that allows for extreme parallelism in implementation. The decoder
architecture is in the spirit of message passing and iterative decoding for
sparse-graph codes. Generalized sphere-packing arguments are used to derive
lower bounds on the decoding power needed for any possible code given only the
gap from the Shannon limit and the desired probability of error. As the gap
goes to zero, the energy per bit spent in decoding is shown to go to infinity.
This suggests that to optimize total power, the transmitter should operate at a
power that is strictly above the minimum demanded by the Shannon capacity.
The lower bound is plotted to show an unavoidable tradeoff between the
average bit-error probability and the total power used in transmission and
decoding. In the spirit of conventional waterfall curves, we call these
'waterslide' curves. Comment: 37 pages, 13 figures. Submitted to IEEE Transactions on Information
Theory. This version corrects a subtle bug in the proofs of the original
submission and improves the bounds significantly
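The waterfall picture the abstract starts from can be made concrete: on the AWGN channel with rate R bits per real channel use, reliable communication is possible only above the Shannon limit on Eb/N0. A minimal sketch of that limit (standard information theory, not the paper's decoder-power bound):

```python
import math

def min_ebno_db(rate):
    """Shannon limit on Eb/N0 (in dB) for rate R bits per real channel use
    on the AWGN channel. From C = 0.5*log2(1 + 2*R*Eb/N0) > R, reliable
    codes exist only when Eb/N0 > (2**(2R) - 1) / (2R)."""
    ebno = (2 ** (2 * rate) - 1) / (2 * rate)
    return 10 * math.log10(ebno)

# Below this line no code works; just above it, arbitrarily low error
# probability is attainable in principle -- but, as the paper argues,
# decoding power blows up as the operating point approaches the limit,
# which is what bends the waterfall into a 'waterslide'.
for r in (0.1, 0.5, 0.9):
    print(f"R = {r}: Eb/N0 >= {min_ebno_db(r):.2f} dB")
# As R -> 0 the limit approaches 10*log10(ln 2), about -1.59 dB.
```

The paper's contribution is the other axis of this tradeoff: a lower bound on decoding energy that grows without bound as the gap to this limit shrinks, so the power-optimal transmitter backs off from the limit.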
Fault-Tolerant Measurement-Based Quantum Computing with Continuous-Variable Cluster States
A long-standing open question about Gaussian continuous-variable cluster
states is whether they enable fault-tolerant measurement-based quantum
computation. The answer is yes. Initial squeezing in the cluster above a
threshold value of 20.5 dB ensures that errors from finite squeezing acting on
encoded qubits are below the fault-tolerance threshold of known qubit-based
error-correcting codes. By concatenating with one of these codes and using
ancilla-based error correction, fault-tolerant measurement-based quantum
computation of theoretically indefinite length is possible with finitely
squeezed cluster states. Comment: (v3) consistent with published version, more accessible for a general
audience; (v2) condensed presentation, added references on GKP state
generation and a comparison of currently achievable squeezing to the
threshold; (v1) 13 pages, a few figures
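To put the quoted 20.5 dB threshold in perspective: x dB of squeezing reduces the quadrature noise variance by a factor of 10**(-x/10) relative to vacuum. A minimal conversion sketch (the experimental comparison figure is an assumption, not from the abstract):

```python
import math

def squeezing_variance(db):
    """Quadrature noise variance relative to vacuum for a given squeezing
    level in dB: x dB of squeezing means variance 10**(-x/10)."""
    return 10 ** (-db / 10)

threshold_db = 20.5  # threshold quoted in the abstract
var = squeezing_variance(threshold_db)
suppression = 1 / var
# var ~ 0.0089: noise must be pushed roughly 112x below the vacuum level.
# For comparison (assumption, not from the abstract): the strongest optical
# squeezing reported in laboratories to date is around 15 dB, i.e. a
# variance of about 0.032x vacuum, so a sizable gap to the threshold remains.
print(f"{threshold_db} dB -> variance {var:.4f} x vacuum ({suppression:.0f}x suppression)")
```

Concatenation with qubit error-correcting codes, as the abstract describes, is what turns this finite-squeezing requirement into full fault tolerance.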