8,010 research outputs found
Low dimensional manifolds for exact representation of open quantum systems
Weakly nonlinear degrees of freedom in dissipative quantum systems tend to
localize near manifolds of quasi-classical states. We present a family of
analytical and computational methods for deriving optimal unitary model
transformations based on representations of finite dimensional Lie groups. The
transformations are optimal in that they minimize the quantum relative entropy
distance between a given state and the quasi-classical manifold. This naturally
splits the description of quantum states into quasi-classical coordinates that
specify the nearest quasi-classical state and a transformed quantum state that
can be represented in fewer basis levels. We derive coupled equations of motion
for the coordinates and the transformed state and demonstrate how this can be
exploited for efficient numerical simulation. Our optimization objective
naturally quantifies the non-classicality of states occurring in some given
open system dynamics. This allows us to compare the intrinsic complexity of
different open quantum systems.
Comment: Added a section on the semi-classical SR-latch, added a summary of the method, and revised the structure of the manuscript.
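The abstract above describes minimizing a relative-entropy distance to a quasi-classical manifold. As an illustrative toy version (not the paper's method), one can take a single bosonic mode, use coherent states D(α)|0⟩ as the quasi-classical family, and find the nearest quasi-classical point of a pure state by maximizing fidelity over α with a simple grid search; the truncation dimension and search grid below are arbitrary choices:

```python
import numpy as np
from scipy.linalg import expm

N = 40  # Fock-space truncation (assumed large enough for |alpha| <~ 1)
a = np.diag(np.sqrt(np.arange(1, N)), k=1)  # truncated annihilation operator

def coherent(alpha):
    """|alpha> = D(alpha)|0> via the displacement operator."""
    D = expm(alpha * a.conj().T - np.conj(alpha) * a)
    vac = np.zeros(N)
    vac[0] = 1.0
    return D @ vac

def nearest_coherent(psi, grid):
    """Grid-search the alpha maximizing |<alpha|psi>|^2 -- a crude
    stand-in for the paper's relative-entropy minimization."""
    fids = [abs(np.vdot(coherent(al), psi)) ** 2 for al in grid]
    i = int(np.argmax(fids))
    return grid[i], fids[i]

# Sanity check: a coherent state's nearest quasi-classical point is itself.
psi = coherent(0.6)
best, fid = nearest_coherent(psi, np.linspace(-1.0, 1.0, 41))
```

In the paper's setting the optimization runs over a finite-dimensional Lie-group orbit and the objective is quantum relative entropy; the grid search here only illustrates the "nearest quasi-classical state" idea.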
Simple approach to approximate quantum error correction based on the transpose channel
We demonstrate that there exists a universal, near-optimal recovery map, the transpose channel, for approximate quantum error-correcting codes, where optimality is defined using the worst-case fidelity. Using the transpose channel, we provide an alternative interpretation of the standard quantum error correction (QEC) conditions and generalize them to a set of conditions for approximate QEC (AQEC) codes. This forms the basis of a simple algorithm for finding AQEC codes. Our analytical approach is a departure from earlier work relying on exhaustive numerical search for the optimal recovery map, with optimality defined based on entanglement fidelity. For the practically useful case of codes encoding a single qubit of information, our algorithm is particularly easy to implement.
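For a channel with Kraus operators {E_k} and code projector P, the transpose channel has Kraus operators of the form R_k = P E_k† N(P)^(-1/2), where N(P) = Σ_k E_k P E_k†. A minimal sketch, assuming the three-qubit repetition code under independent bit flips (my choice of example, not the paper's), builds this recovery explicitly and checks it on a logical state:

```python
import itertools
import numpy as np
from scipy.linalg import fractional_matrix_power

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

p = 0.05  # single-qubit bit-flip probability (illustrative)
# Kraus operators of three independent bit-flip channels.
kraus = []
for flips in itertools.product([0, 1], repeat=3):
    w = np.prod([np.sqrt(p) if f else np.sqrt(1 - p) for f in flips])
    kraus.append(w * kron(*[X if f else I2 for f in flips]))

e = np.eye(8)
P = np.outer(e[0], e[0]) + np.outer(e[7], e[7])  # span{|000>, |111>}

NP = sum(E @ P @ E.conj().T for E in kraus)
NP_inv_half = fractional_matrix_power(NP, -0.5)  # N(P)^(-1/2)

# Transpose-channel Kraus operators R_k = P E_k^dagger N(P)^(-1/2).
recovery = [P @ E.conj().T @ NP_inv_half for E in kraus]

# Send logical |0> through noise, then recovery, and check the fidelity.
psi = e[0]
rho_noisy = sum(E @ np.outer(psi, psi) @ E.conj().T for E in kraus)
rho_rec = sum(R @ rho_noisy @ R.conj().T for R in recovery)
fid = float(np.real(psi @ rho_rec @ psi))
```

Since weight-1 errors are correctable, the recovered fidelity is close to 1 at small p, illustrating the "near-optimal" claim; the exact worst-case analysis is in the paper.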
Task adapted reconstruction for inverse problems
The paper considers the problem of performing a task defined on a model
parameter that is only observed indirectly through noisy data in an ill-posed
inverse problem. A key aspect is to formalize the steps of reconstruction and
task as appropriate estimators (non-randomized decision rules) in statistical
estimation problems. The implementation makes use of (deep) neural networks to
provide a differentiable parametrization of the family of estimators for both
steps. These networks are combined and jointly trained against suitable
supervised training data in order to minimize a joint differentiable loss
function, resulting in an end-to-end task adapted reconstruction method. The
suggested framework is generic, yet adaptable, with a plug-and-play structure
for adjusting both the inverse problem and the task at hand. More precisely,
the data model (forward operator and statistical model of the noise) associated
with the inverse problem is exchangeable, e.g., by using neural network
architecture given by a learned iterative method. Furthermore, any task that is
encodable as a trainable neural network can be used. The approach is
demonstrated on joint tomographic image reconstruction and classification, and
on joint tomographic image reconstruction and segmentation.
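The core idea, a differentiable reconstruction step composed with a differentiable task step, trained end-to-end on a weighted joint loss, can be sketched with linear maps standing in for the neural networks. Everything below (dimensions, the forward operator, the weighting c) is an illustrative assumption, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, B = 10, 6, 200          # signal dim, measurement dim, batch size
A = rng.normal(size=(m, n))   # ill-posed forward operator (m < n)
w_true = rng.normal(size=n)   # ground-truth task: t = w_true . x

X = rng.normal(size=(n, B))
Y = A @ X + 0.01 * rng.normal(size=(m, B))  # noisy indirect data
T = w_true @ X

Theta = np.zeros((n, m))      # linear "reconstruction network" R(y) = Theta y
phi = np.zeros(n)             # linear "task network" T(x) = phi . x
c, lr = 0.5, 1e-3             # task weight in the joint loss, step size

def joint_loss(Theta, phi):
    Z = Theta @ Y                                # reconstructions
    rec = np.mean(np.sum((Z - X) ** 2, axis=0))  # reconstruction loss
    task = np.mean((phi @ Z - T) ** 2)           # task loss
    return (1 - c) * rec + c * task

loss0 = joint_loss(Theta, phi)
for _ in range(2000):
    Z = Theta @ Y
    r = Z - X                  # reconstruction residual
    s = phi @ Z - T            # task residual
    # Gradient of the joint loss w.r.t. both "networks" at once.
    dZ = (1 - c) * 2 * r / B + c * 2 * np.outer(phi, s) / B
    Theta -= lr * dZ @ Y.T
    phi -= lr * 2 * c * (Z @ s) / B
loss = joint_loss(Theta, phi)
```

The plug-and-play structure the abstract describes corresponds to swapping A (the data model), the reconstruction map, or the task map independently; here each is a matrix, whereas the paper uses learned iterative networks.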
Best-fit quasi-equilibrium ensembles: a general approach to statistical closure of underresolved Hamiltonian dynamics
A new method of deriving reduced models of Hamiltonian dynamical systems is
developed using techniques from optimization and statistical estimation. Given
a set of resolved variables that define a model reduction, the
quasi-equilibrium ensembles associated with the resolved variables are employed
as a family of trial probability densities on phase space. The residual that
results from submitting these trial densities to the Liouville equation is
quantified by an ensemble-averaged cost function related to the information
loss rate of the reduction. From an initial nonequilibrium state, the
statistical state of the system at any later time is estimated by minimizing
the time integral of the cost function over paths of trial densities.
Statistical closure of the underresolved dynamics is obtained at the level of
the value function, which equals the optimal cost of reduction with respect to
the resolved variables, and the evolution of the estimated statistical state is
deduced from the Hamilton-Jacobi equation satisfied by the value function. In
the near-equilibrium regime, or under a local quadratic approximation in the
far-from-equilibrium regime, this best-fit closure is governed by a
differential equation for the estimated state vector coupled to a Riccati
differential equation for the Hessian matrix of the value function. Since
memory effects are not explicitly included in the trial densities, a single
adjustable parameter is introduced into the cost function to capture a
time-scale ratio between resolved and unresolved motions. Apart from this
parameter, the closed equations for the resolved variables are completely
determined by the underlying deterministic dynamics.
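The closure described above couples an ODE for the estimated state vector to a matrix Riccati ODE for the Hessian of the value function. A generic sketch of that structural form, with made-up coefficients A, B, Q rather than the paper's closure equations, integrates the coupled system and compares the Hessian's long-time limit to the algebraic Riccati solution:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import solve_continuous_are

# Illustrative coefficients (not from the paper): linear drift A,
# cost weighting Q, and coupling B = b r^{-1} b^T.
A = np.array([[0.0, 1.0], [-1.0, -0.5]])
Q = np.eye(2)
b = np.eye(2)
r = 2.0 * np.eye(2)
B = b @ np.linalg.inv(r) @ b.T

def rhs(t, y):
    """Coupled estimated-state / Riccati-Hessian evolution (generic form)."""
    x, S = y[:2], y[2:].reshape(2, 2)
    dx = (A - B @ S) @ x                  # Hessian-coupled state drift
    dS = Q + A.T @ S + S @ A - S @ B @ S  # matrix Riccati equation
    return np.concatenate([dx, dS.ravel()])

y0 = np.concatenate([np.array([1.0, 0.0]), np.zeros(4)])  # x(0), Sigma(0)=0
sol = solve_ivp(rhs, (0.0, 60.0), y0, rtol=1e-8, atol=1e-10)
Sigma_T = sol.y[2:, -1].reshape(2, 2)

# For long times Sigma approaches the stabilizing solution of the
# algebraic Riccati equation A^T S + S A - S B S + Q = 0.
P = solve_continuous_are(A, b, Q, r)
```

The point of the sketch is only the shape of the computation: a state vector and a symmetric Hessian matrix evolved together, exactly the pairing the abstract attributes to the near-equilibrium (or locally quadratic) regime.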
- …