The information recovery problem
The issue of unitary evolution during creation and evaporation of a black
hole remains controversial. We argue that some prominent cures are more
troubling than the disease, demonstrate that their central element---forming of
the event horizon before the evaporation begins---is not necessarily true, and
describe a fully coupled matter-gravity system which is manifestly unitary.
Models for solving information recovery problems
Models for solving problems of recovering information obtained by systems that monitor the state of information-system objects are considered and analyzed. The aim is to improve the accuracy of monitoring systems. The advantages and disadvantages of the models are presented.
Quantum Error Correction via Convex Optimization
We show that the problem of designing a quantum information error correcting
procedure can be cast as a bi-convex optimization problem, iterating between
encoding and recovery, each being a semidefinite program. For a given encoding
operator the problem is convex in the recovery operator. For a given method of
recovery, the problem is convex in the encoding scheme. This allows us to
derive new codes that are locally optimal. We present examples of such codes
that can handle errors which are too strong for codes derived by analogy to
classical error correction techniques.
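The alternating scheme described above (fix one set of variables, solve a convex problem in the other) can be illustrated on a toy bi-convex problem. The sketch below is not the paper's quantum semidefinite programs; it alternates closed-form least-squares steps for the bi-convex rank-one fitting problem min over u, v of ||M - u v^T||_F^2, assuming only NumPy. Every name here is illustrative.

```python
import numpy as np

def alternating_rank1(M, iters=50, seed=0):
    """Alternately minimize ||M - u v^T||_F^2 over u and v.

    Mirrors the bi-convex structure in the abstract: the objective is
    convex in u for fixed v, and convex in v for fixed u, so each
    half-step is an easy (here closed-form) convex problem. A toy
    stand-in for the encoding/recovery semidefinite programs.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    v = rng.standard_normal(n)
    for _ in range(iters):
        u = M @ v / (v @ v)      # least-squares update: convex in u for fixed v
        v = M.T @ u / (u @ u)    # least-squares update: convex in v for fixed u
    return u, v

# Example: a rank-one matrix is fitted exactly; as in the abstract, the
# iteration in general only guarantees a locally optimal pair.
M = np.outer([1.0, 2.0, 3.0], [4.0, 5.0])
u, v = alternating_rank1(M)
assert np.allclose(np.outer(u, v), M, atol=1e-8)
```

Each half-step solves its convex subproblem exactly, so the objective is monotonically non-increasing, which is the same argument that makes the encoding/recovery iteration in the abstract converge to a locally optimal code.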
Support Recovery of Sparse Signals
We consider the problem of exact support recovery of sparse signals via noisy
measurements. The main focus is the sufficient and necessary conditions on the
number of measurements for support recovery to be reliable. By drawing an
analogy between the problem of support recovery and the problem of channel
coding over the Gaussian multiple access channel, and exploiting mathematical
tools developed for the latter problem, we obtain an information theoretic
framework for analyzing the performance limits of support recovery. Sharp
sufficient and necessary conditions on the number of measurements in terms of
the signal sparsity level and the measurement noise level are derived.
Specifically, when the number of nonzero entries is held fixed, the exact
asymptotics on the number of measurements for support recovery is developed.
When the number of nonzero entries increases in certain manners, we obtain
sufficient conditions tighter than existing results. In addition, we show that
the proposed methodology can deal with a variety of models of sparse signal
recovery, hence demonstrating its potential as an effective analytical tool.
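The abstract analyzes when support recovery is information-theoretically reliable rather than proposing an algorithm. As a concrete instance of the task itself, here is a standard greedy recovery sketch (orthogonal matching pursuit, not the paper's method); the dimensions, sparsity, and noise level are hypothetical choices for illustration.

```python
import numpy as np

def omp_support(A, y, k):
    """Orthogonal matching pursuit: estimate a size-k support from y ~ A x + noise.

    Greedily picks the column most correlated with the residual, then
    re-fits by least squares on the chosen columns.
    """
    residual = y.copy()
    support = []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        As = A[:, support]
        coef, *_ = np.linalg.lstsq(As, y, rcond=None)
        residual = y - As @ coef
    return sorted(support)

rng = np.random.default_rng(1)
n, m, k = 64, 48, 3               # signal length, measurements, sparsity (illustrative)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[[5, 20, 40]] = [3.0, -2.5, 2.0]  # true support {5, 20, 40}
y = A @ x + 0.01 * rng.standard_normal(m)
assert omp_support(A, y, k) == [5, 20, 40]
```

The interplay of m (measurements), the sparsity k, and the noise level in this toy setup is exactly the trade-off whose sharp thresholds the abstract characterizes.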
Inferring Rankings Using Constrained Sensing
We consider the problem of recovering a function over the space of
permutations (or, the symmetric group) over n elements from given partial
information; the partial information we consider is related to the group
theoretic Fourier Transform of the function. This problem naturally arises in
several settings such as ranked elections, multi-object tracking, ranking
systems, and recommendation systems. Inspired by the work of Donoho and Stark
in the context of discrete-time functions, we focus on non-negative functions
with a sparse support (support size ≪ domain size). Our recovery method is
based on finding the sparsest solution (through optimization) that is
consistent with the available information. As the main result, we derive
sufficient conditions for functions that can be recovered exactly from partial
information through optimization. Under a natural random model for the
generation of functions, we quantify the recoverability conditions by deriving
bounds on the sparsity (support size) for which the function satisfies the
sufficient conditions with a high probability as the domain size grows. Exact
ℓ0 optimization is computationally hard. Therefore, the popular compressive
sensing literature considers solving the convex relaxation, ℓ1
optimization, to find the sparsest solution. However, we show that ℓ1
optimization fails to recover a function (even with constant sparsity)
generated using the random model with a high probability as the domain size
grows. In
order to overcome this problem, we propose a novel iterative algorithm for the
recovery of functions that satisfy the sufficient conditions. Finally, using an
Information Theoretic framework, we study necessary conditions for exact
recovery to be possible.
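For context on the ℓ1 relaxation the abstract refers to, the sketch below solves basis pursuit (minimize ||x||_1 subject to A x = y) as a linear program in the ordinary vector setting, where it succeeds; the abstract's point is that over the permutation domain this relaxation can fail, so this is illustration only. Dimensions and data are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 subject to A x = y via the standard LP split x = p - q.

    With p, q >= 0, ||x||_1 = sum(p) + sum(q) at the optimum, so the
    ℓ1 problem becomes a linear program.
    """
    m, n = A.shape
    c = np.ones(2 * n)            # objective: sum(p) + sum(q)
    A_eq = np.hstack([A, -A])     # constraint: A p - A q = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
    p, q = res.x[:n], res.x[n:]
    return p - q

rng = np.random.default_rng(2)
n, m = 40, 20                     # ambient dimension, measurements (illustrative)
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[[3, 17]] = [1.5, -2.0]     # 2-sparse signal
x_hat = basis_pursuit(A, A @ x_true)
assert np.allclose(x_hat, x_true, atol=1e-4)
```

In this unstructured Gaussian setting ℓ1 minimization recovers the sparse vector; the abstract shows that the same relaxation, applied to sparse functions over permutations, fails even at constant sparsity, which motivates its alternative iterative algorithm.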