
    The information recovery problem

    The issue of unitary evolution during the creation and evaporation of a black hole remains controversial. We argue that some prominent cures are more troubling than the disease, demonstrate that their central element (formation of the event horizon before the evaporation begins) is not necessarily true, and describe a fully coupled matter-gravity system which is manifestly unitary.

    Comment: 7 pages + 1 figure. Published version.

    Models for solving the information recovery problem

    Models for solving information recovery problems are reviewed and analyzed; the information in question can be obtained by systems that monitor the state of objects in information systems. The aim is to improve the accuracy of such monitoring systems. The advantages and disadvantages of the models are presented.

    Quantum Error Correction via Convex Optimization

    We show that the problem of designing a quantum error-correcting procedure can be cast as a bi-convex optimization problem, iterating between encoding and recovery, each step being a semidefinite program. For a given encoding operator the problem is convex in the recovery operator; for a given method of recovery, the problem is convex in the encoding scheme. This allows us to derive new codes that are locally optimal. We present examples of such codes that can handle errors too strong for codes derived by analogy to classical error correction techniques.

    Comment: 16 pages.
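    The alternating structure described in this abstract can be illustrated on a simpler bi-convex problem. The sketch below is not the paper's semidefinite-programming formulation; it substitutes least-squares subproblems (low-rank matrix approximation, minimizing ||A - XY||_F^2 over X and Y) to show how iterating between two convex subproblems, each solved exactly, drives a bi-convex objective down. All dimensions and the rank are illustrative assumptions.

```python
import numpy as np

# Alternating minimization for the bi-convex objective ||A - X @ Y||_F^2:
# fixing X makes the problem convex (least squares) in Y, and vice versa.
# This mirrors the encode/recover alternation described in the abstract,
# with least-squares subproblems standing in for the semidefinite programs.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
r = 2                                   # target rank (illustrative)
X = rng.standard_normal((6, r))
Y = rng.standard_normal((r, 6))

for _ in range(200):
    # Convex subproblem in Y for fixed X: min_Y ||A - X Y||_F^2
    Y = np.linalg.lstsq(X, A, rcond=None)[0]
    # Convex subproblem in X for fixed Y: min_X ||A - X Y||_F^2
    X = np.linalg.lstsq(Y.T, A.T, rcond=None)[0].T

residual = float(np.linalg.norm(A - X @ Y))
```

    Each subproblem is solved globally, so the objective is non-increasing; like the codes in the abstract, the limit is a locally optimal point of the joint problem (here it generically matches the best rank-r approximation given by the truncated SVD).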

    Support Recovery of Sparse Signals

    We consider the problem of exact support recovery of sparse signals via noisy measurements. The main focus is the sufficient and necessary conditions on the number of measurements for support recovery to be reliable. By drawing an analogy between the problem of support recovery and the problem of channel coding over the Gaussian multiple access channel, and exploiting mathematical tools developed for the latter problem, we obtain an information-theoretic framework for analyzing the performance limits of support recovery. Sharp sufficient and necessary conditions on the number of measurements, in terms of the signal sparsity level and the measurement noise level, are derived. Specifically, when the number of nonzero entries is held fixed, the exact asymptotics on the number of measurements for support recovery are derived. When the number of nonzero entries increases in certain ways, we obtain sufficient conditions tighter than existing results. In addition, we show that the proposed methodology can deal with a variety of models of sparse signal recovery, demonstrating its potential as an effective analytical tool.

    Comment: 33 pages.
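    As a concrete instance of the support recovery task analyzed in this abstract (though not the paper's information-theoretic machinery), the sketch below uses orthogonal matching pursuit to recover the support of a sparse signal from noisy Gaussian measurements. The dimensions, noise level, and the choice of OMP as the recovery algorithm are all illustrative assumptions.

```python
import numpy as np

def omp_support(A, y, k):
    """Greedy support recovery: repeatedly pick the column most correlated
    with the residual, then re-fit by least squares on the selected columns."""
    support = []
    residual = y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        As = A[:, support]
        coef, *_ = np.linalg.lstsq(As, y, rcond=None)
        residual = y - As @ coef
    return set(support)

rng = np.random.default_rng(1)
n, m, k = 100, 40, 4                              # ambient dim, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)      # Gaussian measurement matrix
true_support = set(rng.choice(n, size=k, replace=False).tolist())
x = np.zeros(n)
x[list(true_support)] = rng.choice([-1.0, 1.0], size=k)
y = A @ x + 0.01 * rng.standard_normal(m)         # noisy measurements

est = omp_support(A, y, k)
```

    With m well above the sparsity level and low noise, as here, exact support recovery succeeds with high probability; the abstract's contribution is characterizing precisely how many measurements are necessary and sufficient for this to hold.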

    Inferring Rankings Using Constrained Sensing

    We consider the problem of recovering a function over the space of permutations (the symmetric group) on $n$ elements from given partial information; the partial information we consider is related to the group-theoretic Fourier transform of the function. This problem naturally arises in several settings such as ranked elections, multi-object tracking, ranking systems, and recommendation systems. Inspired by the work of Donoho and Stark in the context of discrete-time functions, we focus on non-negative functions with sparse support (support size $\ll$ domain size). Our recovery method is based on finding the sparsest solution (through $\ell_0$ optimization) that is consistent with the available information. As the main result, we derive sufficient conditions under which functions can be recovered exactly from partial information through $\ell_0$ optimization. Under a natural random model for the generation of functions, we quantify the recoverability conditions by deriving bounds on the sparsity (support size) for which the function satisfies the sufficient conditions with high probability as $n \to \infty$. Since $\ell_0$ optimization is computationally hard, the popular compressive sensing literature instead solves its convex relaxation, $\ell_1$ optimization, to find the sparsest solution. However, we show that $\ell_1$ optimization fails to recover a function (even with constant sparsity) generated using the random model, with high probability as $n \to \infty$. To overcome this problem, we propose a novel iterative algorithm for the recovery of functions that satisfy the sufficient conditions. Finally, using an information-theoretic framework, we study necessary conditions for exact recovery to be possible.

    Comment: 19 pages.
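    The combinatorial nature of $\ell_0$ optimization mentioned in this abstract can be made concrete in a toy setting: ordinary sparse vectors under linear measurements, rather than functions on the symmetric group, with all dimensions below chosen purely for illustration. The brute-force search enumerates supports of increasing size, which is exactly why its cost is exponential and why convex $\ell_1$ relaxations are attractive when they work.

```python
import numpy as np
from itertools import combinations

def l0_recover(A, y, tol=1e-8):
    """Brute-force l0 minimization: search supports of increasing size and
    return the first sparse vector consistent with the measurements y = A x.
    Exponential cost -- this is why convex (l1) relaxations are popular."""
    m, n = A.shape
    for k in range(n + 1):
        for support in combinations(range(n), k):
            if k == 0:
                if np.linalg.norm(y) <= tol:
                    return np.zeros(n)
                continue
            As = A[:, list(support)]
            coef, *_ = np.linalg.lstsq(As, y, rcond=None)
            if np.linalg.norm(As @ coef - y) <= tol:
                x = np.zeros(n)
                x[list(support)] = coef
                return x
    return None

rng = np.random.default_rng(2)
m, n = 5, 8
A = rng.standard_normal((m, n))     # generic (random) measurement matrix
x0 = np.zeros(n)
x0[[1, 6]] = [1.5, -2.0]            # 2-sparse ground truth
y = A @ x0                          # exact (noiseless) partial information

x_hat = l0_recover(A, y)
```

    For a generic matrix, every set of 4 columns is linearly independent, so the 2-sparse ground truth is the unique sparsest consistent solution and the search recovers it exactly; the abstract's sufficient conditions play the analogous role in the permutation setting.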