
    Quantum Experiments and Graphs: Multiparty States as coherent superpositions of Perfect Matchings

    We show a surprising link between experimental setups to realize high-dimensional multipartite quantum states and Graph Theory. In these setups, the paths of photons are identified such that the photon-source information is never created. We find that each of these setups corresponds to an undirected graph, and every undirected graph corresponds to an experimental setup. Every term in the emerging quantum superposition corresponds to a perfect matching in the graph. Calculating the final quantum state is in the complexity class #P-complete, thus cannot be done efficiently. To strengthen the link further, theorems from Graph Theory -- such as Hall's marriage problem -- are rephrased in the language of pair creation in quantum experiments. We show explicitly how this link allows us to answer questions about quantum experiments (such as which classes of entangled states can be created) with graph-theoretical methods, and potentially to simulate properties of Graphs and Networks with quantum experiments (such as critical exponents and phase transitions).
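    The graph-to-state correspondence above can be illustrated with a small sketch: each perfect matching of a graph contributes one term to the superposition, and enumerating matchings is exponential in general (counting them is #P-complete). The graph below is a hypothetical 4-cycle example, not taken from the paper.

    ```python
    def perfect_matchings(vertices, edges):
        """Enumerate all perfect matchings of an undirected graph by always
        matching the first unmatched vertex. Each matching corresponds to one
        term of the quantum superposition described in the abstract; the search
        is exponential, consistent with the #P-hardness of counting matchings."""
        if not vertices:
            yield []
            return
        v = vertices[0]
        for a, b in edges:
            if v in (a, b):
                u = b if a == v else a
                if u in vertices:
                    rest = [w for w in vertices if w not in (a, b)]
                    for m in perfect_matchings(rest, edges):
                        yield [(a, b)] + m

    # A 4-cycle has exactly two perfect matchings -> two superposition terms.
    edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
    ms = list(perfect_matchings([0, 1, 2, 3], edges))
    print(len(ms))  # 2
    ```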

    Matching with Trade-offs: Revealed Preferences over Competing Characteristics

    We investigate in this paper the theory and econometrics of optimal matchings with competing criteria. The surplus from a marriage match, for instance, may depend both on the incomes and on the educations of the partners, as well as on characteristics that the analyst does not observe. The social optimum must therefore trade off matching on incomes and matching on educations. Given a flexible specification of the surplus function, we characterize under mild assumptions the properties of the set of feasible matchings and of the socially optimal matching. Then we show how data on the covariation of the types of the partners in observed matches can be used to estimate the parameters that define social preferences over matches. We provide both nonparametric and parametric procedures that are very easy to use in applications.
    Keywords: matching, marriage, assignment.
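    The socially optimal matching described above can be sketched as a tiny assignment problem. The data and the multiplicative surplus function below are hypothetical illustrations (the paper's actual surplus specification is more general); the brute-force search over permutations simply makes the "social optimum trades off income matching against education matching" idea concrete.

    ```python
    from itertools import permutations

    # Hypothetical toy data: each agent is an (income, education) pair.
    men   = [(30, 12), (50, 16), (40, 18)]
    women = [(35, 14), (55, 16), (45, 12)]

    def surplus(m, w, a=1.0, b=1.0):
        # Assumed surplus with complementarities in income and in education;
        # a and b weight the two competing criteria.
        return a * m[0] * w[0] + b * m[1] * w[1]

    # Social optimum: the assignment maximizing total surplus (brute force, n! time;
    # real applications would use an optimal-assignment solver instead).
    best = max(permutations(range(3)),
               key=lambda p: sum(surplus(men[i], women[p[i]]) for i in range(3)))
    total = sum(surplus(men[i], women[best[i]]) for i in range(3))
    print(best, total)  # (0, 1, 2) 6240.0
    ```

    With both weights equal, positive assortative matching on income wins here even though it forces a mismatch on education for the third pair; changing a and b shifts the trade-off.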

    Computational and statistical tradeoffs via convex relaxation

    Modern massive datasets create a fundamental problem at the intersection of the computational and statistical sciences: how to provide guarantees on the quality of statistical inference given bounds on computational resources, such as time or space. Our approach to this problem is to define a notion of “algorithmic weakening,” in which a hierarchy of algorithms is ordered by both computational efficiency and statistical efficiency, allowing the growing strength of the data at scale to be traded off against the need for sophisticated processing. We illustrate this approach in the setting of denoising problems, using convex relaxation as the core inferential tool. Hierarchies of convex relaxations have been widely used in theoretical computer science to yield tractable approximation algorithms to many computationally intractable tasks. In the current paper, we show how to endow such hierarchies with a statistical characterization and thereby obtain concrete tradeoffs relating algorithmic runtime to the amount of data.
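    A minimal illustration of convex relaxation in denoising (not the paper's full hierarchy, just its canonical building block): soft-thresholding, the closed-form solution of the ℓ₁-penalized denoising problem, which is the convex relaxation of intractable ℓ₀ sparsity.

    ```python
    def soft_threshold(y, lam):
        """Coordinate-wise soft-thresholding: the closed-form minimizer of
        0.5*(x - v)**2 + lam*|x| per coordinate, i.e. denoising under the
        ell_1 convex relaxation of an ell_0 sparsity penalty."""
        out = []
        for v in y:
            mag = abs(v) - lam
            out.append(0.0 if mag <= 0 else (mag if v > 0 else -mag))
        return out

    noisy = [3.0, -0.2, 0.1, -4.0, 0.05]
    denoised = soft_threshold(noisy, 0.5)
    print(denoised)  # [2.5, 0.0, 0.0, -3.5, 0.0]
    ```

    Small (presumably noise) entries are zeroed and large entries shrunk toward zero, all in linear time; stronger relaxations in a hierarchy buy statistical efficiency at higher computational cost.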

    Inferring Rankings Using Constrained Sensing

    We consider the problem of recovering a function over the space of permutations (or, the symmetric group) over n elements from given partial information; the partial information we consider is related to the group-theoretic Fourier transform of the function. This problem naturally arises in several settings such as ranked elections, multi-object tracking, ranking systems, and recommendation systems. Inspired by the work of Donoho and Stark in the context of discrete-time functions, we focus on non-negative functions with a sparse support (support size ≪ domain size). Our recovery method is based on finding the sparsest solution (through ℓ₀ optimization) that is consistent with the available information. As the main result, we derive sufficient conditions for functions that can be recovered exactly from partial information through ℓ₀ optimization. Under a natural random model for the generation of functions, we quantify the recoverability conditions by deriving bounds on the sparsity (support size) for which the function satisfies the sufficient conditions with high probability as n → ∞. ℓ₀ optimization is computationally hard. Therefore, the popular compressive sensing literature considers solving the convex relaxation, ℓ₁ optimization, to find the sparsest solution. However, we show that ℓ₁ optimization fails to recover a function (even with constant sparsity) generated using the random model with high probability as n → ∞. In order to overcome this problem, we propose a novel iterative algorithm for the recovery of functions that satisfy the sufficient conditions. Finally, using an information-theoretic framework, we study necessary conditions for exact recovery to be possible.
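    The ℓ₀ recovery principle above can be sketched on S₃. As a stand-in for the partial Fourier information, the example uses first-order marginals (how much mass maps i to j); the sparse function and the exhaustive integer search are illustrative assumptions, not the paper's method.

    ```python
    from itertools import permutations, product

    perms = list(permutations(range(3)))  # the symmetric group S_3, 6 elements

    def marginals(f):
        """First-order marginal information M[i][j]: total mass of permutations
        mapping i to j (a coarse stand-in for partial Fourier coefficients)."""
        M = [[0] * 3 for _ in range(3)]
        for p, v in zip(perms, f):
            for i, j in enumerate(p):
                M[i][j] += v
        return M

    true_f = (2, 0, 0, 0, 3, 0)  # sparse non-negative function, support size 2
    obs = marginals(true_f)

    # ell_0-style recovery by exhaustive search over small integer-valued
    # candidates: among all functions consistent with obs, keep the sparsest.
    # Exponential cost -- exactly why relaxations and iterative methods matter.
    best = min((f for f in product(range(4), repeat=6) if marginals(f) == obs),
               key=lambda f: sum(v > 0 for v in f))
    print(best)  # (2, 0, 0, 0, 3, 0): the sparse function is recovered exactly
    ```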

    Sketching Persistence Diagrams

    Given a persistence diagram with n points, we give an algorithm that produces a sequence of n persistence diagrams converging in bottleneck distance to the input diagram, the ith of which has i distinct (weighted) points and is a 2-approximation to the closest persistence diagram with that many distinct points. For each approximation, we precompute the optimal matching between the ith and the (i+1)st. Perhaps surprisingly, the entire sequence of diagrams as well as the sequence of matchings can be represented in O(n) space. The main approach is to use a variation of the greedy permutation of the persistence diagram to give good Hausdorff approximations and assign weights to these subsets. We give a new algorithm to efficiently compute this permutation, despite the high implicit dimension of points in a persistence diagram due to the effect of the diagonal. The sketches are also structured to permit fast (linear time) approximations to the Hausdorff distance between diagrams - a lower bound on the bottleneck distance. For approximating the bottleneck distance, sketches can also be used to compute a linear-size neighborhood graph directly, obviating the need for geometric data structures used in state-of-the-art methods for bottleneck computation.
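    The greedy permutation mentioned above can be sketched for plain Euclidean points: repeatedly take the point farthest from everything chosen so far, so each prefix is a good Hausdorff approximation of the whole set. This minimal version omits the paper's handling of the persistence-diagram diagonal; the sample points are hypothetical.

    ```python
    def greedy_permutation(points):
        """Farthest-point (greedy) ordering of 2D points. Each prefix of the
        returned order approximates the full set well in Hausdorff distance.
        O(n^2) time via incremental nearest-sample distances."""
        d = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
        order = [0]                                  # start from point 0
        dist = [d(points[0], p) for p in points]     # distance to chosen set
        while len(order) < len(points):
            nxt = max(range(len(points)), key=lambda i: dist[i])
            order.append(nxt)
            dist = [min(dist[i], d(points[nxt], points[i]))
                    for i in range(len(points))]
        return order

    pts = [(0, 0), (10, 0), (0.5, 0.2), (5, 5)]
    print(greedy_permutation(pts))  # [0, 1, 3, 2]
    ```

    The near-duplicate point (0.5, 0.2) is deliberately chosen last: dropping it from a prefix barely changes the Hausdorff distance, which is what makes prefixes good sketches.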