385 research outputs found

    Error-Tolerant Exact Query Learning of Finite Set Partitions with Same-Cluster Oracle

    This paper initiates the study of active learning for exact recovery of partitions exclusively through access to a same-cluster oracle in the presence of bounded adversarial error. We first highlight a novel connection between learning partitions and correlation clustering. Then we use this connection to build a Rényi-Ulam style analytical framework for this problem, and prove upper and lower bounds on its worst-case query complexity. Further, we bound the expected performance of a relevant randomized algorithm. Finally, we study the relationship between adaptivity and query complexity for this problem and related variants. Comment: 28 pages, 2 figures.
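    For readers unfamiliar with the query model, the following is a minimal sketch of exact partition recovery from a noiseless same-cluster oracle (the assumed function same_cluster below). It is an illustrative baseline that spends one query per existing cluster for each new element, not the paper's error-tolerant algorithm:

        def recover_partition(elements, same_cluster):
            # Recover a partition of `elements` using only pairwise same-cluster queries.
            # Assumes the oracle answers truthfully (no adversarial error).
            clusters = []  # each cluster is a list; its first member is its representative
            for x in elements:
                for cluster in clusters:
                    if same_cluster(x, cluster[0]):  # one query per existing cluster
                        cluster.append(x)
                        break
                else:
                    clusters.append([x])             # x opens a new cluster
            return clusters

        # Example: ground-truth partition {0,1,2 | 3,4 | 5}
        truth = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 2}
        print(recover_partition(range(6), lambda a, b: truth[a] == truth[b]))
        # -> [[0, 1, 2], [3, 4], [5]]

    With k clusters this uses at most k queries per element, so O(nk) queries in total; the paper's framework concerns how such guarantees degrade and can be protected when a bounded number of oracle answers may be adversarially wrong.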

    Metric mean dimension and analog compression

    Wu and Verdú developed a theory of almost lossless analog compression, where one imposes various regularity conditions on the compressor and the decompressor, with the input signal being modelled by a (typically infinite-entropy) stationary stochastic process. In this work we consider all stationary stochastic processes with trajectories in a prescribed set of (bi-)infinite sequences and find uniform lower and upper bounds for certain compression rates in terms of metric mean dimension and mean box dimension. An essential tool is the recent Lindenstrauss-Tsukamoto variational principle expressing metric mean dimension in terms of rate-distortion functions. We also obtain lower bounds on compression rates for a fixed stationary process in terms of the rate-distortion dimension rates and study several examples. Comment: v3: Accepted for publication in IEEE Transactions on Information Theory. Additional examples were added. Material has been reorganized (with some parts removed). Minor mistakes were corrected.
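    For orientation, the Lindenstrauss-Tsukamoto variational principle referenced above has, schematically, the following form (a paraphrase rather than the paper's exact statement; R_\mu(\varepsilon) denotes the rate-distortion function of a T-invariant measure \mu):

        \overline{\mathrm{mdim}}_{\mathrm{M}}(\mathcal{X}, T, d)
            \;=\; \limsup_{\varepsilon \to 0}
            \frac{\sup_{\mu \in M_T(\mathcal{X})} R_\mu(\varepsilon)}{\log(1/\varepsilon)}

    That is, upper metric mean dimension is recovered from rate-distortion functions in the low-distortion limit, and an analogous lower version uses a liminf; it is this bridge between dynamics and rate-distortion theory that makes the principle useful for the compression-rate bounds in this work.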

    Learning Graph Parameters from Linear Measurements: Fundamental Trade-offs and Application to Electric Grids

    We consider a specific graph learning task: reconstructing a symmetric matrix that represents an underlying graph using linear measurements. We study fundamental trade-offs between the number of measurements (sample complexity), the complexity of the graph class, and the probability of error by first deriving a necessary condition (fundamental limit) on the number of measurements. Then, by considering a two-stage recovery scheme, we give a sufficient condition for recovery. In the special cases of the uniform distribution on trees with n nodes and the Erdős-Rényi (n, p) class, the sample complexity derived from the fundamental trade-offs is tight up to multiplicative factors. In addition, we design and implement a polynomial-time (in n) algorithm based on the two-stage recovery scheme. Simulations for several canonical graph classes and IEEE power system test cases demonstrate the effectiveness of the proposed algorithm for accurate topology and parameter recovery.
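    As a toy illustration of the measurement model only (a generic least-squares sketch, not the authors' two-stage recovery scheme), a symmetric matrix can be recovered from sufficiently many generic linear measurements y_i = <A_i, X> by solving for its upper-triangular entries:

        import numpy as np

        def recover_symmetric(measurements, sensing_matrices, n):
            # Recover a symmetric n x n matrix X from measurements y_i = <A_i, X>.
            # Unknowns are the n(n+1)/2 entries on and above the diagonal.
            idx = np.triu_indices(n)
            rows = []
            for A in sensing_matrices:
                S = A + A.T                   # each off-diagonal unknown X_ij appears twice in <A, X>
                S[np.diag_indices(n)] /= 2.0  # each diagonal unknown appears once
                rows.append(S[idx])
            coeffs, *_ = np.linalg.lstsq(np.array(rows), np.array(measurements), rcond=None)
            X = np.zeros((n, n))
            X[idx] = coeffs
            return X + np.triu(X, 1).T        # mirror the strict upper triangle

        # Example: Laplacian of the 3-node path graph, random Gaussian sensing matrices
        rng = np.random.default_rng(0)
        L = np.array([[1.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 1.0]])
        sensing = [rng.standard_normal((3, 3)) for _ in range(12)]
        y = [float(np.sum(A * L)) for A in sensing]
        print(np.round(recover_symmetric(y, sensing, 3), 3))

    With n(n+1)/2 or more generic measurements the linear system above is determined; the trade-offs studied in the paper concern how far below this generic count the sample complexity can drop when the matrix is known to come from a structured graph class such as trees or Erdős-Rényi graphs.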
