
    Relating Description Complexity to Entropy

    We demonstrate some novel links between entropy and description complexity, the latter referring to the minimal length of a formula specifying a given property. Let MLU be the logic obtained by extending propositional logic with the universal modality, and let GMLU be the corresponding extension with the ability to count. In the finite, MLU is expressively complete for specifying sets of variable assignments, while GMLU is expressively complete for multisets. We show that for MLU, the model classes with maximal Boltzmann entropy are the ones with maximal description complexity. Concerning GMLU, we show that expected Boltzmann entropy is asymptotically equivalent to expected description complexity multiplied by the number of proposition symbols considered. To contrast these results, we prove that this link breaks when we move to first-order logic FO over vocabularies with higher-arity relations. To establish this, we show that almost all finite models require relatively large FO-formulas to define them. Our results relate to known links between Kolmogorov complexity and entropy, demonstrating a way to conceive such results in the logic-based scenario where relational structures are classified by formulas of different sizes.
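
    A hands-on illustration of the setting of the first result, not taken from the paper itself: over n proposition symbols there are 2^n variable assignments, a model can be identified with the set of assignments it realizes, and the Boltzmann entropy of a property is the base-2 logarithm of the number of models satisfying it. The Python sketch below brute-forces this for a tiny, hypothetical MLU-definable property; all names are illustrative.

        from itertools import chain, combinations, product
        from math import log2

        n = 2  # proposition symbols p0, p1; kept tiny so brute force stays feasible
        assignments = list(product([False, True], repeat=n))  # all 2^n assignments

        def powerset(xs):
            return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

        # Up to the relevant equivalence, an MLU model is the set of assignments it realizes.
        models = [frozenset(s) for s in powerset(assignments)]

        # Hypothetical property "some assignment makes p0 true" (an MLU-definable property).
        satisfying = [M for M in models if any(a[0] for a in M)]

        # Boltzmann entropy of the property = log2 of the number of models satisfying it.
        print(len(satisfying), log2(len(satisfying)))  # 12 models, about 3.58 bits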

    Completion of Matrices with Low Description Complexity

    We propose a theory for matrix completion that goes beyond the low-rank structure commonly considered in the literature and applies to general matrices of low description complexity. Specifically, complexity of the sets of matrices encompassed by the theory is measured in terms of Hausdorff and upper Minkowski dimensions. Our goal is the characterization of the number of linear measurements, with an emphasis on rank-1 measurements, needed for the existence of an algorithm that yields reconstruction, either perfect, with probability 1, or with arbitrarily small probability of error, depending on the setup. Concretely, we show that matrices taken from a set $\mathcal{U}$ such that $\mathcal{U}-\mathcal{U}$ has Hausdorff dimension $s$ can be recovered from $k > s$ measurements, and random matrices supported on a set $\mathcal{U}$ of Hausdorff dimension $s$ can be recovered with probability 1 from $k > s$ measurements. What is more, we establish the existence of recovery mappings that are robust against additive perturbations or noise in the measurements. Concretely, we show that there are $\beta$-Hölder continuous mappings recovering matrices taken from a set of upper Minkowski dimension $s$ from $k > 2s/(1-\beta)$ measurements and, with arbitrarily small probability of error, random matrices supported on a set of upper Minkowski dimension $s$ from $k > s/(1-\beta)$ measurements. The numerous concrete examples we consider include low-rank matrices, sparse matrices, QR decompositions with sparse R-components, and matrices of fractal nature.
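
    The existence results above are not algorithmic, but the measurement model is easy to make concrete. Below is a minimal numerical sketch, assuming Gaussian rank-one measurements y_i = u_i^T X v_i and using nuclear-norm minimization, a standard convex recovery heuristic that is not the recovery mapping constructed in the paper; it requires numpy and cvxpy, and all dimensions are arbitrary choices.

        import numpy as np
        import cvxpy as cp

        rng = np.random.default_rng(0)
        m, n, r, k = 8, 8, 1, 40  # rank-1 matrices in R^{8x8} form a set of dimension m + n - 1 = 15

        # Ground-truth low-rank matrix and k random rank-one measurements y_i = u_i^T X v_i.
        X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
        U = rng.standard_normal((k, m))
        V = rng.standard_normal((k, n))
        y = np.array([U[i] @ X_true @ V[i] for i in range(k)])

        # Nuclear-norm minimization subject to measurement consistency.
        X = cp.Variable((m, n))
        constraints = [U[i] @ X @ V[i] == y[i] for i in range(k)]
        cp.Problem(cp.Minimize(cp.normNuc(X)), constraints).solve()
        print(np.linalg.norm(X.value - X_true) / np.linalg.norm(X_true))  # small when k is large enough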

    Minimum description complexity

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2002. Includes bibliographical references (p. 136-140).
    The classical problem of model selection among parametric model sets is considered. The goal is to choose the model set which best represents observed data. The critical task is the choice of a criterion for model set comparison. Pioneering information-theoretic approaches to this problem are the Akaike information criterion (AIC) and different forms of minimum description length (MDL). The prior assumption in these methods is that the unknown true model is a member of all the competing sets. We introduce a new method of model selection: minimum description complexity (MDC). The approach is motivated by the Kullback-Leibler information distance. The method suggests choosing the model set for which the model set relative entropy is minimum. We provide a probabilistic method of MDC estimation for a class of parametric model sets. In this calculation the key factor is our prior assumption: unlike the existing methods, no assumption that the true model is a member of the competing model sets is needed. The main strength of the MDC calculation is in its method of extracting information from the observed data. Interesting results exhibit the advantages of MDC over MDL and AIC both theoretically and practically. It is illustrated that, under particular conditions, AIC is a special case of MDC. Application of MDC in system identification and signal denoising is investigated. The proposed method answers the challenging question of quality evaluation in identification of stable LTI systems under a fair prior assumption on the unmodeled dynamics. MDC also provides a new solution to a class of denoising problems. We elaborate on the theoretical superiority of MDC over existing thresholding denoising methods.
    by Soosan Beheshti, Ph.D.
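
    The abstract does not spell out the MDC criterion itself, but its baseline AIC has a standard closed form for Gaussian least-squares fits, AIC = n log(RSS/n) + 2k. As a hedged illustration only, the sketch below applies that baseline to nested polynomial models on invented data; MDC itself is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(1)
        x = np.linspace(0, 1, 200)
        y = 1.0 + 2.0 * x + 0.3 * rng.standard_normal(x.size)  # true model: degree 1

        # AIC for least-squares fits: n * log(RSS / n) + 2k, with k counting the
        # polynomial coefficients plus the noise variance.
        for deg in range(5):
            rss = np.sum((np.polyval(np.polyfit(x, y, deg), x) - y) ** 2)
            aic = x.size * np.log(rss / x.size) + 2 * (deg + 2)
            print(deg, round(aic, 1))  # typically minimized at deg = 1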

    Weak 1/r-Nets for Moving Points

    In this paper, we extend the weak 1/r-net theorem to a kinetic setting where the underlying set of points is moving polynomially with bounded description complexity. We establish that one can find a kinetic analog $N$ of a weak 1/r-net of cardinality $O(r^{d(d+1)/2} \log^d r)$ whose points move with coordinates that are rational functions of bounded description complexity. Moreover, each member of $N$ has one polynomial coordinate.
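
    For reference, the static notion being extended: N is a weak 1/r-net for a point set P if every convex set containing at least |P|/r points of P also contains a point of N. The sketch below is not the paper's kinetic construction; it merely spot-checks this definition on random planar instances, with test convex sets sampled as hulls of random subsets, and assumes numpy and scipy.

        import numpy as np
        from scipy.spatial import Delaunay

        rng = np.random.default_rng(2)
        P = rng.random((200, 2))   # static point set in the unit square
        r = 4
        N = rng.random((50, 2))    # naive candidate net, purely for illustration

        def inside(hull_points, queries):
            # True for each query point lying in the convex hull of hull_points.
            return Delaunay(hull_points).find_simplex(queries) >= 0

        ok = True
        for _ in range(1000):
            S = P[rng.choice(len(P), 20, replace=False)]  # random convex test set
            if inside(S, P).sum() >= len(P) / r and not inside(S, N).any():
                ok = False  # found a heavy convex set that N misses
        print(ok)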

    Combinatorial complexity in o-minimal geometry

    In this paper we prove tight bounds on the combinatorial and topological complexity of sets defined in terms of $n$ definable sets belonging to some fixed definable family of sets in an o-minimal structure. This generalizes the combinatorial parts of similar bounds known in the case of semi-algebraic and semi-Pfaffian sets, and as a result vastly increases the applicability of results on the combinatorial and topological complexity of arrangements studied in discrete and computational geometry. As a sample application, we extend a Ramsey-type theorem due to Alon et al., originally proved for semi-algebraic sets of fixed description complexity, to this more general setting. Comment: 25 pages. Revised version. To appear in the Proc. London Math. Soc.

    Around Kolmogorov complexity: basic notions and results

    Full text link
    Algorithmic information theory studies description complexity and randomness and is now a well-known field of theoretical computer science and mathematical logic. There are several textbooks and monographs devoted to this theory where one can find detailed expositions of many difficult results as well as historical references. However, a short survey of its basic notions and of the main results relating these notions to each other seems to be missing. This report attempts to fill this gap and covers the basic notions of algorithmic information theory: Kolmogorov complexity (plain, conditional, prefix), Solomonoff's universal a priori probability, notions of randomness (Martin-Löf randomness, Mises-Church randomness), and effective Hausdorff dimension. We prove their basic properties (symmetry of information, the connection between a priori probability and prefix complexity, the criterion of randomness in terms of complexity, the complexity characterization of effective dimension) and show some applications (the incompressibility method in computational complexity theory, incompleteness theorems). The report is based on the lecture notes of a course given by the author at Uppsala University.
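
    The incompressibility method mentioned at the end rests on a counting fact worth making explicit: there are fewer than 2^(n-c) binary descriptions of length below n - c, so less than a 2^(-c) fraction of n-bit strings can be compressed by c or more bits. A few lines of Python confirm the bound numerically:

        n = 20
        for c in range(1, 6):
            num_short = 2 ** (n - c) - 1   # descriptions of length < n - c
            frac = num_short / 2 ** n      # fraction of n-bit strings they can name
            print(c, frac, 2.0 ** -c)      # frac stays strictly below 2^(-c)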