    Characterization theorem for the conditionally computable real functions

    The class of uniformly computable real functions with respect to a small subrecursive class of operators computes the elementary functions of calculus, restricted to compact subsets of their domains. The class of conditionally computable real functions with respect to the same class of operators is a proper extension of the class of uniformly computable real functions, and it computes the elementary functions of calculus on their whole domains. The definitions of both classes rely on certain transformations of infinitistic names of real numbers. In the present paper, the conditional computability of real functions is characterized in the spirit of Tent and Ziegler, avoiding the use of infinitistic names.
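
    To make "uniformly computable on a compact set" concrete, here is a minimal sketch in the style of computable analysis, assuming the usual convention that a name of a real x is a function m -> q returning a rational with |x - q| <= 2^-m; the choice of exp on [-1, 1] and all identifiers are illustrative, not taken from the paper.

        from fractions import Fraction

        def exp_at_rational(q: Fraction, n: int) -> Fraction:
            """Taylor sum for exp(q), |q| <= 1, accurate to within 2**-(n+2).

            For |q| <= 1 and k >= 1 the term ratio |q|/(k+1) is at most 1/2,
            so the tail starting at term k is at most twice that term.
            """
            err = Fraction(1, 2 ** (n + 2))
            total = term = Fraction(1)           # term k = 0
            k = 0
            while True:
                k += 1
                term = term * q / k              # term k = q**k / k!
                if 2 * abs(term) <= err:         # remaining tail <= 2*|term| <= err
                    return total
                total += term

        def exp_name(x_name, n: int) -> Fraction:
            """From a name of x in [-1, 1], compute a rational within 2**-n of exp(x)."""
            q = x_name(n + 4)
            q = max(Fraction(-1), min(Fraction(1), q))   # clamping cannot move q away from x
            # |exp(x) - exp(q)| <= e * 2**-(n+4), and the Taylor error is at most
            # 2**-(n+2), so the total error stays below 2**-n.
            return exp_at_rational(q, n)

        # Usage: the constant function m -> 1/2 is an (exact) name of 1/2.
        print(float(exp_name(lambda m: Fraction(1, 2), 20)))   # ~1.6487213 = exp(0.5)

    The compactness of the domain is what supplies the uniform bounds (on |q| and on the derivative) used in the error analysis; no single such bound exists on the whole real line, which is where the conditional notion becomes relevant.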

    Computable de Finetti measures

    We prove a computable version of de Finetti's theorem on exchangeable sequences of real random variables. As a consequence, exchangeable stochastic processes expressed in probabilistic functional programming languages can be automatically rewritten as procedures that do not modify non-local state. Along the way, we prove that a distribution on the unit interval is computable if and only if its moments are uniformly computable.
    Comment: 32 pages. Final journal version; expanded somewhat, with minor corrections. To appear in Annals of Pure and Applied Logic. Extended abstract appeared in Proceedings of CiE '09, LNCS 5635, pp. 218-23
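
    A classic instance of the rewriting described above, sketched here with identifiers of our own choosing rather than the paper's construction: the Pólya urn is a standard exchangeable process whose de Finetti measure is Beta(1,1), i.e. Uniform(0,1). The urn sampler mutates shared state between draws, while the equivalent de Finetti form samples a latent parameter once and then draws i.i.d.

        import random

        def polya_urn(n: int, rng: random.Random) -> list[int]:
            """Exchangeable 0/1 sequence from a Pólya urn; each draw mutates the urn."""
            urn = [0, 1]                      # one ball of each color
            draws = []
            for _ in range(n):
                ball = rng.choice(urn)
                draws.append(ball)
                urn.append(ball)              # reinforcement: non-local state
            return draws

        def de_finetti_form(n: int, rng: random.Random) -> list[int]:
            """Same distribution, with no state shared between draws:
            sample p ~ Uniform(0,1) once, then draw i.i.d. Bernoulli(p)."""
            p = rng.random()
            return [int(rng.random() < p) for _ in range(n)]

    Under either sampler the number of 1s among n draws is uniform on {0, ..., n}; the second, stateless form is what a computable de Finetti theorem lets one extract automatically.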

    Hierarchical Models as Marginals of Hierarchical Models

    We investigate the representation of hierarchical models in terms of marginals of other hierarchical models with smaller interactions. We focus on binary variables and marginals of pairwise interaction models whose hidden variables are conditionally independent given the visible variables. In this case the problem is equivalent to the representation of linear subspaces of polynomials by feedforward neural networks with soft-plus computational units. We show that every hidden variable can freely model multiple interactions among the visible variables, which allows us to generalize and improve previous results. In particular, we show that a restricted Boltzmann machine with fewer than $[2(\log(v)+1)/(v+1)]\, 2^v - 1$ hidden binary variables can approximate every distribution of $v$ visible binary variables arbitrarily well, compared to the $2^{v-1}-1$ of the best previously known result.
    Comment: 18 pages, 4 figures, 2 tables, WUPES'1
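
    A quick numeric comparison of the two hidden-unit counts, as a sketch only: the abstract does not fix the base of the logarithm, so base 2 is an assumption here.

        import math

        def new_bound(v: int) -> float:
            """[2(log2(v) + 1) / (v + 1)] * 2**v - 1 hidden units suffice."""
            return 2 * (math.log2(v) + 1) / (v + 1) * 2 ** v - 1

        def old_bound(v: int) -> int:
            """Previously known sufficient count: 2**(v-1) - 1."""
            return 2 ** (v - 1) - 1

        for v in (16, 32, 64, 128):
            print(f"v={v:3d}  new ~ {new_bound(v):.3e}  old = {old_bound(v):.3e}")

    Under the base-2 reading, the ratio of the new bound to the old one is about 4(log2(v) + 1)/(v + 1), so the new bound overtakes the old once v exceeds roughly 4 log2(v) + 3 (around v = 21), and the saving grows with v thereafter.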

    Algorithmic statistics: forty years later

    Algorithmic statistics has two different (and almost orthogonal) motivations. From the philosophical point of view, it tries to formalize how statistics works and why some statistical models are better than others. After this notion of a "good model" is introduced, a natural question arises: is it possible that for some piece of data there is no good model? If yes, how often do these bad ("non-stochastic") data appear "in real life"? Another, more technical motivation comes from algorithmic information theory. In this theory a notion of complexity of a finite object (= the amount of information in this object) is introduced; it assigns to every object some number, called its algorithmic complexity (or Kolmogorov complexity). Algorithmic statistics provides a more fine-grained classification: for each finite object some curve is defined that characterizes its behavior. It turns out that several different definitions give (approximately) the same curve. In this survey we try to provide an exposition of the main results in the field (including full proofs for the most important ones), as well as some historical comments. We assume that the reader is familiar with the main notions of algorithmic information theory (Kolmogorov complexity).
    Comment: Missing proofs added
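
    For reference, the curve mentioned above is usually presented as Kolmogorov's structure function; the formulation below is a common convention rather than a quotation from the survey. For a string x and a complexity budget \alpha,

        % best log-cardinality of a finite model S containing x,
        % among models of Kolmogorov complexity at most alpha
        h_x(\alpha) = \min \bigl\{ \log_2 |S| \,:\, x \in S,\ K(S) \le \alpha \bigr\}

    A finite set S containing x is a good model when the two-part description is nearly optimal, i.e. when K(S) + \log_2 |S| is close to K(x); a string is "non-stochastic" roughly when every simple model overshoots this optimum substantially.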