
    Infinite time decidable equivalence relation theory

    We introduce an analog of the theory of Borel equivalence relations in which we study equivalence relations that are decidable by an infinite time Turing machine. The Borel reductions are replaced by the more general class of infinite time computable functions. Many basic aspects of the classical theory remain intact, with the added bonus that it becomes sensible to study some special equivalence relations whose complexity is beyond Borel or even analytic. We also introduce an infinite time generalization of the countable Borel equivalence relations, a key subclass of the Borel equivalence relations, and again show that several key properties carry over to the larger class. Lastly, we collect together several results from the literature regarding Borel reducibility which apply also to absolutely Delta^1_2 reductions, and hence to the infinite time computable reductions. Comment: 30 pages, 3 figures
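
    For orientation, the reduction notion the abstract builds on is the standard one (a sketch of the usual definitions; the paper's formal setup may differ in detail). For equivalence relations E on X and F on Y,

        E \le_B F \iff \exists \text{ Borel } f : X \to Y \text{ such that } x \mathrel{E} y \Leftrightarrow f(x) \mathrel{F} f(y),

    and the infinite time computable reducibility is obtained by letting f range over the infinite time computable functions instead of the Borel ones, as described above.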

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed \emph{by humans} because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to \emph{think} as \emph{humans} do when faced with uncertainty is challenging in several major ways: (1) Finding optimal statistical models remains to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations and a limited description of the distribution of input random variables. (2) The space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification and Information Based Complexity. Comment: 37 pages
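
    For reference, the "Wald" of the title points to statistical decision theory; a standard (hedged, not verbatim from the paper) way to phrase the optimal-estimator problem it targets is the minimax formulation

        \theta^* \in \arg\min_{\theta \in \Theta} \; \sup_{\mu \in \mathcal{A}} \; \mathbb{E}_{X \sim \mu}\,[\, L(\mu, \theta(X)) \,],

    where \mathcal{A} is the (typically infinite dimensional) space of admissible scenarios consistent with the available data, assumptions and beliefs, \Theta is a class of estimators/models, and L is a loss; the program sketched in the abstract is about making the search over \Theta and \mathcal{A} amenable to discrete, finite computation.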

    Time and spectral domain relative entropy: A new approach to multivariate spectral estimation

    The concept of spectral relative entropy rate is introduced for jointly stationary Gaussian processes. Using classical information-theoretic results, we establish a remarkable connection between time and spectral domain relative entropy rates. This naturally leads to a new spectral estimation technique where a multivariate version of the Itakura-Saito distance is employed. It may be viewed as an extension of the approach, called THREE, introduced by Byrnes, Georgiou and Lindquist in 2000, which, in turn, followed in the footsteps of the Burg-Jaynes Maximum Entropy Method. Spectral estimation is here recast in the form of a constrained spectrum approximation problem where the distance is equal to the processes' relative entropy rate. The corresponding solution entails a complexity upper bound which improves on the one so far available in the multichannel framework. Indeed, it is equal to the one featured by THREE in the scalar case. The solution is computed via a globally convergent matricial Newton-type algorithm. Simulations suggest the effectiveness of the new technique in tackling multivariate spectral estimation tasks, especially in the case of short data records. Comment: 32 pages, submitted for publication
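
    For context, the scalar Itakura-Saito distance between power spectral densities \Phi and \Psi (a classical form; the paper works with a multivariate generalization) is

        d_{IS}(\Phi, \Psi) = \frac{1}{2\pi} \int_{-\pi}^{\pi} \left( \frac{\Phi(e^{j\theta})}{\Psi(e^{j\theta})} - \log \frac{\Phi(e^{j\theta})}{\Psi(e^{j\theta})} - 1 \right) d\theta,

    and for stationary Gaussian processes this quantity is closely tied to the relative entropy rate between the processes with those spectra, which is the connection the abstract exploits.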

    Computational Complexity of the Interleaving Distance

    The interleaving distance is arguably the most prominent distance measure in topological data analysis. In this paper, we provide bounds on the computational complexity of determining the interleaving distance in several settings. We show that the interleaving distance is NP-hard to compute for persistence modules valued in the category of vector spaces. In the specific setting of multidimensional persistent homology we show that the problem is at least as hard as a matrix invertibility problem. Furthermore, this allows us to conclude that the interleaving distance of interval decomposable modules depends on the characteristic of the field. Persistence modules valued in the category of sets are also studied. As a corollary, we obtain that the isomorphism problem for Reeb graphs is graph isomorphism complete. Comment: Discussion related to the characteristic of the field added. Paper accepted to the 34th International Symposium on Computational Geometry
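
    For reference, the distance in question is usually defined as follows (standard definitions; the notation here is not taken from the paper). Persistence modules M and N are \varepsilon-interleaved if there exist morphisms \phi : M \to N(\varepsilon) and \psi : N \to M(\varepsilon) whose composites agree with the internal 2\varepsilon-shift maps of M and N, and

        d_I(M, N) = \inf \{ \varepsilon \ge 0 : M \text{ and } N \text{ are } \varepsilon\text{-interleaved} \};

    the hardness results above concern computing, or deciding bounds on, this infimum for the various target categories.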