
    Bayesian definition of random sequences with respect to conditional probabilities

    We study Martin-Löf random (ML-random) points with respect to computable probability measures on sample and parameter spaces (Bayes models). We consider four variants of conditional random sequences with respect to the conditional distributions: two of them are defined by ML-randomness on Bayes models, and the other two by blind tests for conditional distributions. We consider a weak criterion for conditional ML-randomness and show that only the variants defined by ML-randomness on Bayes models satisfy it. We show that the four variants of conditional randomness coincide when the conditional probability measure is computable and the posterior distribution converges weakly to almost all parameters. We also compare ML-randomness on Bayes models with randomness for uniformly computable parametric models. It is known that two computable probability measures are orthogonal if and only if their sets of ML-random sequences are disjoint; we extend this result to uniformly computable parametric models. Finally, we present an algorithmic solution to a classical problem in Bayesian statistics: the posterior distributions converge weakly to almost all parameters if and only if they converge weakly to all ML-random parameters.
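
    For orientation, the orthogonality criterion quoted above can be written schematically as follows, where $\mathrm{ML}(P)$ denotes the set of ML-random sequences with respect to a computable measure $P$ (the notation is mine, not the paper's):

        P \perp Q \iff \mathrm{ML}(P) \cap \mathrm{ML}(Q) = \emptyset.

    The final equivalence is the corresponding algorithmic reading of posterior consistency: weak convergence of the posterior at almost every parameter (with respect to the prior) is equivalent to weak convergence at every ML-random parameter.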

    Computational limits to nonparametric estimation for ergodic processes

    We show a new negative result for nonparametric estimation of binary ergodic processes. We study the problem of estimating the distribution of a process to any given degree of accuracy, and show that for any countable class of estimators there is a zero-entropy binary ergodic process whose distribution is not estimated consistently by any estimator in the class. This result is different in nature from other negative results for universal forecasting schemes for ergodic processes.
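
    A hedged formalization of the negative result, in symbols of my own choosing (the paper's exact notion of consistency is not reproduced here): write $f_k : \{0,1\}^* \to \mathcal{P}(\{0,1\}^{\mathbb{N}})$ for an estimator mapping finite samples to candidate process distributions, and $h(\mu)$ for the entropy rate of $\mu$. Then the claim reads, schematically,

        \forall \{f_k\}_{k \in \mathbb{N}} \ \exists \mu \ \text{ergodic with } h(\mu) = 0 \ \ \forall k : \ f_k \ \text{is not consistent for } \mu.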

    Algorithmic randomness and monotone complexity on product space

    We study algorithmic randomness and monotone complexity on the product of the set of infinite binary sequences with itself. We explore the following problems: monotone complexity on product spaces, van Lambalgen's theorem for correlated probability, classification of random sets by likelihood-ratio tests, decomposition of complexity and independence, and Bayesian statistics for individual random sequences. Previously, van Lambalgen's theorem for correlated probability was shown under a uniform computability assumption in [H. Takahashi, Inform. Comput. 2008]. In this paper we prove the theorem without that assumption.
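
    For context, the classical (independent) case of van Lambalgen's theorem can be sketched in product-space form, with $U$ the uniform measure on $\{0,1\}^\infty$ and $\mathrm{ML}^{x}$ denoting ML-randomness relative to the oracle $x$ (notation assumed, not taken from the paper):

        (x, y) \in \mathrm{ML}(U \times U) \iff x \in \mathrm{ML}(U) \ \text{and} \ y \in \mathrm{ML}^{x}(U).

    The correlated version discussed above replaces $U \times U$ by an arbitrary computable measure on the product space and, roughly, the second factor by randomness with respect to the conditional distribution given $x$; the abstract states that the 2008 proof needed a uniform computability assumption which is removed here.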

    Algorithmic randomness and stochastic selection function

    We show algorithmic-randomness versions of two classical theorems on subsequences of normal numbers. One is the Kamae-Weiss theorem (Kamae 1973), which characterizes the selection functions that preserve normality. The other is the Steinhaus (1922) theorem, which characterizes the normality of a number in terms of its subsequences. In van Lambalgen (1987), an algorithmic analogue of the Kamae-Weiss theorem was conjectured in terms of algorithmic randomness and complexity. In this paper we consider two types of algorithmically random sequences: ML-random sequences and sequences of maximal complexity rate. We then prove algorithmic-randomness versions of the corresponding classical theorems.
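
    To fix terminology, one common formalization of selection, assumed here rather than quoted from the paper, selects bits of $x$ along a fixed $\{0,1\}$-valued sequence $t$:

        S_t(x) = x_{n_1} x_{n_2} \cdots, \quad \text{where } n_1 < n_2 < \cdots \ \text{enumerate} \ \{\, n \ge 1 : t_n = 1 \,\}.

    In this language the Kamae-Weiss theorem characterizes the selections under which normality of $x$ implies normality of $S_t(x)$, and the results above study the analogous questions with normality replaced by ML-randomness or by maximal complexity rate.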

    Some explicit formulae for the distributions of words (Probability Symposium)

    Parts of the paper have been presented in [23, 24].