
    Entropy Concentration and the Empirical Coding Game

    We give a characterization of Maximum Entropy/Minimum Relative Entropy inference by providing two `strong entropy concentration' theorems. These theorems unify and generalize Jaynes' `concentration phenomenon' and Van Campenhout and Cover's `conditional limit theorem'. The theorems characterize exactly in what sense a prior distribution Q conditioned on a given constraint and the distribution P minimizing the relative entropy D(P||Q) over all distributions satisfying the constraint are `close' to each other. We then apply our theorems to establish the relationship between entropy concentration and a game-theoretic characterization of Maximum Entropy inference due to Topsoe and others. Comment: A somewhat modified version of this paper was published in Statistica Neerlandica 62(3), pages 374-392, 2008.
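    A minimal numerical sketch (not taken from the paper) of the minimum-relative-entropy object it studies: on a finite alphabet, the distribution P minimizing D(P||Q) subject to a mean constraint is an exponential tilting of the prior Q, with the tilting parameter found here by bisection. The die example with mean 4.5 is Jaynes' classic illustration; the function and variable names are our own.

    ```python
    import numpy as np

    def min_relative_entropy(q, values, target_mean, lo=-50.0, hi=50.0):
        """Minimize D(P||Q) over distributions P on `values` with E_P[X] = target_mean.
        The minimizer has the form P(x) proportional to Q(x) * exp(lam * x);
        lam is found by bisection, since the tilted mean is increasing in lam."""
        def mean_at(lam):
            w = q * np.exp(lam * values)
            return (w / w.sum()) @ values
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if mean_at(mid) < target_mean:
                lo = mid
            else:
                hi = mid
        lam = 0.5 * (lo + hi)
        w = q * np.exp(lam * values)
        return w / w.sum()

    # Jaynes' die example: uniform prior on faces {1..6}, constrain the mean to 4.5
    values = np.arange(1, 7, dtype=float)
    q = np.full(6, 1 / 6)
    p = min_relative_entropy(q, values, 4.5)
    ```

    With a uniform prior this reduces to ordinary Maximum Entropy; the resulting p puts exponentially increasing weight on larger faces so that the constrained mean is met.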

    Predictability, complexity and learning

    We define {\em predictive information} $I_{\rm pred}(T)$ as the mutual information between the past and the future of a time series. Three qualitatively different behaviors are found in the limit of large observation times $T$: $I_{\rm pred}(T)$ can remain finite, grow logarithmically, or grow as a fractional power law. If the time series allows us to learn a model with a finite number of parameters, then $I_{\rm pred}(T)$ grows logarithmically with a coefficient that counts the dimensionality of the model space. In contrast, power-law growth is associated, for example, with the learning of infinite-parameter (or nonparametric) models such as continuous functions with smoothness constraints. There are connections between the predictive information and measures of complexity that have been defined both in learning theory and in the analysis of physical systems through statistical mechanics and dynamical systems theory. Further, in the same way that entropy provides the unique measure of available information consistent with some simple and plausible conditions, we argue that the divergent part of $I_{\rm pred}(T)$ provides the unique measure for the complexity of dynamics underlying a time series. Finally, we discuss how these ideas may be useful in different problems in physics, statistics, and biology. Comment: 53 pages, 3 figures, 98 references, LaTeX2e
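    A small worked illustration (our own, not from the paper) of the "remains finite" case: for a first-order Markov chain the past and future are conditionally independent given the present, so the predictive information saturates at $I(X_0; X_1)$, which can be computed directly from the stationary joint distribution of consecutive states. The flip probabilities below are arbitrary example values.

    ```python
    import numpy as np

    def mutual_information(joint):
        """I(X;Y) in nats, from a joint probability matrix."""
        px = joint.sum(axis=1, keepdims=True)
        py = joint.sum(axis=0, keepdims=True)
        mask = joint > 0
        return float(np.sum(joint[mask] * np.log(joint[mask] / (px @ py)[mask])))

    # Two-state Markov chain: X_0 is a sufficient statistic of the past for the
    # future, so I_pred(T) -> I(X_0; X_1), a finite limit, as T grows.
    a, b = 0.1, 0.3                      # flip probabilities (assumed example)
    P = np.array([[1 - a, a], [b, 1 - b]])
    pi = np.array([b, a]) / (a + b)      # stationary distribution of the chain
    joint = pi[:, None] * P              # joint law of (X_0, X_1)
    ipred = mutual_information(joint)
    ```

    By contrast, the logarithmic regime the abstract describes would appear if the source had an unknown parameter to be learned, e.g. i.i.d. coin flips with unknown bias, where $I_{\rm pred}(T) \sim \frac{1}{2}\log T$ for the one-dimensional parameter.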

    Exploring flow occurrence in elite golf

    Research on flow (Csikszentmihalyi, 1975) has traditionally focused on reactive, externally paced sports (e.g., tennis) without exploring those that are self-paced and stop-start in nature. This study investigated the occurrence of flow in a sample of thirteen elite golfers through semi-structured interviews covering: (i) their experiences of flow, (ii) factors that influenced flow occurrence, and (iii) the controllability of these experiences. Results were consistent with existing research for the majority of influencing factors reported, including motivation, preparation, focus, psychological state, environmental and situational conditions, and arousal, and flow was reported to be at least potentially controllable. Golf-specific influences were also noted, including pre-shot routines, use of psychological interventions, standard of performance, and maintenance of physical state, suggesting that flow may have occurred differently for this sample. Findings are discussed and applied recommendations are made that may help golfers put relevant factors in place to increase the likelihood of experiencing flow.

    Randomized Quantization and Source Coding with Constrained Output Distribution

    This paper studies fixed-rate randomized vector quantization under the constraint that the quantizer's output has a given fixed probability distribution. A general representation of randomized quantizers that includes the common models in the literature is introduced via appropriate mixtures of joint probability measures on the product of the source and reproduction alphabets. Using this representation and results from optimal transport theory, the existence of an optimal (minimum-distortion) randomized quantizer having a given output distribution is shown under various conditions. For sources with densities and the mean square distortion measure, it is shown that this optimum can be attained by randomizing quantizers having convex codecells. For stationary and memoryless source and output distributions, a rate-distortion theorem is proved, providing a single-letter expression for the optimum distortion in the limit of large block-lengths. Comment: To appear in the IEEE Transactions on Information Theory
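    A toy sketch (our construction, not the paper's) of the one-dimensional special case: for scalar sources under mean square distortion, optimal transport with a fixed discrete output law is achieved by a monotone map, i.e. sorted source samples are assigned to codepoints in order, in the prescribed proportions. The resulting codecells are intervals, consistent with the paper's convex-codecell result. The codebook and output probabilities below are assumed example values.

    ```python
    import numpy as np

    def constrained_quantizer(samples, levels, probs):
        """1-D quantizer whose empirical output matches a prescribed distribution.
        One-dimensional optimal transport under squared error is monotone:
        the lowest probs[0]-fraction of samples gets levels[0], the next
        probs[1]-fraction gets levels[1], and so on.  Codecells are intervals."""
        order = np.argsort(samples)
        n = len(samples)
        # cumulative sample counts assigned to each codepoint
        counts = np.round(np.cumsum(probs) * n).astype(int)
        out = np.empty(n)
        start = 0
        for lvl, stop in zip(levels, counts):
            out[order[start:stop]] = lvl
            start = stop
        return out

    rng = np.random.default_rng(0)
    x = rng.normal(size=10_000)
    levels = np.array([-1.0, 0.0, 1.0])   # assumed 3-point codebook
    probs = np.array([0.25, 0.5, 0.25])   # required output distribution
    y = constrained_quantizer(x, levels, probs)
    ```

    In higher dimensions, or for general output laws, the monotone-map shortcut no longer applies and one is led to the mixture-of-couplings representation the paper develops.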