
    Predictability, complexity and learning

    We define {\em predictive information} $I_{\rm pred}(T)$ as the mutual information between the past and the future of a time series. Three qualitatively different behaviors are found in the limit of large observation times $T$: $I_{\rm pred}(T)$ can remain finite, grow logarithmically, or grow as a fractional power law. If the time series allows us to learn a model with a finite number of parameters, then $I_{\rm pred}(T)$ grows logarithmically with a coefficient that counts the dimensionality of the model space. In contrast, power-law growth is associated, for example, with the learning of infinite-parameter (or nonparametric) models such as continuous functions with smoothness constraints. There are connections between the predictive information and measures of complexity that have been defined both in learning theory and in the analysis of physical systems through statistical mechanics and dynamical systems theory. Further, in the same way that entropy provides the unique measure of available information consistent with some simple and plausible conditions, we argue that the divergent part of $I_{\rm pred}(T)$ provides the unique measure for the complexity of dynamics underlying a time series. Finally, we discuss how these ideas may be useful in different problems in physics, statistics, and biology.
    Comment: 53 pages, 3 figures, 98 references, LaTeX2
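    The definition of predictive information can be illustrated numerically. The sketch below (not from the paper; the Markov-chain parameters and the plug-in estimator are assumptions for illustration) estimates the mutual information between one past symbol and one future symbol of a binary Markov chain, a system for which $I_{\rm pred}(T)$ saturates to a finite value, the first of the three behaviors described above.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    x = np.asarray(x)
    y = np.asarray(y)
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (xi, yi), 1)          # joint histogram of (x, y) pairs
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)  # marginal of x
    py = joint.sum(axis=0, keepdims=True)  # marginal of y
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# Hypothetical example: binary Markov chain that keeps its state with
# probability 0.9 and flips with probability 0.1.
rng = np.random.default_rng(0)
n = 200_000
flips = rng.random(n) < 0.1
s = np.empty(n, dtype=int)
s[0] = 0
for t in range(1, n):
    s[t] = s[t - 1] ^ int(flips[t])

# Mutual information between each symbol and its successor; for this chain
# the analytic value is 1 - H(0.1) ~ 0.53 bits, and it stays finite as the
# past/future windows grow.
i_pred = mutual_information(s[:-1], s[1:])
```

    For a chain with a finite number of parameters, widening the past and future windows leaves this estimate bounded, in line with the finite-$I_{\rm pred}$ regime of the abstract.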

    The Shape of Learning Curves: a Review

    Learning curves provide insight into the dependence of a learner's generalization performance on the training set size. This important tool can be used for model selection, to predict the effect of more training data, and to reduce the computational complexity of model training and hyperparameter tuning. This review recounts the origins of the term, provides a formal definition of the learning curve, and briefly covers basics such as its estimation. Our main contribution is a comprehensive overview of the literature regarding the shape of learning curves. We discuss empirical and theoretical evidence that supports well-behaved curves that often have the shape of a power law or an exponential. We consider the learning curves of Gaussian processes, the complex shapes they can display, and the factors influencing them. We draw specific attention to examples of learning curves that are ill-behaved, showing worse learning performance with more training data. To wrap up, we point out various open problems that warrant deeper empirical and theoretical investigation. All in all, our review underscores that learning curves are surprisingly diverse and no universal model can be identified.
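    The power-law shape discussed in the review can be checked with a standard trick: a power law $e(n) = a\,n^{-b}$ is a straight line in log-log coordinates, so a least-squares line fit recovers the exponent. The sketch below uses synthetic data (the training sizes and the true $a$, $b$ are assumptions for illustration, not results from the review).

```python
import numpy as np

# Hypothetical learning-curve data: test error following e(n) = a * n**-b.
train_sizes = np.array([100, 200, 400, 800, 1600, 3200])
a_true, b_true = 2.0, 0.5
errors = a_true * train_sizes ** -b_true

# log e = log a - b * log n, so a degree-1 polynomial fit in log-log
# coordinates recovers the power-law parameters.
slope, intercept = np.polyfit(np.log(train_sizes), np.log(errors), 1)
b_est = -slope
a_est = np.exp(intercept)
```

    On real learning curves the fit is noisy, and as the review stresses, a power law is only one of several shapes that can appear; an exponential decay, for instance, would show up as a straight line in semi-log rather than log-log coordinates.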

    Understanding and Controlling Regime Switching in Molecular Diffusion

    Diffusion can be strongly affected by ballistic flights (long jumps) as well as long-lived sticking trajectories (long sticks). Using statistical inference techniques in the spirit of Granger causality, we investigate the appearance of long jumps and sticks in molecular-dynamics simulations of diffusion in a prototype system, a benzene molecule on a graphite substrate. We find that specific fluctuations in certain, but not all, internal degrees of freedom of the molecule can be linked to either long jumps or sticks. Furthermore, by changing the prevalence of these predictors with an outside influence, the diffusion of the molecule can be controlled. The approach presented in this proof-of-concept study is very generic and can be applied to larger and more complex molecules. Additionally, the predictor variables can be chosen in a general way so as to be accessible in experiments, making the method feasible for control of diffusion in applications. Our results also demonstrate that data-mining techniques can be used to investigate the phase-space structure of high-dimensional nonlinear dynamical systems.
    Comment: accepted for publication by PR
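    The Granger-causality idea used above can be sketched in a few lines: a candidate predictor variable is deemed informative if including its past reduces the prediction error of the target beyond what the target's own past achieves. The toy data below (an autoregressive observable driven by one "internal coordinate"; all parameters are assumptions for illustration, not the paper's model) shows the two-regression comparison.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Hypothetical internal coordinate x that drives an observable y with a
# one-step delay, plus noise.
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.standard_normal()

def residual_var(target, predictors):
    """Residual variance of a least-squares regression of target on predictors."""
    A = np.column_stack(predictors + [np.ones(len(target))])
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.var(target - A @ coef)

# Restricted model: predict y from its own past only.
restricted = residual_var(y[1:], [y[:-1]])
# Full model: additionally include the past of the candidate predictor x.
full = residual_var(y[1:], [y[:-1], x[:-1]])

# Fraction of residual variance explained by x; a value near zero would mean
# x carries no Granger-style predictive information about y.
granger_gain = 1 - full / restricted
```

    In the molecular setting, the target would be an indicator of long jumps or sticks and the candidate predictors would be internal degrees of freedom; the same comparison singles out which coordinates are informative.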