
    Estimating the Algorithmic Complexity of Stock Markets

    Randomness and regularities in Finance are usually treated in probabilistic terms. In this paper, we develop a completely different approach, using a non-probabilistic framework based on the algorithmic information theory initially developed by Kolmogorov (1965). We present some elements of this theory and show why it is particularly relevant to Finance, and potentially to other sub-fields of Economics as well. We develop a generic method to estimate the Kolmogorov complexity of numeric series. This approach is based on an iterative "regularity erasing procedure" that allows lossless compression algorithms to be applied to financial data. Examples are provided with both simulated and real-world financial time series. The contributions of this article are twofold. The first is methodological: we show that some structural regularities, invisible to classical statistical tests, can be detected by this algorithmic method. The second consists of illustrations on the daily Dow Jones Index suggesting that, beyond several well-known regularities, hidden structure in this index may remain to be identified.
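    As a rough illustration of the compression-based idea described above (not the authors' actual procedure, whose discretization and regularity-erasing steps are specific to the paper), the sketch below estimates the relative complexity of a numeric series by discretizing it and measuring how much a lossless compressor shrinks it; the bin count and the example signals are illustrative assumptions.

```python
# Minimal sketch: compression-based complexity estimate for a numeric series.
# Assumes a simple equal-width discretization; parameters are illustrative.
import zlib

import numpy as np


def compression_ratio(series, levels=256):
    """Estimate the relative algorithmic complexity of a 1-D series.

    The series is discretized into `levels` equal-width bins, serialized to
    bytes, and compressed with a lossless algorithm (DEFLATE via zlib).
    A ratio close to 1 suggests near-incompressible (random-looking) data;
    a noticeably smaller ratio indicates exploitable structure.
    """
    x = np.asarray(series, dtype=float)
    lo, hi = x.min(), x.max()
    # Map values to integer symbols in [0, levels - 1].
    symbols = np.floor((x - lo) / (hi - lo + 1e-12) * (levels - 1)).astype(np.uint8)
    raw = symbols.tobytes()
    compressed = zlib.compress(raw, level=9)
    return len(compressed) / len(raw)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noise = rng.normal(size=10_000)              # i.i.d. noise: little structure
    trend = np.sin(np.linspace(0, 40, 10_000))   # strong regularity: highly compressible
    print("noise ratio:", round(compression_ratio(noise), 3))
    print("trend ratio:", round(compression_ratio(trend), 3))
```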

    Bayesian detection of piecewise linear trends in replicated time-series with application to growth data modelling

    We consider the situation where a temporal process is composed of contiguous segments with differing slopes and replicated noise-corrupted time series measurements are observed. The unknown mean of the data-generating process is modelled as a piecewise linear function of time with an unknown number of change-points. We develop a Bayesian approach to infer the joint posterior distribution of the number and position of change-points as well as the unknown mean parameters. A priori, the proposed model uses an overfitting number of mean parameters but, conditionally on a set of change-points, only a subset of them influences the likelihood. An exponentially decreasing prior distribution on the number of change-points gives rise to a posterior distribution concentrating on sparse representations of the underlying sequence. A Metropolis-Hastings Markov chain Monte Carlo (MCMC) sampler is constructed for approximating the posterior distribution. Our method is benchmarked using simulated data and is applied to uncover differences in the dynamics of fungal growth from imaging time course data collected from different strains. The source code is available on CRAN.
    Comment: Accepted to the International Journal of Biostatistics
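    A minimal sketch of the assumed data-generating model may help fix ideas: a continuous piecewise-linear mean with change-points, observed through several noise-corrupted replicate series. The parameterization and values below are illustrative; the paper's prior and Metropolis-Hastings sampler are not reproduced here.

```python
# Minimal sketch: continuous piecewise-linear mean with change-points,
# observed in replicated noisy series. Values are illustrative only.
import numpy as np


def piecewise_linear_mean(t, changepoints, slope_increments, intercept=0.0):
    """Continuous piecewise-linear function of time.

    `changepoints` are the segment boundaries (within the range of `t`);
    `slope_increments` holds the change in slope introduced at each knot,
    one entry per segment (len(changepoints) + 1 in total).
    """
    t = np.asarray(t, dtype=float)
    knots = np.concatenate(([t.min()], np.sort(changepoints)))
    mean = np.full_like(t, intercept)
    for knot, increment in zip(knots, slope_increments):
        # Each segment adds its slope increment from its knot onwards,
        # keeping the function continuous at every change-point.
        mean += increment * np.clip(t - knot, 0.0, None)
    return mean


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.linspace(0, 10, 200)
    # Slope increments [1.0, -1.5, 2.0] give segment slopes 1.0, -0.5, 1.5.
    mu = piecewise_linear_mean(t, changepoints=[3.0, 7.0],
                               slope_increments=[1.0, -1.5, 2.0])
    replicates = mu + rng.normal(scale=0.3, size=(5, t.size))  # 5 noisy replicate series
    print(replicates.shape)
```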

    Status and Future Perspectives for Lattice Gauge Theory Calculations to the Exascale and Beyond

    In this and a set of companion whitepapers, the USQCD Collaboration lays out a program of science and computing for lattice gauge theory. These whitepapers describe how calculations using lattice QCD (and other gauge theories) can aid the interpretation of ongoing and upcoming experiments in particle and nuclear physics, as well as inspire new ones.
    Comment: 44 pages; 1 of the USQCD whitepapers

    MCMC methods: graph samplers, invariance tests and epidemic models

    Markov chain Monte Carlo (MCMC) techniques are used ubiquitously for simulation-based inference. This thesis provides novel contributions to MCMC methods and their application to graph sampling and epidemic modeling. The first topic considered is that of sampling graphs conditional on a set of prescribed statistics, a difficult problem arising naturally in many fields: sociology (Holland and Leinhardt, 1981), psychology (Connor and Simberloff, 1979), categorical data analysis (Agresti, 1992), and finance (Squartini et al., 2018; Gandy and Veraart, 2019) being examples. Bespoke MCMC samplers are proposed for this setting. The second major topic addressed is that of modeling the dynamics of infectious diseases, where MCMC is leveraged as the general inference engine.
    The first part of this thesis addresses important problems such as the uniform sampling of graphs with given degree sequences, and of weighted graphs with given strength sequences. These distributions are frequently used for exact tests on social networks and two-way contingency tables. Another application is quantifying the statistical significance of patterns observed in real networks. This is crucial for understanding whether such patterns indicate the presence of interesting network phenomena, or whether they simply result from less interesting processes, such as nodal heterogeneity. The MCMC samplers developed in the course of this research are complex, and there is great scope for conceptual, analytic, and implementation errors. This motivates a chapter that develops novel tests for detecting errors in MCMC implementations. The tests introduced are unique in being exact, which allows us to keep the false rejection probability arbitrarily low.
    Rather than develop bespoke samplers, as in the first part of the thesis, the second part leverages a standard MCMC framework, Stan (Stan Development Team, 2018), as the workhorse for fitting state-of-the-art epidemic models. We present a general framework for semi-mechanistic Bayesian modeling of infectious diseases using renewal processes; the term semi-mechanistic refers to statistical estimation within some constrained mechanism. This research was motivated by the ongoing SARS-CoV-2 pandemic, and variants of the model have been used in specific analyses of Covid-19. We present epidemia, an R package allowing researchers to leverage these epidemic models. A key goal of this work is to demonstrate that MCMC, and in particular Stan's No-U-Turn sampler (Hoffman and Gelman, 2014), can be routinely employed to fit a large class of epidemic models. A second goal is to make the models accessible to the general research community through epidemia.
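    To make the renewal-process idea concrete, the sketch below simulates expected infections forward in time given a reproduction-number trajectory and a generation-interval distribution. It is only a forward simulation under assumed values, not the inference machinery of epidemia or Stan, and the generation-interval weights are illustrative.

```python
# Minimal sketch of a renewal process: i_t = R_t * sum_s g_s * i_{t-s},
# where g is the generation-interval distribution. Values are illustrative.
import numpy as np


def simulate_renewal(R, seed_infections, generation_interval):
    """Simulate expected daily infections under a renewal process."""
    g = np.asarray(generation_interval, dtype=float)
    g = g / g.sum()                       # normalize to a proper distribution
    T = len(R)
    infections = np.zeros(T)
    infections[0] = seed_infections
    for t in range(1, T):
        # Weight past infections by the generation interval (most recent first,
        # truncated to the available history), then scale by R_t.
        window = min(t, len(g))
        past = infections[t - window:t][::-1]
        infections[t] = R[t] * np.dot(past, g[:window])
    return infections


if __name__ == "__main__":
    R = np.concatenate([np.full(30, 1.8), np.full(30, 0.7)])  # intervention at day 30
    g = [0.1, 0.25, 0.3, 0.2, 0.1, 0.05]                      # hypothetical weights
    print(simulate_renewal(R, seed_infections=10, generation_interval=g)[-5:])
```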

    Novel models and algorithms for systems reliability modeling and optimization

    Recent growth in the scale and complexity of products and technologies in the defense and other industries is driving up product development, realization, and sustainment costs. Uncontrolled costs and routine budget overruns are causing all parties involved to seek lean product development processes and to treat the reliability, availability, and maintainability of the system as a true design parameter. To this end, accurate estimation and management of the system reliability of a design during the earliest stages of new product development is critical not only for managing product development and manufacturing costs but also for controlling life cycle costs (LCC). The overall objective of this research is therefore to develop an integrated framework for design for reliability (DFR) during upfront product development by treating reliability as a design parameter. The aim is to develop the theory, methods, and tools necessary for: 1) accurate assessment of system reliability and availability, and 2) optimization of the design to meet system reliability targets. In modeling system reliability and availability, we aim to address the limitations of existing methods, in particular the Markov chain method and the Dynamic Bayesian Network approach, by incorporating a Continuous Time Bayesian Network framework for more effective modeling of sub-system/component interactions, dependencies, and various repair policies. We also propose a multi-objective optimization scheme to aid the designer in obtaining optimal design(s) with respect to system reliability/availability targets and other system design requirements. In particular, the optimization scheme entails optimal selection of sub-system and component alternatives. The theory, methods, and tools developed will be extensively tested and validated using simulation test-bed data and actual case studies from our industry partners.
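    As a point of reference for the availability modeling discussed above, the sketch below computes the steady-state availability of a repairable component from a two-state (up/down) Markov model and combines components for series and parallel structures. The Continuous Time Bayesian Network extensions proposed in this research are not shown, and the rates are illustrative assumptions.

```python
# Minimal sketch: steady-state availability from a two-state Markov model
# and simple series/parallel combinations. Rates below are illustrative.
def component_availability(failure_rate, repair_rate):
    """Steady-state availability A = mu / (lambda + mu) for a two-state
    (up/down) Markov model with constant failure and repair rates."""
    return repair_rate / (failure_rate + repair_rate)


def series_availability(availabilities):
    """A series system works only if every component works."""
    a = 1.0
    for ai in availabilities:
        a *= ai
    return a


def parallel_availability(availabilities):
    """A parallel (redundant) system fails only if every component fails."""
    u = 1.0
    for ai in availabilities:
        u *= (1.0 - ai)
    return 1.0 - u


if __name__ == "__main__":
    # Illustrative rates (per hour): lambda = 1e-3, mu = 1e-1 gives A ~ 0.990.
    a = component_availability(1e-3, 1e-1)
    print(round(a, 4),
          round(series_availability([a, a]), 4),
          round(parallel_availability([a, a]), 4))
```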