
    RMCMC: A System for Updating Bayesian Models

    A system to update estimates from a sequence of probability distributions is presented. The aim of the system is to quickly produce estimates with a user-specified bound on the Monte Carlo error. The estimates are based upon weighted samples stored in a database. The stored samples are maintained so that the accuracy of the estimates and the quality of the samples remain satisfactory. This maintenance involves varying the number of samples in the database and updating their weights. New samples are generated, when required, by a Markov chain Monte Carlo algorithm. The system is demonstrated using a football league model used to predict the end-of-season table. The correctness and accuracy of the estimates are verified in a simulation using a linear Gaussian model.
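    The maintenance loop described in the abstract can be pictured with a short sketch. The following Python fragment is a minimal illustration, not the RMCMC system's actual interface: the effective-sample-size error proxy and the draw_more helper wrapping the MCMC sampler are assumptions made for this example.

        import numpy as np

        def weighted_estimate(samples, weights):
            # Self-normalised estimate of E[X] from weighted samples.
            w = weights / weights.sum()
            return np.sum(w * samples)

        def monte_carlo_error(samples, weights):
            # Rough standard-error proxy based on the effective sample size.
            w = weights / weights.sum()
            est = np.sum(w * samples)
            ess = 1.0 / np.sum(w ** 2)              # effective sample size
            var = np.sum(w * (samples - est) ** 2)  # weighted variance
            return np.sqrt(var / ess)

        def maintain_store(samples, weights, error_bound, draw_more):
            # Add MCMC draws (supplied by draw_more) until the user-specified
            # bound on the Monte Carlo error is met.
            while monte_carlo_error(samples, weights) > error_bound:
                new_samples, new_weights = draw_more()
                samples = np.concatenate([samples, new_samples])
                weights = np.concatenate([weights, new_weights])
            return samples, weights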

    The chopthin algorithm for resampling

    Resampling is a standard step in particle filters and, more generally, in sequential Monte Carlo methods. We present an algorithm, called chopthin, for resampling weighted particles. In contrast to standard resampling methods, the algorithm does not produce a set of equally weighted particles; instead it merely enforces an upper bound on the ratio between the weights. Simulation studies show that the chopthin algorithm consistently outperforms standard resampling methods. The algorithm chops up particles with large weights and thins out particles with low weights, hence its name. It implicitly guarantees a lower bound on the effective sample size. The algorithm can be implemented efficiently, making it practically useful. We show that the expected computational effort is linear in the number of particles. Implementations for C++, R (on CRAN), Python and Matlab are available.
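    The chop/thin idea can be sketched in a few lines. The function below only illustrates chopping heavy particles and probabilistically thinning light ones while preserving expected weights; it is not the published chopthin algorithm and makes no claim to its linear-time guarantee or its exact weight-ratio bound.

        import numpy as np

        def chop_thin_sketch(particles, weights, max_ratio=4.0, rng=None):
            rng = np.random.default_rng() if rng is None else rng
            mean_w = float(np.mean(weights))
            out_p, out_w = [], []
            for x, w in zip(particles, weights):
                if w > max_ratio * mean_w:
                    # "chop": split a heavy particle into copies of roughly mean weight
                    k = int(np.ceil(w / mean_w))
                    out_p.extend([x] * k)
                    out_w.extend([w / k] * k)
                elif w < mean_w / max_ratio:
                    # "thin": keep a light particle with probability w / threshold,
                    # re-weighting survivors so the expected weight is unchanged
                    threshold = mean_w / max_ratio
                    if rng.uniform() < w / threshold:
                        out_p.append(x)
                        out_w.append(threshold)
                else:
                    out_p.append(x)
                    out_w.append(w)
            return np.array(out_p), np.array(out_w)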

    Plateau Proposal Distributions for Adaptive Component-wise Multiple-Try Metropolis

    Markov chain Monte Carlo (MCMC) methods are sampling methods that have become a commonly used tool in statistics, for example to perform Monte Carlo integration. As a consequence of the increase in computational power, many variations of MCMC methods exist for generating samples from arbitrary, possibly complex, target distributions. The performance of an MCMC method is predominantly governed by the choice of the so-called proposal distribution. In this paper, we introduce a new type of proposal distribution for use in MCMC methods that operates component-wise and with multiple trials per iteration. Specifically, the proposal distributions in this novel class, called Plateau distributions, do not overlap, ensuring that the multiple trials are drawn from different regions of the state space. Furthermore, the Plateau proposal distributions allow for a bespoke adaptation procedure that lends itself to a Markov chain with efficient, problem-dependent state-space exploration and improved burn-in properties. Simulation studies show that our novel MCMC algorithm outperforms competitors when sampling from distributions with a complex shape, highly correlated components or multiple modes.
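    As a rough picture of the component-wise, multiple-try construction, the snippet below draws one candidate from each of several non-overlapping flat-topped intervals around the current value of a single component. The equal-width, contiguous tiling and the fixed width are assumptions made for illustration; the paper's actual Plateau shapes and their adaptation procedure are not reproduced here.

        import numpy as np

        def plateau_trials(x_i, width=1.0, n_trials=4, rng=None):
            # One uniform draw from each of n_trials equal-width, disjoint
            # intervals tiled around the current component value x_i.
            rng = np.random.default_rng() if rng is None else rng
            offsets = np.arange(n_trials) - n_trials / 2.0   # e.g. [-2, -1, 0, 1]
            lows = x_i + offsets * width
            return rng.uniform(lows, lows + width)           # one trial per interval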

    Fibre optic sensing of ageing railway infrastructure enhanced with statistical shape analysis

    Developing early-warning sensor-based maintenance systems for ageing railway infrastructure, such as masonry arch bridges, can be a challenging task due to the difficulty of identifying degradation or damage as the source of small, gradual changes in sensor data, as opposed to other environmental and loading effects. This paper offers a new method of applying statistical modelling and machine learning to enhance the interpretation of fibre optic sensing data and, therefore, improve deterioration monitoring of railway infrastructure. Dynamic strain and temperature monitoring data collected between 2016 and 2019 from a fibre Bragg grating (FBG) network installed in a Victorian railway viaduct are first presented. The statistical shape analysis adopted in this study is modified to track changes in the shape of FBG signals directly linked to train speed and dynamic strain amplitudes. The method is complemented by a support vector machine, which is trained to identify different classes of trains. After distinguishing train types, dynamic strain was found to be clearly correlated with temperature, verifying previous findings; no correlation with train speed was observed. The integrated system is then able to compensate for changes in structural performance due to variations in train loading and ambient temperature, and to identify changes in dynamic deformation caused by degradation, of an order comparable to the signal noise (±2 microstrain). As a result, the new procedure is shown to be capable of detecting small magnitudes of local degradation well before this degradation manifests itself in typical global measures of response.
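    For the train-classification step, a standard kernel support vector machine on per-passage summary features is one way such a classifier could be set up. The sketch below uses synthetic stand-in features and hypothetical class labels; the actual features extracted from the FBG signals in the paper are not reproduced here.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Synthetic stand-in data: one row of summary features per train passage
        # (e.g. peak dynamic strain, passage duration, dominant frequency).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 3))
        y = rng.integers(0, 3, size=60)      # three hypothetical train classes

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
        clf.fit(X, y)
        print(clf.predict(X[:5]))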

    Real-time statistical modelling of data generated from self-sensing bridges

    Instrumentation of infrastructure is changing the way engineers design, construct, monitor and maintain structures such as roads, bridges and underground structures. Data gathered from these instruments have changed the hands-on assessment of infrastructure behaviour to include data processing and statistical analysis procedures. Engineers wish to understand the behaviour of the infrastructure and detect changes – for example, degradation – but are now using high-frequency data acquired from a sensor network. Presented in this paper is a case study that models and analyses in real time the dynamic strain data gathered from a railway bridge which has been instrumented with fibre-optic sensor networks. The high frequency of the data combined with the large number of sensors requires methods that efficiently analyse the data. First, automated methods are developed to extract train passage events from the background signal and underlying trends due to environmental effects. Second, a streaming statistical model which can be updated efficiently is introduced that predicts strain measurements forward in time. This tool is enhanced to provide anomaly detection capabilities in individual sensors and the entire sensor network. These methods allow for the practical processing and analysis of large data sets. The implementation of these contributions will be essential for demonstrating the value of self-sensing structures.
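    A streaming model of this kind can be illustrated with a simple per-sensor monitor. The class below keeps an exponentially weighted running mean and variance and flags readings whose standardised residuals are large; it is a minimal stand-in for illustration, not the statistical model developed in the paper.

        import math

        class StreamingStrainMonitor:
            def __init__(self, alpha=0.05, threshold=4.0):
                self.alpha = alpha            # forgetting factor for the running moments
                self.threshold = threshold    # flag if |residual| > threshold * std
                self.mean = None
                self.var = 1.0

            def update(self, x):
                # Return True if x looks anomalous, then fold it into the model.
                if self.mean is None:
                    self.mean = float(x)
                    return False
                resid = x - self.mean
                flagged = abs(resid) > self.threshold * math.sqrt(self.var)
                self.mean += self.alpha * resid
                self.var = (1 - self.alpha) * (self.var + self.alpha * resid ** 2)
                return flagged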

    An exploration of secondary students' mental states when learning about acids and bases

    This study explored factors of students’ mental states, including emotion, intention, internal mental representation, and external mental representation, which can affect their learning performance. Evaluating students’ mental states during the science learning process, and the relationship between mental states and learning achievement, requires valid, reliable, and scalable measures of students’ mental states and learning achievement. This paper presents the development of the Mental State Conceptual Learning Inventory (MSCLI) to identify students’ mental states before and after learning about acids and bases. This instrument is time-efficient and convenient and can be administered to large student samples, so that teachers and researchers can gain profound insights into their students’ learning of acids and bases in science class. The results of this study indicate that students’ mental states are highly correlated with their achievement. As a whole, low-achieving students tended to have negative emotions and low intentions, were not good at internal visualization, and were unable to interpret graphics and draw pictures. In contrast, high-achieving students had positive emotions and intentions when learning life-related topics about acids and bases, and were good at internal visualization and at drawing and interpreting graphics.