
    Monitoring Count Time Series in R: Aberration Detection in Public Health Surveillance

    Public health surveillance aims at lessening disease burden, e.g., in the case of infectious diseases by timely recognition of emerging outbreaks. Seen from a statistical perspective, this implies the use of appropriate methods for monitoring time series of aggregated case reports. This paper presents the tools for such automatic aberration detection offered by the R package surveillance. We introduce the functionality for the visualization, modelling and monitoring of surveillance time series. With respect to modelling, we focus on univariate time series modelling based on generalized linear models (GLMs), multivariate GLMs, generalized additive models, and generalized additive models for location, shape and scale. This includes illustrating implementational improvements and extensions of the well-known Farrington algorithm, e.g., by spline modelling or by treating it in a Bayesian context. Furthermore, we look at categorical time series and address overdispersion using beta-binomial or Dirichlet-multinomial modelling. With respect to monitoring, we consider detectors based on either a Shewhart-like single-timepoint comparison between the observed count and the predictive distribution, or on likelihood-ratio-based cumulative sum (CUSUM) methods. Finally, we illustrate how surveillance can support aberration detection in practice by integrating it into the monitoring workflow of a public health institution. Altogether, the present article shows how well surveillance can support automatic aberration detection in a public health surveillance context.
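The Shewhart-like single-timepoint comparison this abstract describes can be sketched in a few lines. This is an illustrative toy, not the surveillance package's actual algorithm: the Poisson plug-in model, the 99.5% quantile threshold, and the function names are assumptions for demonstration.

```python
import math

def poisson_quantile(mu, q=0.995):
    """Smallest k such that the Poisson(mu) CDF at k reaches q."""
    cdf, k, pmf = 0.0, 0, math.exp(-mu)
    while cdf + pmf < q:
        cdf += pmf
        k += 1
        pmf *= mu / k
    return k

def flag_aberration(history, observed, q=0.995):
    """Shewhart-like check: is the new count above the upper quantile of
    a predictive distribution fitted to the historical baseline counts?"""
    mu = sum(history) / len(history)   # plug-in Poisson mean from history
    return observed > poisson_quantile(mu, q)

history = [4, 6, 5, 3, 7, 5, 4, 6]     # toy weekly case counts
alarm = flag_aberration(history, 15)   # a count far above the usual level
quiet = flag_aberration(history, 6)    # a count within the usual range
```

In practice the predictive distribution would account for trend, seasonality, and overdispersion (e.g., negative binomial rather than Poisson), which is precisely what the Farrington-type methods in the package add.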

    A Survey Of Activity Recognition And Understanding The Behavior In Video Surveillance

    This paper presents a review of human activity recognition and behaviour understanding in video sequences. The key objective of this paper is to provide a general review of the overall process of a surveillance system as used in current practice. Visual surveillance systems are directed at automatic identification of events of interest, especially the tracking and classification of moving objects. The processing pipeline of a video surveillance system includes the following stages: surroundings modelling, object representation, object tracking, activity recognition and behaviour understanding. The paper describes techniques used to define a general set of activities that are applicable to a wide range of scenes and environments in video sequences. Comment: 14 pages, 5 figures, 5 tables

    A first look at the performances of a Bayesian chart to monitor the ratio of two Weibull percentiles

    The aim of the present work is to investigate the performance of a specific Bayesian control chart used to compare two processes. The chart monitors the ratio of the percentiles of a key characteristic associated with the processes. The variability of such a characteristic is modeled via the Weibull distribution, and a practical Bayesian approach to dealing with Weibull data is adopted. The percentiles of the two monitored processes are assumed to be independent random variables. The Weibull distributions of the key characteristic of both processes are assumed to have the same, stable shape parameter. This is usually the case in practice because the Weibull shape parameter is related to the main factor of variability involved. However, if a change in the shape parameters of the processes is suspected, the involved distributions can be used to monitor their stability. We first tested the effect of the amount of training data on the responsiveness of the chart. Then we tested the robustness of the chart under very poor prior information. To this end, the prior values were changed to reflect a 50% shift in both directions from the original values of the shape parameter and the percentiles of the two monitored processes. Finally, various combinations of shifts were considered for the sampling distributions after Phase I, with the purpose of estimating the diagnostic ability of the chart to signal an out-of-control state. The traditional approach based on the Average Run Length (ARL), empirically computed via Monte Carlo simulation, was adopted. Comment: 9 pages, 3 figures, 3 tables. Invited talk at the 4th International Symposium on Statistical Process Monitoring (http://isspm2015.stat.unipd.it), July 7-9, 2015, Padua, Italy
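The Monte Carlo estimation of the Average Run Length mentioned at the end of this abstract can be sketched generically. This is a minimal illustration on a standard Shewhart chart with known 3-sigma limits, not the paper's Bayesian chart for Weibull percentile ratios; all names and parameters are assumptions.

```python
import random
import statistics

def run_length(mean_shift, limit=3.0, rng=random):
    """Sample one run length of a basic Shewhart chart: observations are
    N(mean_shift, 1) and the chart signals when |x| exceeds the limit."""
    n = 0
    while True:
        n += 1
        if abs(rng.gauss(mean_shift, 1.0)) > limit:
            return n

def average_run_length(mean_shift, reps=2000, seed=1):
    """Empirical ARL: the mean of many simulated run lengths."""
    rng = random.Random(seed)
    return statistics.fmean(run_length(mean_shift, rng=rng) for _ in range(reps))

arl0 = average_run_length(0.0)   # in-control ARL (theoretically ~370 here)
arl1 = average_run_length(2.0)   # out-of-control ARL under a 2-sigma shift
```

A good chart has a large in-control ARL (few false alarms) and a small out-of-control ARL (fast detection), which is exactly the trade-off the paper evaluates for its Bayesian chart.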

    Signal-based Bayesian Seismic Monitoring

    Detecting weak seismic events from noisy sensors is a difficult perceptual task. We formulate this task as Bayesian inference and propose a generative model of seismic events and signals across a network of spatially distributed stations. Our system, SIGVISA, is the first to directly model seismic waveforms, allowing it to incorporate a rich representation of the physics underlying the signal generation process. We use Gaussian processes over wavelet parameters to predict detailed waveform fluctuations based on historical events, while degrading smoothly to simple parametric envelopes in regions with no historical seismicity. Evaluating on data from the western US, we recover three times as many events as previous work, and reduce mean location errors by a factor of four while greatly increasing sensitivity to low-magnitude events. Comment: Appearing at AISTATS 201
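The "degrading smoothly to simple parametric envelopes" behaviour comes from a basic property of Gaussian process regression: far from any training data, the posterior mean reverts to the prior. A minimal numpy sketch of that property (the squared-exponential kernel, length-scale, and toy data are assumptions, not SIGVISA's actual model):

```python
import numpy as np

def rbf(a, b, ell=1.0, sf=1.0):
    """Squared-exponential kernel matrix between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=0.1):
    """Posterior mean of a zero-mean GP. Near training inputs it tracks the
    observations; far away it decays back toward the (zero) prior mean."""
    K = rbf(x_train, x_train) + noise ** 2 * np.eye(len(x_train))
    return rbf(x_test, x_train) @ np.linalg.solve(K, y_train)

x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 0.5, -0.3])   # toy "wavelet coefficients" from past events
mu_near = gp_posterior_mean(x, y, np.array([1.0]))    # close to history
mu_far = gp_posterior_mean(x, y, np.array([50.0]))    # no nearby history
```

Near historical inputs the prediction follows the observed coefficients; at a distant input it returns essentially the prior mean, mirroring the paper's graceful fallback in regions with no historical seismicity.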

    Estimating Multiple Step Shifts in a Gaussian Process Mean with an Application to Phase I Control Chart Analysis

    In preliminary analysis of control charts, one may encounter multiple shifts and/or outliers, especially with a large number of observations. This paper addresses that problem. A statistical model for detecting and estimating multiple change points in a finite batch of retrospective (Phase I) data is proposed, based on the likelihood ratio test. We consider a univariate normal distribution with multiple step shifts occurring at predefined locations of the process mean. A numerical example illustrates the efficiency of our method. Finally, performance comparisons, based on accuracy and precision measures, are explored through simulation studies. Comment: 5 pages, to be submitted to IEEE CASE 201
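For a single step shift with known variance, the likelihood-ratio approach reduces to scanning all split points and keeping the one with the largest between-segment contrast. A hedged sketch of that one-shift special case (the paper handles multiple shifts; the data and function name here are illustrative):

```python
import statistics

def best_step_shift(x):
    """Likelihood-ratio scan for a single step shift in the mean of i.i.d.
    Gaussian data: for each split point tau, the statistic is proportional to
    tau*(n-tau)/n * (mean_before - mean_after)^2, i.e. 2*log-LR up to sigma^2."""
    n = len(x)
    best_tau, best_stat = None, -1.0
    for tau in range(1, n):
        m1 = statistics.fmean(x[:tau])
        m2 = statistics.fmean(x[tau:])
        stat = tau * (n - tau) / n * (m1 - m2) ** 2
        if stat > best_stat:
            best_tau, best_stat = tau, stat
    return best_tau, best_stat

data = [0.1, -0.2, 0.0, 0.3, -0.1, 3.2, 2.9, 3.1, 3.0, 2.8]  # shift after obs 5
tau, stat = best_step_shift(data)
```

Extending this to multiple step shifts at candidate locations, as in the paper, amounts to maximizing the likelihood over combinations of split points rather than a single one.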

    Large Multistream Data Analytics for Monitoring and Diagnostics in Manufacturing Systems

    The high dimensionality and volume of large-scale multistream data have inhibited significant research progress towards an integrated monitoring and diagnostics (M&D) approach. Such data, also categorized as big data, are becoming common in manufacturing plants. In this paper, we propose an integrated M&D approach for large-scale streaming data. We developed a novel monitoring method named Adaptive Principal Component monitoring (APC), which adaptively chooses the PCs that are most likely to vary due to a change, enabling early detection. Importantly, we integrate a novel diagnostic approach, Principal Component Signal Recovery (PCSR), to enable streamlined SPC. This diagnostic approach draws inspiration from compressed sensing and uses the adaptive Lasso to identify the sparse change in the process. We theoretically motivate our approaches and evaluate the performance of our integrated M&D method through simulations and case studies.
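The foundation that APC builds on is monitoring a Hotelling-style T^2 statistic in a retained principal-component subspace. A minimal numpy sketch of that baseline (not APC itself, which additionally re-ranks the PCs at each sample; the simulated data and names are assumptions):

```python
import numpy as np

def fit_pcs(X, k):
    """Principal directions and component variances of in-control training
    data, via SVD of the centered data matrix."""
    mu = X.mean(axis=0)
    _, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k], (s[:k] ** 2) / (len(X) - 1)

def t2(x, mu, V, var):
    """Hotelling-style T^2 of a new observation in the retained PC subspace."""
    scores = V @ (x - mu)
    return float(np.sum(scores ** 2 / var))

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1)) @ np.ones((1, 10))   # shared latent stream
X = latent + 0.1 * rng.normal(size=(200, 10))           # in-control training data
mu, V, var = fit_pcs(X, k=1)

x_in = 0.5 * np.ones(10) + 0.1 * rng.normal(size=10)    # in-control sample
x_out = x_in + 5.0                                      # mean-shifted sample
t2_in = t2(x_in, mu, V, var)
t2_out = t2(x_out, mu, V, var)
```

APC's contribution is choosing, per time point, which PCs to include so that a change concentrated in a few components is detected early instead of being diluted across all of them.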

    High Dimensional Process Monitoring Using Robust Sparse Probabilistic Principal Component Analysis

    High-dimensional data has introduced challenges that are difficult to address with classical approaches to statistical process control, which has made it a topic of research interest in recent years. However, in many cases data sets have underlying structure, as in advanced manufacturing systems; if extracted correctly, this structure enables efficient methods for process control. This paper proposes a robust sparse dimensionality-reduction approach for correlated high-dimensional process monitoring to address these issues. The developed monitoring technique uses robust sparse probabilistic PCA to reduce the dimensionality of the data stream while retaining interpretability. The proposed methodology utilizes Bayesian variational inference to obtain estimates of a probabilistic representation of PCA. Simulation studies were conducted to verify the efficacy of the proposed methodology. Furthermore, we conducted a case study on change detection for in-line Raman spectroscopy to validate the efficiency of our proposed method in a practical scenario.
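The interpretability payoff of sparse loadings can be illustrated with a deliberately crude stand-in: threshold the small entries of the first principal loading to zero. This is not the paper's variational robust sparse PPCA; the thresholding rule, data, and names are assumptions chosen only to show why sparsity helps identify which variables drive a component.

```python
import numpy as np

def sparse_first_loading(X, threshold=0.15):
    """First principal loading with small entries soft-thresholded to zero,
    then renormalized: a crude proxy for model-induced sparsity."""
    Xc = X - X.mean(axis=0)
    v = np.linalg.svd(Xc, full_matrices=False)[2][0]
    w = np.sign(v) * np.maximum(np.abs(v) - threshold, 0.0)
    norm = np.linalg.norm(w)
    return w / norm if norm > 0 else v

rng = np.random.default_rng(1)
t = rng.normal(size=(300, 1))
X = 0.05 * rng.normal(size=(300, 20))   # 20 mostly-noise process variables
X[:, :3] += 2.0 * t                     # only the first three carry the signal
w = sparse_first_loading(X)
```

The recovered loading is nonzero only on the three signal-carrying variables, so an engineer can read off which streams matter; a probabilistic model achieves the same end with principled priors instead of an ad hoc threshold.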

    Detection and Prediction of Cardiac Anomalies Using Wireless Body Sensors and Bayesian Belief Networks

    Intricate cardiac conditions are a primary driver of healthcare costs and the leading cause of death in the world. However, preventive measures such as the early detection of cardiac anomalies can prevent severe cardiovascular arrests of varying complexity and can substantially reduce healthcare costs. In such scenarios, the electrocardiogram (ECG or EKG) is usually the first diagnostic choice of a medical practitioner or clinical staff to measure the electrical and muscular fitness of an individual heart. This paper presents a system capable of reading a recorded ECG and predicting cardiac anomalies without the intervention of a human expert. The paper proposes an algorithm which reads and analyses electrocardiogram datasets. The proposed architecture first uses the Discrete Wavelet Transform (DWT) to preprocess the ECG data, followed by the undecimated wavelet transform (UWT) to extract nine relevant features which are of high interest to a cardiologist. A probabilistic model, a Bayesian network classifier, is trained on the nine extracted parameters using the UCL arrhythmia dataset. The proposed system classifies a recorded heartbeat into four classes using the Bayesian network classifier and Tukey's box analysis: (a) normal beat, (b) premature ventricular contraction (PVC), (c) premature atrial contraction (PAC) and (d) myocardial infarction (MI). Experimental results show that the proposed system achieved an average accuracy of 96.6% for PAC, 92.8% for MI and 87% for PVC, with an average error rate of 3.3% for PAC, 6% for MI and 12.5% for PVC on real electrocardiogram datasets including PhysioNet and the European ST-T Database (EDB).
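Two of the abstract's building blocks are simple enough to sketch directly: a one-level Haar DWT (the simplest discrete wavelet transform, standing in for whatever wavelet family the paper actually uses) and Tukey's box rule for flagging outlying feature values. The example data are illustrative, not from the paper's datasets.

```python
import statistics

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: pairwise averages
    (approximation coefficients) and pairwise differences (detail coefficients)."""
    a = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    d = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return a, d

def tukey_outlier(value, sample, k=1.5):
    """Tukey's box rule: flag values more than k * IQR outside the quartiles."""
    q1, _, q3 = statistics.quantiles(sample, n=4)
    iqr = q3 - q1
    return value < q1 - k * iqr or value > q3 + k * iqr

# Toy RR intervals (seconds): a premature beat shortens the interval sharply.
rr_history = [0.8, 0.82, 0.79, 0.81, 0.8, 0.78, 0.83]
premature = tukey_outlier(0.4, rr_history)   # flagged
regular = tukey_outlier(0.8, rr_history)     # not flagged
```

In the paper's pipeline, wavelet-derived features like these would feed the Bayesian network classifier, with the Tukey rule helping separate normal variation from anomalous beats.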

    A note on monitoring ratios of two Weibull percentiles

    This note introduces a new Bayesian control chart to compare two processes by monitoring the ratio of their percentiles under a Weibull assumption. Both in-control and out-of-control parameters are assumed unknown. The chart analyses the sampling data directly, instead of transforming them to comply with the usual normality assumption, as most charts do. The chart uses the whole accumulated knowledge, resulting from the current and all past samples, to monitor the current value of the ratio. Two real applications, in the wood industry and in the concrete industry, give a first picture of the features of the chart. Comment: 13 pages; 4 figures; 3 tables
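The monitored quantity itself has a simple closed form: the p-th percentile of a Weibull(shape, scale) distribution is scale * (-ln(1-p))^(1/shape), so under a common shape parameter the percentile ratio collapses to the ratio of the scales. A small sketch of that arithmetic (the function names and numbers are illustrative, not the note's chart):

```python
import math

def weibull_percentile(p, shape, scale):
    """p-th percentile of a Weibull(shape, scale) distribution:
    x_p = scale * (-ln(1 - p)) ** (1 / shape)."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

def percentile_ratio(p, shape, scale1, scale2):
    """Ratio of the p-th percentiles of two Weibulls with a common shape.
    The (-ln(1-p))**(1/shape) factor cancels, leaving scale1 / scale2."""
    return weibull_percentile(p, shape, scale1) / weibull_percentile(p, shape, scale2)

# At p = 1 - e^-1 the percentile equals the scale parameter exactly.
r = percentile_ratio(0.1, 2.0, 30.0, 20.0)   # = 30/20 for any p
```

The Bayesian chart then tracks the posterior of this ratio as samples accumulate, signalling when it drifts from its in-control value.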

    Distributed Machine Learning in Materials that Couple Sensing, Actuation, Computation and Communication

    This paper reviews machine learning applications and approaches to detection, classification and control of intelligent materials and structures with embedded distributed computation elements. The purpose of this survey is to identify the desired tasks to be performed in each type of material or structure (e.g., damage detection in composites), identify and compare common approaches to learning such tasks, and investigate the models and training paradigms used. Machine learning approaches and common temporal features used in the domains of structural health monitoring, morphable aircraft, wearable computing and robotic skins are explored. As the ultimate goal of this research is to incorporate the approaches described in this survey into a robotic material paradigm, the potential for adapting the computational models used in these applications, and their corresponding training algorithms, to an amorphous network of computing nodes is considered. Distributed versions of support vector machines, graphical models and mixture models developed in the field of wireless sensor networks are reviewed. Potential areas of investigation, including possible architectures for incorporating machine learning into robotic nodes, training approaches, and the possibility of using deep learning approaches for automatic feature extraction, are discussed.