
    Multiple Change-point Detection: a Selective Overview

    Very long and noisy sequence data arise in fields ranging from the biological sciences to the social sciences, including high-throughput data in genomics and stock prices in econometrics. Often such data are collected in order to identify and understand shifts in trend, e.g., from a bull market to a bear market in finance, or from a normal to an excessive number of chromosome copies in genetics. Identifying multiple change points in a long, possibly very long, sequence is therefore an important problem. In this article, we review both classical and new multiple change-point detection strategies. Given the long history and extensive literature on change-point detection, we provide an in-depth discussion of a normal mean change-point model from the perspectives of regression analysis, hypothesis testing, consistency, and inference. In particular, we present a strategy that gathers and aggregates local information for change-point detection; it has become the cornerstone of several emerging methods because of its attractive computational and theoretical properties. (26 pages, 2 figures)
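
    The normal mean change-point model discussed here can be illustrated with the classic CUSUM statistic combined with binary segmentation. The sketch below is a generic illustration under unit-variance Gaussian noise, not the review's specific aggregation strategy; the threshold and data are toy choices.

```python
import numpy as np

def cusum_stat(x):
    """Max absolute CUSUM statistic for a single mean shift, and its argmax."""
    n = len(x)
    if n < 2:
        return 0.0, None
    s = np.cumsum(x)
    k = np.arange(1, n)                                    # candidate split points
    stat = np.abs(np.sqrt(n / (k * (n - k))) * (s[:-1] - k * s[-1] / n))
    j = int(np.argmax(stat))
    return float(stat[j]), j + 1

def binary_segmentation(x, threshold, lo=0):
    """Recursively split wherever the CUSUM statistic exceeds the threshold."""
    stat, j = cusum_stat(x)
    if j is None or stat < threshold:
        return []
    return (binary_segmentation(x[:j], threshold, lo)
            + [lo + j]
            + binary_segmentation(x[j:], threshold, lo + j))

# Toy sequence with two mean shifts in unit-variance Gaussian noise
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100), rng.normal(-1, 1, 100)])
print(binary_segmentation(x, threshold=3.0))               # change points near 100 and 200
```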

    Bayesian Methods in Brain Connectivity Change Point Detection with EEG Data and Genetic Algorithm

    The human brain processes a great amount of information every day, and its regions are organized optimally for this processing. There has been an increasing number of studies focusing on functional or effective connectivity in human brain regions over the last decade. In this dissertation, Bayesian methods for brain connectivity change point detection are discussed. First, a review of state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data is carried out; three methods are reviewed and compared. Second, the Bayesian connectivity change point model is extended to change point analysis in electroencephalogram (EEG) data, and the ability of EEG measures of frontal and temporo-parietal activity during mindfulness therapy to track treatment response in patients with dysfunctional anxiety is tested successfully. Then an optimized method for the Bayesian connectivity change point model using a genetic algorithm (GA) is proposed and shown to be more efficient in change point detection. Because GA parallelizes well, the change point detection method can, as future work, be parallelized on GPUs or multi-processor computers. Furthermore, a more advanced Bayesian bi-cluster connectivity change point model is developed to simultaneously detect the change point of each subject within a group and cluster subjects into groups according to their change point distribution and connectivity dynamics. The method is also validated on experimental datasets. After discussing brain change point detection, a review of Bayesian analysis of complex mutations in HBV, HCV, and HIV studies is also included as part of my Ph.D. work. Finally, conclusions are drawn and future work is discussed.
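
    As a rough illustration of the GA component only, the sketch below searches candidate change-point locations with a genetic algorithm; the fitness function is a simple mean-shift profile log-likelihood standing in for the dissertation's Bayesian connectivity posterior, and all parameter values are illustrative. Since the fitness evaluations are independent, the population can be scored in parallel, which is the property exploited for GPU or multi-processor parallelization.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_score(x, t):
    """Stand-in fitness: profile log-likelihood of a single mean shift at time t.
    In the dissertation this role is played by the Bayesian connectivity posterior."""
    if t < 2 or t > len(x) - 2:
        return -np.inf
    rss = np.sum((x[:t] - x[:t].mean())**2) + np.sum((x[t:] - x[t:].mean())**2)
    return -0.5 * len(x) * np.log(rss / len(x))

def ga_change_point(x, pop_size=40, generations=50, mut_sd=5):
    """Genetic algorithm over candidate change-point locations."""
    pop = rng.integers(2, len(x) - 2, size=pop_size)
    for _ in range(generations):
        fit = np.array([log_score(x, t) for t in pop])      # parallelizable step
        # Tournament selection: keep the fitter of two random candidates
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        parents = np.where(fit[idx[:, 0]] > fit[idx[:, 1]], pop[idx[:, 0]], pop[idx[:, 1]])
        # Crossover: average random pairs; mutation: integer Gaussian jitter
        mates = rng.permutation(parents)
        children = (parents + mates) // 2 + rng.normal(0, mut_sd, pop_size).astype(int)
        pop = np.clip(children, 2, len(x) - 2)
    fit = np.array([log_score(x, t) for t in pop])
    return int(pop[np.argmax(fit)])

# Toy series with a mean shift at index 120
x = np.concatenate([rng.normal(0, 1, 120), rng.normal(1.5, 1, 80)])
print(ga_change_point(x))                                   # should be near 120
```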

    Three-Dimensional Change Detection Using Point Clouds: A Review

    Change detection is an important step for the characterization of object dynamics at the Earth's surface. In multi-temporal point clouds, the main challenge is to detect true changes at different granularities in a scene subject to significant noise and occlusion. To better understand new research perspectives in this field, a deep review of recent advances in 3D change detection methods is needed. To this end, we present a comprehensive review of the state of the art of 3D change detection approaches, mainly those using 3D point clouds. We review standard methods and recent advances in the use of machine and deep learning for change detection. In addition, the paper presents a summary of 3D point cloud benchmark datasets from different sensors (aerial, mobile, and static), together with associated information. We also investigate representative evaluation metrics for this task. Finally, we present open questions and research perspectives. By reviewing the relevant papers in the field, we highlight the potential of bi- and multi-temporal point clouds for better monitoring analysis in various applications.
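
    A common baseline covered in such reviews is the cloud-to-cloud (C2C) nearest-neighbour distance between epochs; a minimal sketch with synthetic clouds and an illustrative threshold follows.

```python
import numpy as np
from scipy.spatial import cKDTree

def c2c_change_mask(cloud_t1, cloud_t2, threshold=0.1):
    """Flag points in the epoch-2 cloud whose nearest epoch-1 neighbour is
    farther than `threshold` (in the clouds' units, e.g. metres)."""
    tree = cKDTree(cloud_t1)
    dist, _ = tree.query(cloud_t2, k=1)
    return dist > threshold                     # True where change is suspected

# Toy example: a plane at z = 0, then the same plane with a local bump added
rng = np.random.default_rng(2)
xy = rng.uniform(0, 10, size=(5000, 2))
t1 = np.column_stack([xy, np.zeros(len(xy))])
bump = np.exp(-((xy[:, 0] - 5)**2 + (xy[:, 1] - 5)**2))    # local deformation
t2 = np.column_stack([xy, bump])
print(c2c_change_mask(t1, t2).mean())           # fraction of points flagged as changed
```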

    Stochastic methods for measurement-based network control

    The main task of network administrators is to ensure that their network functions properly. Whether they manage a telecommunications or a road network, they generally base their decisions on the analysis of measurement data. Inspired by such network control applications, this dissertation investigates several stochastic modelling techniques for data analysis. The focus is on two areas within the field of stochastic processes: change point detection and queueing theory. Part I deals with statistical methods for the automatic detection of change points, i.e., changes in the probability distribution underlying a data sequence. This part starts with a review of existing change point detection methods for data sequences consisting of independent observations. The main contribution of this part is the generalisation of the classic CUSUM method to account for dependence within data sequences. We analyse the false alarm probability of the resulting methods using a large deviations approach. The part also discusses numerical tests of the new methods and a cyber attack detection application, in which we investigate how to detect DNS tunnels. The main contribution of Part II is the application of queueing models (probabilistic models for waiting lines) to situations in which the system to be controlled can only be observed partially. We consider two types of partial information. Firstly, we develop a procedure to gain insight into the performance of queueing systems between consecutive system-state measurements and apply it in a numerical study motivated by capacity management in cable access networks. Secondly, inspired by dynamic road control applications, we study routing policies in a queueing system in which only some of the jobs are observable and controllable.
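
    For reference, the classic CUSUM recursion for independent observations, which Part I generalises to dependent sequences, can be sketched as follows. The drift and threshold values are illustrative; the threshold h trades off the false alarm probability, which the dissertation analyses via large deviations, against detection delay.

```python
import numpy as np

def cusum_alarm(x, drift, h):
    """Classic one-sided CUSUM: raise an alarm when the cumulative
    drift-corrected sum exceeds threshold h. Returns alarm index or None."""
    g = 0.0
    for i, xi in enumerate(x):
        g = max(0.0, g + xi - drift)            # reset at zero, accumulate excess
        if g > h:
            return i
    return None

# Toy sequence: mean 0, then mean 1 after index 500
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(1, 1, 100)])
print(cusum_alarm(x, drift=0.5, h=8.0))         # alarm shortly after index 500
```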

    Geomorphic Change Detection Using Multi-Beam Sonar

    The emergence of multi-beam echo sounders (MBES) as an applicable surveying technology in shallow water environments has expanded the extent of geomorphic change detection studies to include river environments that historically could not be surveyed, or could only be surveyed in small portions. The high point densities and accuracy of MBES have the potential to create highly accurate digital elevation models (DEMs). However, to properly use MBES data for DEM creation and subsequent analysis, it is essential to quantify and propagate uncertainty in surveyed points, and in surfaces derived from them, through each phase of data collection and processing. Much attention has been given to the topic of spatially variable uncertainty propagation in the context of DEM construction and its use in geomorphic change detection studies. However, little work has been done specifically on applying spatially varying uncertainty models to MBES data in shallow water environments. To address this need, this report presents a review of literature and a methodology for uncertainty quantification in a geomorphic change detection study. These methods are then applied and analyzed in a geomorphic change detection study using MBES as the data collection technique.
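
    The standard thresholded DEM-of-Difference calculation that underlies such studies can be sketched as follows; the grids, per-cell uncertainties, and critical t-value below are synthetic placeholders, not the report's data.

```python
import numpy as np

def thresholded_dod(dem_old, dem_new, sigma_old, sigma_new, t_crit=1.96):
    """DEM of Difference with cell-by-cell propagated uncertainty.
    Cells whose change falls below the critical error are masked out."""
    dod = dem_new - dem_old
    # Vertical errors of independent surveys add in quadrature
    prop_err = np.sqrt(sigma_old**2 + sigma_new**2)
    significant = np.abs(dod) > t_crit * prop_err
    return np.where(significant, dod, np.nan)

# Toy grids: 1 m cells with a constant (hypothetical) per-cell vertical uncertainty
rng = np.random.default_rng(4)
dem1 = rng.normal(10.0, 0.02, (50, 50))
dem2 = dem1 + 0.0
dem2[20:30, 20:30] += 0.5                  # simulated deposition patch
sigma = np.full((50, 50), 0.05)            # per-cell vertical uncertainty (m)
dod = thresholded_dod(dem1, dem2, sigma, sigma)
print(np.nansum(dod))                      # net significant change, summed over cells
```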

    A review of the use of terrestrial laser scanning application for change detection and deformation monitoring of structures

    Change detection and deformation monitoring are an active area of research within the field of engineering surveying, as well as in overlapping areas such as structural and civil engineering. The application of Terrestrial Laser Scanning (TLS) techniques for change detection and deformation monitoring of concrete structures has increased over the years, as illustrated in past studies. This paper presents a review of literature on TLS application in the monitoring of structures and discusses registration and georeferencing of TLS point cloud data as a critical issue in the process chain of accurate deformation analysis. Past TLS research has shown trends in addressing issues such as accurate registration and georeferencing of the scans, the need for a stable reference frame, TLS error modelling and reduction, point cloud processing techniques for deformation analysis, scanner calibration, and assessing the potential of TLS in detecting sub-centimetre and millimetre deformations. However, several issues in TLS-based change detection and deformation monitoring remain open to investigation, such as a rigorous and efficient workflow for point cloud processing in change detection and deformation analysis, incorporation of measurement geometry in deformation measurements of high-rise structures, design of data acquisition and quality assessment for precise measurements, and modelling of environmental effects on the performance of laser scanning. Even though some studies have attempted to address these issues, gaps remain, as information is still limited. Some methods reviewed in the case studies have been applied in landslide monitoring and seem promising for monitoring structures in engineering surveying. Hence, a three-stage process model for deformation analysis is proposed. Furthermore, with technological advancements, new TLS instruments with better accuracy are being developed, necessitating more research into precise measurements for the monitoring of structures.

    Identifying change point in production time-series volatility using control charts and stochastic differential equations

    The article focuses on volatility change point detection using SPC (Statistical Process Control) methods, specifically time-series control charts and stochastic differential equations (SDEs). This contribution reviews recent advances in change point detection for the volatility component of a process satisfying an SDE based on discrete observations, and also using time-series control charts. The theoretical part discusses the methodology of time-series control charts and of SDEs driven by a Brownian motion. The research part demonstrates the methodologies in a simulation study analysing the AR(1) process by means of time-series control charts and SDEs. The aim is to apply change point detection to time series from production processes and to highlight the versatility of control charts not only in manufacturing but also in managing financial cash flow stability.
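
    As a sketch of the control-chart side only (not the article's exact procedure), an EWMA chart on squared AR(1) residuals can flag a volatility change point; all tuning constants below are illustrative.

```python
import numpy as np

def ewma_volatility_alarm(x, phi, lam=0.1, k=3.0, calib=100):
    """EWMA control chart on squared AR(1) residuals e_t = x_t - phi * x_{t-1}.
    Calibrates the in-control mean/sd on the first `calib` residuals, then
    signals when the EWMA of e_t^2 leaves the k-sigma band."""
    e2 = (x[1:] - phi * x[:-1])**2
    mu, sd = e2[:calib].mean(), e2[:calib].std()
    sigma_z = sd * np.sqrt(lam / (2 - lam))    # asymptotic sd of the EWMA statistic
    z = mu
    for t, v in enumerate(e2):
        z = lam * v + (1 - lam) * z
        if t >= calib and abs(z - mu) > k * sigma_z:
            return t + 1                       # +1: residuals start at x_1
    return None

# AR(1) process whose innovation volatility doubles at t = 400
rng = np.random.default_rng(5)
n, phi = 600, 0.6
eps = np.concatenate([rng.normal(0, 1.0, 400), rng.normal(0, 2.0, 200)])
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]
print(ewma_volatility_alarm(x, phi))           # alarm shortly after 400
```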

    Recent Advances Toward Transparent Methane Emissions Monitoring: A Review

    Given that anthropogenic greenhouse gas (GHG) emissions must be immediately reduced to avoid drastic increases in global temperature, methane emissions have been placed center stage in the fight against climate change. Methane has a significantly larger warming potential than carbon dioxide. A large percentage of methane emissions are in the form of industry emissions, some of which can now be readily identified and mitigated. This review considers recent advances in methane detection that allow accurate and transparent monitoring, which are needed for reducing uncertainty in source attribution and evaluating progress in emissions reductions. A particular focus is on complementary methods operating at different scales with applications for the oil and gas industry, allowing rapid detection of large point sources and addressing inconsistencies of emissions inventories. Emerging airborne and satellite imaging spectrometers are advancing our understanding and offer new top-down assessment methods to complement bottom-up methods. Successfully merging estimates across scales is vital for increased certainty regarding greenhouse gas emissions and can inform regulatory decisions. The development of comprehensive, transparent, and spatially resolved top-down and bottom-up inventories will be crucial for holding nations accountable for their climate commitments

    Ranking Volatility in Building Energy Consumption Using Ensemble Learning and Information Entropy

    Given the rise in building energy consumption and demand worldwide, energy inefficiency detection has become extremely important. A significant portion of the energy used in commercial buildings is wasted as a result of poor maintenance, degradation, or improperly controlled equipment. Most facilities employ sensors to track energy consumption across multiple buildings. Smart fault detection and diagnostic systems use various anomaly detection techniques to discover point anomalies in consumption. While these systems work reasonably well in detecting equipment anomalies over short-term intervals, further exploration is needed into methods that consider long-term consumption to detect anomalous buildings. This paper presents a novel approach for a multi-building campus to rank and visualize the long-term volatility of building consumption, allowing the optimal allocation of limited time and resources to the detection and resolution of energy waste. The proposed method first classifies daily consumption into five classes using an ensemble learner and then calculates the information entropy of the resulting classification sequence to determine volatility. The ensemble learner receives input from a K-Nearest Neighbor classifier, a Random Forest classifier, and an Artificial Neural Network. In general, buildings are expected to keep the same energy profile over time, all else being equal. Buildings that frequently change energy profiles are ranked and flagged by the system for review, the next step toward reducing waste and costs and increasing the sustainability of buildings. Data on energy consumption for 132 buildings were obtained from energy management at the Georgia Institute of Technology. Experimental results show the effectiveness of the proposed approach.
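
    The entropy step can be sketched directly: given a building's sequence of daily consumption classes (synthetic here; in the paper they come from the KNN/Random Forest/ANN ensemble), the Shannon entropy of the class distribution measures how often the building switches energy profiles.

```python
import numpy as np
from collections import Counter

def volatility_entropy(labels):
    """Shannon entropy (bits) of a building's daily consumption-class sequence.
    Stable buildings concentrate in one class (entropy near 0); buildings that
    frequently switch profiles score higher and are flagged for review."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# Hypothetical daily class labels (5 classes, one year) for two buildings
rng = np.random.default_rng(6)
stable = rng.choice(5, size=365, p=[0.9, 0.04, 0.03, 0.02, 0.01])
volatile = rng.choice(5, size=365)             # uniform over the five classes
print(volatility_entropy(stable), volatility_entropy(volatile))  # low vs ~log2(5)
```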

    Diagnosing growth in low-grade gliomas with and without longitudinal volume measurements: A retrospective observational study.

    BACKGROUND: Low-grade gliomas cause significant neurological morbidity by brain invasion. There is no universally accepted objective technique available for detecting enlargement of low-grade gliomas in the clinical setting; subjective evaluation by clinicians using visual comparison of longitudinal radiological studies is the gold standard. The aim of this study is to determine whether a computer-assisted diagnosis (CAD) method helps physicians detect growth of low-grade gliomas earlier. METHODS AND FINDINGS: We reviewed 165 patients diagnosed with grade 2 gliomas, seen at the University of Alabama at Birmingham clinics from 1 July 2017 to 14 May 2018. MRI scans were collected during the spring and summer of 2018. Fifty-six gliomas met the inclusion criteria, including 19 oligodendrogliomas, 26 astrocytomas, and 11 mixed gliomas in 30 males and 26 females with a mean age of 48 years and a follow-up range of 150.2 months (difference between highest and lowest values). None received radiation therapy. We also studied 7 patients with an imaging abnormality without pathological diagnosis, who were clinically stable at the time of retrospective review (14 May 2018). This study compared growth detection by 7 physicians aided by the CAD method with retrospective clinical reports. The tumors of 63 patients (56 + 7) in 627 MRI scans were digitized, including 34 grade 2 gliomas with radiological progression and 22 radiologically stable grade 2 gliomas. The CAD method consisted of tumor segmentation, computing volumes, and pointing to growth using an online abrupt change-of-point method, which considers only past measurements. Independent scientists have evaluated the segmentation method. In 29 of the 34 patients with progression, the median time to growth detection was only 14 months for CAD compared to 44 months for current standard-of-care radiological evaluation (p < 0.001). Using CAD, accurate detection of tumor enlargement was possible with a median of only 57% change in tumor volume, as compared to a median of 174% change in volume necessary to diagnose tumor growth using standard-of-care clinical methods (p < 0.001). In the radiologically stable group, CAD facilitated growth detection in 13 out of 22 patients. CAD did not detect growth in the imaging abnormality group. The main limitation of this study was its retrospective design; nevertheless, the results depict the current state of a gold standard in clinical practice that allowed a significant increase in tumor volumes from baseline before detection. Such large increases in tumor volume would not be permitted in a prospective design. The number of glioma patients (n = 56) is a limitation; however, it is equivalent to the number of patients in phase II clinical trials. CONCLUSIONS: The current practice of visual comparison of longitudinal MRI scans is associated with significant delays in detecting growth of low-grade gliomas. Our findings support the idea that physicians aided by CAD detect growth at significantly smaller volumes than physicians using visual comparison alone. This study does not answer the questions of whether to treat and which treatment modality is optimal. Nonetheless, early growth detection sets the stage for future clinical studies that address these questions and whether early therapeutic interventions prolong survival and improve quality of life.
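
    The abstract does not spell out the online abrupt change-of-point method, so the sketch below substitutes a generic one-sided online CUSUM that, like the paper's method, uses only past measurements at each scan; the constants and volume series are hypothetical.

```python
import numpy as np

def online_growth_alarm(volumes, k=0.5, h=4.0):
    """Generic one-sided online CUSUM on standardized volume changes, using
    only past measurements at each step (a stand-in for the paper's online
    abrupt change-of-point method)."""
    g, baseline = 0.0, []
    for i, v in enumerate(volumes):
        if len(baseline) >= 3:                 # need a few scans to standardize
            mu, sd = np.mean(baseline), np.std(baseline) or 1.0
            g = max(0.0, g + (v - mu) / sd - k)
            if g > h:
                return i                       # scan index at which growth is flagged
        baseline.append(v)
    return None

# Hypothetical tumor volume series (mL): stable scans, then steady enlargement
vols = [10.1, 9.8, 10.3, 10.0, 10.2, 11.5, 12.4, 13.8, 15.1]
print(online_growth_alarm(vols))               # flags the first enlarged scan
```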