
    Compound sequential change-point detection in parallel data streams

    We consider sequential change-point detection in parallel data streams, where each stream has its own change point. Once a change is detected in a data stream, that stream is deactivated permanently. The goal is to maximize the normal operation of the pre-change streams while controlling the proportion of post-change streams among the active streams at all time points. Adopting a Bayesian formulation, we develop a compound decision framework for this problem. A procedure is proposed that is uniformly optimal among all sequential procedures that control the expected proportion of post-change streams at all time points. We also investigate the asymptotic behavior of the proposed method when the number of data streams grows large. Numerical examples are provided to illustrate the use and performance of the proposed method.
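
    The abstract describes a compound rule that keeps as many streams active as possible while the expected proportion of post-change streams among the active ones stays below a target level. A minimal sketch of one rule in that spirit is given below; it assumes per-stream posterior probabilities of having changed are already available at each time point, and the control level alpha, the greedy deactivation order, and the helper name are illustrative choices, not the paper's optimal procedure.

```python
import numpy as np

def deactivate_streams(posteriors, active, alpha=0.05):
    """Greedy compound rule (illustrative sketch, not the paper's procedure):
    deactivate the most suspicious active streams until the average posterior
    probability of a change, taken over the streams left active, is at most
    alpha, i.e. the expected proportion of post-change streams among the
    active streams is controlled at level alpha."""
    ranked = sorted((i for i, a in enumerate(active) if a),
                    key=lambda i: posteriors[i], reverse=True)
    for i in ranked:
        still_active = [j for j in ranked if active[j]]
        if not still_active:
            break
        if np.mean([posteriors[j] for j in still_active]) <= alpha:
            break  # expected post-change proportion already controlled
        active[i] = False  # permanent deactivation of stream i
    return active
```

    Calling such a rule at every time point with updated posteriors keeps the decisions monotone: once a stream is deactivated it is never reactivated, matching the "deactivated permanently" requirement in the abstract.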

    Real-time change-point detection under false discovery rate and communication constraints (Reaaliaikainen käännepisteiden havainta hylkäysvirheaste- ja kommunikaatiorajoitteilla)

    In a quickest detection problem, the objective is to detect abrupt changes in a stochastic sequence as quickly as possible while limiting the rate of false alarms. The development of algorithms that, after each observation, decide either to stop and declare that a change has happened or to continue monitoring has been an active line of research in mathematical statistics. The algorithms seek to optimally balance the inherent trade-off between the average detection delay in declaring a change and the likelihood of declaring a change prematurely. Change-point detection methods have applications in numerous domains, including monitoring the environment or the radio spectrum, target detection, financial markets, and others. Classical quickest detection theory focuses on settings where only a single data stream is observed. In modern applications, enabled by advances in sensing technology, one may be tasked with monitoring multiple streams of data for changes simultaneously. Wireless sensor networks and mobile phones are examples of technology in which devices sense their local environment and transmit data sequentially to a common fusion center (FC) or cloud for inference. When performing quickest detection tasks on multiple data streams in parallel, the classical tools of quickest detection theory, which focus on controlling the false alarm probability, may become insufficient. Instead, controlling the false discovery rate (FDR) has recently been proposed as a more useful and scalable error criterion. The FDR is the expected proportion of false discoveries (false alarms) among all discoveries. In this thesis, novel methods and theory related to quickest detection in multiple parallel data streams are presented. The methods aim to minimize detection delay while controlling the FDR. In addition, scenarios are considered in which not all of the devices communicating with the FC can remain operational and transmitting at all times. The FC must choose which subset of data streams it wants to receive observations from at a given time instant. Intelligently choosing which devices to turn on and off may extend the devices' battery life, which can be important in real-life applications, while affecting the detection performance only slightly. The performance of the proposed methods is demonstrated in numerical simulations to be superior to existing approaches. Additionally, the topic of multiple hypothesis testing in spatial domains is briefly addressed. In a multiple hypothesis testing problem, one tests multiple null hypotheses at once while trying to control a suitable error criterion, such as the FDR. In a spatial multiple hypothesis testing problem, each tested hypothesis corresponds to, e.g., a geographical location, and the non-null hypotheses may appear in spatially localized clusters. It is demonstrated that a Bayesian approach accounting for the spatial dependency between the hypotheses can greatly improve testing accuracy.
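
    As a rough illustration of the two ingredients described above, monitoring many streams in parallel and receiving observations from only a subset of devices at each time instant, here is a minimal per-stream CUSUM sketch with a greedy observation schedule. The Gaussian mean-shift model, the fixed declaration threshold, and the sensing budget are all assumptions for illustration; the thesis' actual procedures are built around FDR control, which a fixed threshold like this does not by itself provide.

```python
import numpy as np

def cusum_update(stat, x, mu0=0.0, mu1=1.0, sigma=1.0):
    # One CUSUM step for an assumed Gaussian mean shift mu0 -> mu1.
    llr = (x - (mu0 + mu1) / 2.0) * (mu1 - mu0) / sigma ** 2
    return max(0.0, stat + llr)

def monitoring_step(stats, observations, observed, declared,
                    threshold=5.0, budget=8):
    """One step of communication-constrained monitoring (illustrative sketch):
    update the statistics of the streams observed at this step, declare the
    streams whose statistic crosses `threshold`, and greedily request the
    `budget` most suspicious undeclared streams for the next step."""
    for i in observed:
        stats[i] = cusum_update(stats[i], observations[i])
    declared |= {i for i in observed if stats[i] >= threshold}
    candidates = [i for i in range(len(stats)) if i not in declared]
    return sorted(candidates, key=lambda i: stats[i], reverse=True)[:budget]
```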

    Bayesian Quickest Detection of Propagating Spatial Events

    Rapid detection of spatial events that propagate across a sensor network is of wide interest in many modern applications. In particular, in communications, radar, environmental monitoring, and biosurveillance, we may observe propagating fields or particles. In this paper, we propose Bayesian single and multiple change-point detection procedures for the rapid detection of propagating spatial events. It is assumed that the spatial event propagates across a network of sensors according to the physical properties of the source causing the event. The multi-sensor system configuration is arbitrary and sensors may be mobile. We begin by considering a single spatial event and are interested in detecting this event as quickly as possible while controlling the probability of false alarm. Using a dynamic programming framework, we derive the structure of the optimal procedure, which minimizes the average detection delay (ADD) subject to an upper bound on the false alarm probability. In the rare-event regime, the optimal procedure converges to a more practical threshold test on the posterior probability of the change point. A convenient recursive computation of this posterior probability is derived by using the propagation pattern of the spatial event. The ADD of the posterior probability threshold test is analyzed in the asymptotic regime, and a specific analysis is conducted for the setting of detecting attenuating random signals. We then show how the proposed procedure extends naturally to detecting multiple propagating spatial events in parallel. A method that provides false discovery rate (FDR) control is proposed. The simulations clearly demonstrate that exploiting the spatial properties of the event decreases the ADD compared to procedures that do not utilize this information, even under model mismatch.
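
    For context, the classical single-sensor building block behind such a posterior threshold test is the Shiryaev recursion, sketched below for a geometric change-point prior and a Gaussian mean shift. The paper's recursion additionally folds the spatial propagation pattern across sensors into the posterior update; the model parameters and the 0.99 threshold here are illustrative assumptions.

```python
import numpy as np

def shiryaev_update(p, x, rho, mu0=0.0, mu1=1.0, sigma=1.0):
    """Classical single-sensor Shiryaev update of the posterior probability
    that a change has already occurred, with a geometric(rho) change-point
    prior and an assumed Gaussian mean shift mu0 -> mu1."""
    lr = np.exp((x - (mu0 + mu1) / 2.0) * (mu1 - mu0) / sigma ** 2)
    prior = p + (1.0 - p) * rho          # prob. of a change before seeing x
    return prior * lr / (prior * lr + (1.0 - prior))

def detect(xs, rho=0.01, threshold=0.99):
    """Stop at the first time the posterior exceeds `threshold` (a level that
    would in practice be tied to the false alarm constraint)."""
    p = 0.0
    for n, x in enumerate(xs, start=1):
        p = shiryaev_update(p, x, rho)
        if p >= threshold:
            return n, p
    return None, p
```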

    Mining a Small Medical Data Set by Integrating the Decision Tree and t-test

    Although several researchers have used statistical methods to show that aspiration followed by the injection of 95% ethanol left in situ (retention) is an effective treatment for ovarian endometriomas, very few discuss the different conditions that could lead to different recovery rates for the patients. This study therefore combines a statistical method with decision tree techniques to analyze the postoperative status of ovarian endometriosis patients under different conditions. Since our collected data set is small, containing only 212 records, we use all of these data as the training data. Therefore, instead of generating rules directly from the resulting tree, we first use the value of each node as a cut point to generate all possible rules from the tree. Then, using the t-test, we verify these rules to discover useful descriptive rules. Experimental results show that our approach can find new and interesting knowledge about recurrent ovarian endometriomas under different conditions.
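
    The two-stage idea, extracting candidate cut points from a tree grown on the full data set and then screening them with t-tests, can be sketched as follows. The tree depth, significance level, and the use of scikit-learn and Welch's t-test are assumptions for illustration rather than the paper's exact setup.

```python
import numpy as np
from scipy import stats
from sklearn.tree import DecisionTreeClassifier

def candidate_rules_via_ttest(X, y, feature_names, alpha=0.05):
    """Illustrative sketch: grow a tree on the whole (small) data set, treat
    every internal-node split value as a candidate cut point, and keep only
    the cut points at which a two-sample t-test on the outcome is significant."""
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    t = tree.tree_
    rules = []
    for node in range(t.node_count):
        if t.children_left[node] == -1:       # leaf: no split to test
            continue
        f, thr = t.feature[node], t.threshold[node]
        left, right = y[X[:, f] <= thr], y[X[:, f] > thr]
        if len(left) > 1 and len(right) > 1:
            _, p = stats.ttest_ind(left, right, equal_var=False)
            if p < alpha:
                rules.append((feature_names[f], float(thr), float(p)))
    return rules
```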