
    High-Rate Vector Quantization for the Neyman-Pearson Detection of Correlated Processes

    This paper investigates the effect of quantization on the performance of the Neyman-Pearson test. It is assumed that a sensing unit observes samples of a correlated stationary ergodic multivariate process. Each sample is passed through an N-point quantizer and transmitted to a decision device which performs a binary hypothesis test. For any false-alarm level, it is shown that the miss probability of the Neyman-Pearson test converges to zero exponentially as the number of samples tends to infinity, assuming that the observed process satisfies certain mixing conditions. The main contribution of this paper is to provide a compact closed-form expression for the error exponent in the high-rate regime, i.e., when the number N of quantization levels tends to infinity, generalizing previous results of Gupta and Hero to the case of non-independent observations. If d represents the dimension of one sample, it is proved that the error exponent converges at rate N^{2/d} to the one obtained in the absence of quantization. As an application, relevant high-rate quantization strategies which lead to a large error exponent are determined. Numerical results indicate that the proposed quantization rule can yield better performance than existing ones in terms of detection error. Comment: 47 pages, 7 figures, 1 table. To appear in the IEEE Transactions on Information Theory.
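    The quantize-then-test setting of the abstract can be illustrated with a minimal Monte Carlo sketch. The sketch below uses i.i.d. Gaussian samples rather than the correlated processes treated in the paper, and the uniform quantizer, mean shift `mu`, and false-alarm level `alpha` are illustrative assumptions; it shows the miss probability of the likelihood-ratio test on quantized samples shrinking roughly exponentially in the number of samples at a fixed false-alarm level.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, n_levels, lo=-4.0, hi=4.0):
    """N-point uniform quantizer: map each sample to its cell midpoint."""
    edges = np.linspace(lo, hi, n_levels + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, n_levels - 1)
    return (edges[idx] + edges[idx + 1]) / 2

def miss_prob(n_samples, n_levels, alpha=0.05, trials=4000, mu=0.2):
    """Miss probability of the Neyman-Pearson likelihood-ratio test on
    quantized samples, H0: N(0,1) vs H1: N(mu,1), with the threshold
    chosen empirically to meet the false-alarm level alpha."""
    def llr(x):  # log-likelihood ratio, summed over the samples
        return np.sum(mu * x - mu ** 2 / 2, axis=1)
    x0 = quantize(rng.normal(0.0, 1.0, (trials, n_samples)), n_levels)
    x1 = quantize(rng.normal(mu, 1.0, (trials, n_samples)), n_levels)
    thresh = np.quantile(llr(x0), 1 - alpha)
    return np.mean(llr(x1) <= thresh)

for n in (50, 100, 200):
    print(n, miss_prob(n, n_levels=8))
```

    In this unquantized i.i.d. setting, Stein's lemma gives the limiting exponent as the Kullback-Leibler divergence mu^2/2 per sample; making `n_levels` smaller visibly degrades it, which is the loss the paper quantifies in the high-rate limit.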

    Problems in Signal Processing and Inference on Graphs

    Modern datasets are often massive due to the sharp decrease in the cost of collecting and storing data. Many are endowed with relational structure modeled by a graph, an object comprising a set of points and a set of pairwise connections between them. A "signal on a graph" has elements related to each other through a graph; it could model, for example, measurements from a sensor network. In this dissertation we study several problems in signal processing and inference on graphs. We begin by introducing an analogue of Heisenberg's time-frequency uncertainty principle for signals on graphs, using spectral graph theory and the standard extension of Fourier analysis to graphs. Our spectral graph uncertainty principle makes precise the notion that a highly localized signal on a graph must have a broad spectrum, and vice versa. Next, we consider the problem of detecting a random walk on a graph from noisy observations. We characterize the performance of the optimal detector through the (type-II) error exponent, borrowing techniques from statistical physics to develop a lower bound exhibiting a phase transition: strong performance is only guaranteed when the signal-to-noise ratio exceeds twice the random walk's entropy rate. Monte Carlo simulations show that the lower bound is quite close to the true exponent. Next, we introduce a technique for inferring the source of an epidemic from observations at a few nodes. We develop a Monte Carlo technique to simulate the infection process, and use statistics computed from these simulations to approximate the likelihood, which we then maximize to locate the source. We further introduce the autoregressive logistic model (ALARM), a simple model for binary processes on graphs that can still capture a variety of behavior. We demonstrate its simplicity by showing how to easily infer the underlying graph structure from measurements, a technique versatile enough to work even under model mismatch.
Finally, we introduce the exact formula for the error of the randomized Kaczmarz algorithm, a linear-system solver for sparse systems, which often arise in graph problems. This is important because, as we show, existing performance bounds are quite loose.
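    The randomized Kaczmarz iteration mentioned above is short enough to state directly. This is a sketch of the standard Strohmer-Vershynin variant, in which each step projects the iterate onto the hyperplane of one row chosen with probability proportional to its squared norm; it is not the dissertation's exact-error formula, and the test system is synthetic.

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=2000, seed=0):
    """Solve a consistent system A x = b by randomized row projections
    (rows sampled with probability proportional to their squared norm)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms2 = np.einsum('ij,ij->i', A, A)
    probs = row_norms2 / row_norms2.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # project x onto the hyperplane {z : A[i] @ z = b[i]}
        x += (b[i] - A[i] @ x) / row_norms2[i] * A[i]
    return x

# consistent overdetermined test system
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 10))
x_true = rng.normal(size=10)
x_hat = randomized_kaczmarz(A, A @ x_true)
print(np.linalg.norm(x_hat - x_true))  # small residual error
```

    The expected squared error contracts by roughly (1 - sigma_min(A)^2 / ||A||_F^2) per iteration, which is the kind of bound the dissertation argues is loose compared with the exact error.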

    Predictability of Extreme Events in Time Series

    In this thesis we assess the prediction of extreme events by observing precursory structures, which were identified using a maximum-likelihood approach. The main goal of this thesis is to investigate the dependence of the quality of a prediction on the magnitude of the events under study. Until now, this dependence was only sporadically reported for different phenomena, without being understood as a general feature of predictions. We propose the magnitude dependence as a property of a prediction, indicating whether larger events can be predicted better than, worse than, or as well as smaller events. Furthermore, we specify a condition that can characterize the magnitude dependence of a distinguished measure of prediction quality, the receiver operating characteristic (ROC) curve. This test condition allows us to relate the magnitude dependence of a prediction task to the joint PDF of events and precursory variables. If we are able to describe the numerical estimate of this joint PDF by an analytic expression, we can not only characterize the magnitude dependence of events observed so far, but also infer the magnitude dependence of events larger than the observed events. Having specified the test condition, we study the magnitude dependence for the prediction of increments and threshold crossings in sequences of random variables and in short- and long-range correlated stochastic processes. Depending on the distribution of the process under study, we obtain different magnitude dependences for the prediction of increments in Gaussian, exponentially, symmetrized exponentially, power-law, and symmetrized power-law distributed processes. For threshold crossings we obtain the same magnitude dependence for all distributions studied. Furthermore, we study the dependence on the event magnitude for the prediction of increments and threshold crossings in velocity increments measured in a free jet flow and in wind-speed measurements.
    Additionally, we introduce a method for post-processing the output of ensemble weather-forecast models in order to identify precursory behavior that could indicate failures of weather forecasts. We then study not only the success of this method but also its magnitude dependence. Keywords: extreme events, statistical inference, prediction via precursors, ROC curves, likelihood ratio, magnitude dependence
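    The ROC-based notion of magnitude dependence can be sketched numerically. The example below predicts increments of a Gaussian AR(1) process using the current value as precursor, and summarizes each ROC curve by its area under the curve, computed as a Mann-Whitney rank statistic. The AR coefficient and event thresholds are illustrative assumptions; in this Gaussian case the larger increments come out better predictable, matching the kind of magnitude dependence the thesis studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian AR(1) series; the "event" is a large positive increment
a, n = 0.8, 200_000
noise = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + noise[t]

def auc(eta):
    """Area under the ROC curve for predicting the event x[t+1]-x[t] > eta
    from the precursor -x[t] (a low current value favors a large increment).
    Computed as the Mann-Whitney rank statistic over all scored time steps."""
    events = (x[1:] - x[:-1]) > eta
    score = -x[:-1]
    ranks = np.argsort(np.argsort(score)) + 1.0
    n1, n0 = events.sum(), (~events).sum()
    return (ranks[events].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

for eta in (1.0, 2.0, 3.0):
    print(eta, round(auc(eta), 3))  # AUC grows with the event magnitude here
```

    An AUC of 0.5 corresponds to random guessing; the increase of the AUC with the event threshold eta is exactly a (positive) magnitude dependence of the ROC-based prediction quality.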

    Multifractals and the temporal structure of rainfall

    Rainfall is a highly non-linear hydrological process that exhibits wide variability over a broad range of time and space scales. The strongly irregular fluctuations of rain are difficult to capture instrumentally and to handle mathematically. The purpose of this work is to contribute to a better understanding of the variability of rainfall by investigating the multifractal behaviour that is present in the temporal structure of rainfall. This type of rainfall analysis is based on the invariance of properties across scales, and it takes into account the persistence of the variability of the process over a range of scales. The dissertation focuses on the analysis of point-rainfall data from 4 different locations in Europe. The data sets differ with respect to climatic origin, type of measuring device used, resolution of the data, and length of the records. The data are from recording and non-recording gauges. The highest resolution of the data is 1 minute, and the lowest is 1 month. The time span of the records varies from 4 years to 90 years. The presence of scale-invariant and multifractal properties in the rainfall process is investigated with spectral analysis, and by studying the multiple scaling of the probability distributions and statistical moments of the rainfall intensity. This study shows that the temporal structure of rainfall exhibits these properties across a wide range of scales. Within the range of scales studied, it analyzes the presence of different scaling regimes and of seasonal variation in the statistics of rainfall. The empirical multifractal scaling-exponent functions that describe the statistics of the rainfall process are derived. Special attention is given to discontinuities in the empirical scaling functions that are caused by the finite size of the samples, the divergence of moments, and the dynamic and temporal resolution of the rainfall measuring devices and data.
    The critical exponents associated with these multifractal phase transitions are studied empirically. The applicability to rainfall of a theoretical multifractal model based on Lévy stochastic variables is studied, and the adequacy of this model in describing the empirical scaling functions of rainfall is examined. Results indicate that it is possible to quantify the statistics of rainfall over a wide range of scales, and over a range of the process dynamics, using only a few parameters. For an analysis of this type, it is essential to recognize the effects of such limitations as the sample size and the type and resolution of the acquired experimental data. This dissertation shows that multifractals offer a good framework for the analysis of the temporal structure of rainfall, providing a good description of both the average and the extreme events. The expectation is that studies of this type will help in choosing suitable resolutions for data collection and in making a correct assessment of the quality of data sets.
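    The moment-scaling analysis at the heart of this approach can be sketched on a synthetic series. The code below generates a binomial multiplicative cascade, a standard toy multifractal standing in here for a rainfall-intensity series (the cascade weight `w` and depth are illustrative assumptions), and estimates the empirical scaling exponents K(q) of the coarse-grained moments, <eps_lambda^q> ~ lambda^K(q).

```python
import numpy as np

rng = np.random.default_rng(0)

# binomial multiplicative cascade: split each interval in two, multiplying
# one half by 2w and the other by 2(1-w), repeated over `levels` scales
levels, w = 14, 0.7
series = np.ones(1)
for _ in range(levels):
    m = np.where(rng.random(series.size * 2) < 0.5, 2 * w, 2 * (1 - w))
    series = np.repeat(series, 2) * m

def K(q, scales=(1, 2, 4, 8, 16, 32, 64)):
    """Empirical moment-scaling exponent K(q): slope of log <eps^q>
    against log lambda, where lambda is the scale ratio."""
    lam, mom = [], []
    for s in scales:
        coarse = series.reshape(-1, s).mean(axis=1)  # average over scale s
        lam.append(series.size / s)
        mom.append(np.mean(coarse ** q))
    return np.polyfit(np.log(lam), np.log(mom), 1)[0]

for q in (0.5, 1.0, 2.0, 3.0):
    print(q, round(K(q), 3))
```

    For this cascade the exact exponents are K(q) = log2(((2w)^q + (2-2w)^q)/2), so K(1) = 0 (the mean is conserved across scales) and K is convex; the convexity of the empirical K(q) is the multifractal signature the dissertation extracts from rainfall records.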

    Proceedings of the Third Annual Symposium on Mathematical Pattern Recognition and Image Analysis

    Topics addressed include: multivariate spline methods; normal mixture analysis applied to remote sensing; image data analysis; classification in spatially correlated environments; probability density functions; graphical nonparametric methods; subpixel registration analysis; hypothesis integration in image understanding systems; rectification of satellite scanner imagery; spatial variation in remotely sensed images; smooth multidimensional interpolation; and optimal frequency-domain textural edge-detection filters.

    Handbook of Mathematical Geosciences

    This Open Access handbook, published for the IAMG's 50th anniversary, presents a compilation of invited path-breaking research contributions by award-winning geoscientists who have been instrumental in shaping the IAMG. It contains 45 chapters categorized broadly into five parts: (i) theory, (ii) general applications, (iii) exploration and resource estimation, (iv) reviews, and (v) reminiscences, covering related topics such as mathematical geosciences, mathematical morphology, geostatistics, fractals and multifractals, spatial statistics, multipoint geostatistics, compositional data analysis, informatics, geocomputation, numerical methods, and chaos theory in the geosciences.