
    Tuning Windowed Chi-Squared Detectors for Sensor Attacks

    A model-based windowed chi-squared procedure is proposed for identifying falsified sensor measurements. We employ the widely used static chi-squared and the dynamic cumulative sum (CUSUM) fault/attack detection procedures as benchmarks against which to compare the performance of the windowed chi-squared detector. In particular, we characterize the state degradation that a class of attacks can induce in the system while ensuring that the detectors raise no alarms (zero-alarm attacks). We quantify the advantage of dynamic detectors (the windowed chi-squared and CUSUM detectors), which leverage the history of the state, over a static detector (chi-squared), which uses a single measurement at a time. Simulations of a chemical reactor are presented to illustrate the performance of our tools.
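The three detectors compared in this abstract can be made concrete with a short sketch. This is a minimal illustration, not the paper's implementation: assuming Gaussian residuals r_k with covariance Sigma, the static statistic is z_k = r_k^T Sigma^{-1} r_k, and the windowed detector sums the last W values of z_k before thresholding; the function name, window size, and threshold below are all hypothetical.

```python
import numpy as np

def windowed_chi2_detector(residuals, cov, window=5, threshold=20.0):
    """Flag time steps where the windowed chi-squared statistic exceeds a threshold.

    residuals : (T, m) array of measurement residuals r_k
    cov       : (m, m) residual covariance; z_k = r_k^T cov^{-1} r_k
    """
    cov_inv = np.linalg.inv(cov)
    # Per-step quadratic distance (the static chi-squared statistic).
    z = np.einsum('ti,ij,tj->t', residuals, cov_inv, residuals)
    alarms = []
    for k in range(len(z)):
        # Sum the last `window` distances: the dynamic, windowed statistic.
        s = z[max(0, k - window + 1):k + 1].sum()
        alarms.append(s > threshold)
    return np.array(alarms)
```

A zero-alarm attacker must keep this running sum at or below the threshold at every step, which is what bounds the state degradation it can induce.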

    Automatic spectral density estimation for random fields on a lattice via bootstrap.

    We consider the nonparametric estimation of spectral densities for second-order stationary random fields on a d-dimensional lattice. We discuss some drawbacks of standard methods and propose modified estimator classes with improved bias convergence rate, emphasizing the use of kernel methods and the choice of an optimal smoothing number. We prove the uniform consistency and study the uniform asymptotic distribution when the optimal smoothing number is estimated from the sampled data. Keywords: Spatial data; Spectral density; Smoothing number; Uniform asymptotic distribution; Bootstrap.

    Automatic spectral density estimation for random fields on a lattice via bootstrap

    This paper considers the nonparametric estimation of spectral densities for second-order stationary random fields on a d-dimensional lattice. I discuss some drawbacks of standard methods and propose modified estimator classes with improved bias convergence rate, emphasizing the use of kernel methods and the choice of an optimal smoothing number. I prove uniform consistency and study the uniform asymptotic distribution when the optimal smoothing number is estimated from the sampled data.
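As a much simplified illustration of kernel-based spectral estimation governed by a smoothing number, the following sketch smooths the raw periodogram of a single 1-D series with a Bartlett kernel. It is not the paper's lattice estimator or its bootstrap bandwidth selection; the function name and bandwidth default are hypothetical.

```python
import numpy as np

def smoothed_periodogram(x, bandwidth=3):
    """Kernel-smoothed periodogram: a basic nonparametric spectral density estimate.

    x         : 1-D array, a stationary series
    bandwidth : smoothing number; larger values give lower variance, higher bias
    """
    n = len(x)
    # Raw periodogram I(w_j) at the Fourier frequencies.
    fft = np.fft.fft(x - x.mean())
    periodogram = (np.abs(fft) ** 2) / n
    # Triangular (Bartlett) kernel weights over 2*bandwidth+1 neighbouring frequencies.
    lags = np.arange(-bandwidth, bandwidth + 1)
    weights = 1.0 - np.abs(lags) / (bandwidth + 1)
    weights /= weights.sum()
    # Tiling the periodogram makes the convolution circular in frequency.
    smoothed = np.convolve(np.tile(periodogram, 3), weights, mode='same')[n:2 * n]
    return smoothed
```

Because the kernel weights sum to one and the smoothing is circular, the estimate preserves the total power of the mean-corrected series.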

    The MVGC multivariate Granger causality toolbox: a new approach to Granger-causal inference

    Background: Wiener-Granger causality (“G-causality”) is a statistical notion of causality applicable to time series data, whereby cause precedes, and helps predict, effect. It is defined in both time and frequency domains, and allows for the conditioning out of common causal influences. Originally developed in the context of econometric theory, it has since achieved broad application in the neurosciences and beyond. Prediction in the G-causality formalism is based on VAR (Vector AutoRegressive) modelling. New Method: The MVGC Matlab Toolbox approach to G-causal inference is based on multiple equivalent representations of a VAR model by (i) regression parameters, (ii) the autocovariance sequence and (iii) the cross-power spectral density of the underlying process. It features a variety of algorithms for moving between these representations, enabling selection of the most suitable algorithms with regard to computational efficiency and numerical accuracy. Results: In this paper we explain the theoretical basis, computational strategy and application to empirical G-causal inference of the MVGC Toolbox. We also show via numerical simulations the advantages of our Toolbox over previous methods in terms of computational accuracy and statistical inference. Comparison with Existing Method(s): The standard method of computing G-causality involves estimation of parameters for both a full and a nested (reduced) VAR model. The MVGC approach, by contrast, avoids explicit estimation of the reduced model, thus eliminating a source of estimation error and improving statistical power, and in addition facilitates fast and accurate estimation of the computationally awkward case of conditional G-causality in the frequency domain. Conclusions: The MVGC Toolbox implements a flexible, powerful and efficient approach to G-causal inference. Keywords: Granger causality, vector autoregressive modelling, time series analysis.
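The standard dual-regression method mentioned under "Comparison with Existing Method(s)" — the one the MVGC Toolbox is designed to avoid — can be sketched directly to make the definition concrete. This is a pairwise, time-domain illustration only, not the Toolbox's algorithm; the function name and model order default are hypothetical.

```python
import numpy as np

def granger_causality(x, y, p=2):
    """Dual-regression, time-domain G-causality from y to x.

    Fits a reduced model (x on its own p lags) and a full model (x on lags of
    both x and y) by least squares; the statistic is the log ratio of the
    reduced to the full residual sum of squares.
    """
    T = len(x)
    target = x[p:]
    # Lagged regressors aligned with target[i] = x[p + i].
    own_lags = np.column_stack([x[p - k:T - k] for k in range(1, p + 1)])
    joint_lags = np.column_stack([own_lags] +
                                 [y[p - k:T - k] for k in range(1, p + 1)])

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        resid = target - X @ beta
        return resid @ resid

    return np.log(rss(own_lags) / rss(joint_lags))
```

Because the full model nests the reduced one, the statistic is non-negative; estimating the reduced model separately, as here, is exactly the step whose error the MVGC representation-switching approach eliminates.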

    Recurrent kernel machines: computing with infinite echo state networks

    Echo state networks (ESNs) are large, random recurrent neural networks with a single trained linear readout layer. Despite the untrained nature of the recurrent weights, they are capable of performing universal computations on temporal input data, which makes them interesting for both theoretical research and practical applications. The key to their success lies in the fact that the network computes a broad set of nonlinear, spatiotemporal mappings of the input data, on which linear regression or classification can easily be performed. One could consider the reservoir as a spatiotemporal kernel, in which the mapping to a high-dimensional space is computed explicitly. In this letter, we build on this idea and extend the concept of ESNs to infinite-sized recurrent neural networks, which can be considered recursive kernels that subsequently can be used to create recursive support vector machines. We present the theoretical framework, provide several practical examples of recursive kernels, and apply them to typical temporal tasks.
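The finite-size ESN setup this abstract starts from — random, untrained recurrent weights plus a trained linear readout — fits in a few lines. A minimal sketch under common conventions (the spectral-radius rescaling is the usual echo-state heuristic); all names and defaults are hypothetical, and this says nothing about the paper's infinite-size kernel limit.

```python
import numpy as np

def run_esn(inputs, n_reservoir=50, spectral_radius=0.9, seed=0):
    """Drive a random reservoir with a scalar input sequence; return its states.

    The recurrent weights W and input weights W_in stay random and untrained;
    only the linear readout fitted on the returned states is trained.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_reservoir, n_reservoir))
    # Rescale so the largest eigenvalue magnitude equals spectral_radius,
    # the common heuristic for the echo state property.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.standard_normal(n_reservoir)
    state = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        state = np.tanh(W @ state + W_in * u)
        states.append(state)
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Fit the linear readout by ridge regression on the collected states."""
    A = states.T @ states + ridge * np.eye(states.shape[1])
    return np.linalg.solve(A, states.T @ targets)
```

A typical use is a short-term memory task: train the readout to reproduce the input from one step back, which the reservoir state retains through its recurrence.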