
    Using Sub-optimal Kalman Filtering for Anomaly Detection in Networks

    Possibility theory can be used as a suitable framework to build a normal-behavior model for an anomaly detector. For linear and/or nonlinear systems, sub-optimal filtering approaches based on the Extended Kalman Filter and the Unscented Kalman Filter are calibrated for entropy reduction and provide a good basis for building a decision variable to which a decision process can be applied to identify anomalous events. Sophisticated fuzzy clustering algorithms can be used to find a set of clusters on the decision variable, with anomalies concentrated in a few of them. To achieve an efficient detection step, a robust decision scheme based on possibility distributions separates the clusters into normal and abnormal spaces. We study the false-alarm-rate vs. detection-rate trade-off by means of ROC (Receiver Operating Characteristic) curves, and we validate the approach over different realistic network traffic traces.
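As a hedged illustration of the innovation-based decision variable the abstract describes, the sketch below runs a plain linear Kalman filter (not the EKF/UKF variants of the paper) over a synthetic signal and thresholds the normalized innovation. All data, noise levels, and the threshold are illustrative assumptions.

```python
# Minimal sketch: a 1-D random-walk Kalman filter whose normalized
# innovation (prediction residual) serves as the anomaly decision variable.
# Parameters q, r, and the threshold 4.0 are illustrative assumptions.
import numpy as np

def kalman_innovations(z, q=1e-3, r=0.5):
    """Run a random-walk Kalman filter over z; return normalized innovations."""
    x, p = z[0], 1.0              # state estimate and its variance
    out = []
    for zk in z[1:]:
        p = p + q                 # predict step (random-walk state model)
        s = p + r                 # innovation variance
        nu = zk - x               # innovation: observation minus prediction
        out.append(nu / np.sqrt(s))
        k = p / s                 # Kalman gain
        x = x + k * nu            # update state estimate
        p = (1 - k) * p
    return np.array(out)

rng = np.random.default_rng(0)
traffic = np.cumsum(rng.normal(0, 0.1, 500))   # synthetic "normal" signal
traffic[300] += 5.0                            # injected anomalous spike
nu = kalman_innovations(traffic)
anomalies = np.where(np.abs(nu) > 4.0)[0] + 1  # simple threshold decision
```

The paper replaces this hard threshold with fuzzy clustering of the decision variable and a possibility-distribution decision scheme; the sketch only shows where that decision variable comes from.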

    A robust forecasting framework based on the Kalman filtering approach with a twofold parameter tuning procedure: Application to solar and photovoltaic prediction

    This paper presents a framework that relies on the linear dynamical Kalman filter to perform reliable prediction of solar and photovoltaic production. The method is convenient for real-time forecasting, and we describe its use for different time horizons, between one minute and one hour ahead. The dataset consists of measurements of solar irradiance and PV power production collected in a subtropical zone, Guadeloupe. In this zone, fluctuating meteorological conditions can occur, with highly variable atmospheric events having a severe impact on the solar irradiance and the PV power. Under such conditions, heterogeneous ramp events are observed, making it difficult to control and manage these sources of energy. The present work aims to build a suitable statistical method, based on Bayesian inference and state-space modeling, able to predict the evolution of solar radiation and PV production. We develop a forecast method based on the Kalman filter combined with a robust parameter estimation procedure built with an autoregressive model or with an Expectation-Maximisation algorithm. The model is built to run with univariate or multivariate data according to their availability. It is used here to forecast the univariate solar and PV data, and also PV with exogenous data such as cloud cover and air temperature. The accuracy of this technique is assessed with a set of performance criteria including the root mean square error and the mean bias error. We compare the results of the different tests performed, from one minute to one hour ahead, with the simple persistence model. Our technique far exceeds the persistence model, with a skill score improvement of around 39% and 31% for one-hour-ahead forecasts of PV production and GHI, respectively.
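A hedged sketch of the idea: estimate the state-transition coefficient of a scalar state-space model from the data (a plain AR(1) least-squares fit stands in here for the paper's twofold tuning procedure), issue one-step-ahead Kalman forecasts, and compare their RMSE with the persistence baseline. The data and noise levels are synthetic assumptions, not the Guadeloupe measurements.

```python
# Kalman one-step-ahead forecasting with a data-driven AR(1) transition
# coefficient, benchmarked against persistence. Everything here is a
# synthetic illustration of the approach, not the paper's exact method.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
y = np.zeros(n)
for t in range(1, n):                        # synthetic AR(1) "irradiance"
    y[t] = 0.8 * y[t - 1] + rng.normal(0, 1.0)

phi = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])   # AR(1) coefficient estimate

def one_step_forecasts(y, phi, q=1.0, r=0.1):
    x, p = y[0], 1.0
    preds = []
    for obs in y[1:]:
        x_pred, p_pred = phi * x, phi**2 * p + q   # predict
        preds.append(x_pred)                       # forecast before seeing obs
        k = p_pred / (p_pred + r)                  # update with new observation
        x = x_pred + k * (obs - x_pred)
        p = (1 - k) * p_pred
    return np.array(preds)

kf_pred = one_step_forecasts(y, phi)
rmse_kf = np.sqrt(np.mean((y[1:] - kf_pred) ** 2))
rmse_persist = np.sqrt(np.mean((y[1:] - y[:-1]) ** 2))
skill = 1 - rmse_kf / rmse_persist           # positive => beats persistence
```

On this synthetic series the tuned filter beats persistence because it shrinks the forecast toward the mean by the estimated coefficient; the paper reports analogous skill-score gains on real irradiance and PV data.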

    A Robust Anomaly Detection Technique Using Combined Statistical Methods

    Parametric anomaly detection is generally a three-step process: in the first step, a model of normal behavior is calibrated, and the obtained model is then used to reduce the entropy of the observations. The second step generates an innovation process that is used in the third step to decide whether or not an anomaly is present in the observed data. Under favorable conditions the innovation process is expected to be Gaussian white noise. In practice, however, this is rarely the case, as the observed signals are frequently not Gaussian themselves. Moreover, long-range dependence, as well as heavy tails in the observations, can cause significant deviations from normality and independence in the innovation process. As a result, decisions made under the assumption that the innovation process is white and Gaussian frequently yield a large false-positive rate. In this paper we address this issue. Our approach no longer assumes that the innovation process is Gaussian and white. Instead, we assume that the real distribution of the process is a mixture of Gaussians and that there is some time dependence in the innovations, which we capture using a Hidden Markov Model. We derive a new decision process and show that this approach results in a significant decrease in false alarm rates. We validate the approach over realistic traces.
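A hedged sketch of the mixture part of this idea: instead of assuming the innovation process is Gaussian white noise, fit a two-component Gaussian mixture to it with a small EM loop and declare an anomaly when a sample is unlikely under the fitted mixture. The paper additionally models time dependence with an HMM, which this sketch omits; the data, initialization, and 1% cutoff are illustrative assumptions.

```python
# EM for a 1-D, 2-component Gaussian mixture fitted to synthetic
# "innovations" (narrow bulk plus a heavier-tailed minority component),
# with low-likelihood samples flagged as anomalous.
import numpy as np

def gaussian_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def fit_gmm2(x, iters=50):
    """EM for a two-component 1-D Gaussian mixture (scale-mixture init)."""
    w = np.array([0.5, 0.5])
    mu = np.zeros(2)                          # assumes zero-mean innovations
    var = np.array([0.1, 2.0]) * x.var()      # one narrow, one wide component
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        resp = w * np.stack([gaussian_pdf(x, mu[j], var[j]) for j in range(2)], axis=1)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

rng = np.random.default_rng(2)
innov = np.concatenate([rng.normal(0, 1, 900), rng.normal(0, 4, 100)])
w, mu, var = fit_gmm2(innov)
density = sum(w[j] * gaussian_pdf(innov, mu[j], var[j]) for j in range(2))
anomalous = innov[density < np.quantile(density, 0.01)]  # 1% least likely
```

The fitted components separate by scale (one variance near the bulk's, one much larger), which is exactly the non-Gaussian structure a single-Gaussian whiteness test would miss.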