    A belief function theory based approach to combining different representations of uncertainty in prognostics

    In this work, we consider two prognostic approaches for predicting the remaining useful life (RUL) of degrading equipment. The first approach is based on Gaussian Process Regression (GPR) and provides the probability distribution of the equipment RUL; the second adopts a Similarity-Based Regression (SBR) method for the RUL prediction and belief function theory for modeling the uncertainty of the prediction. The performance of the two approaches is comparable, and we propose a method for combining their outcomes in an ensemble. The least commitment principle is adopted to transform the RUL probability density function supplied by the GPR method into a belief density function. Dempster's rule is then used to aggregate the belief assignments provided by the GPR and SBR approaches. The ensemble method is applied to the problem of predicting the RUL of filters used to clean the sea water entering the condenser of the boiling water reactor (BWR) in a Swedish nuclear power plant. The results of the ensemble method are more satisfactory than those provided by the individual GPR and SBR approaches from the point of view of the representation of the uncertainty in the RUL prediction.
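    On a discretized frame of discernment, the aggregation step described above reduces to Dempster's rule of combination. Below is a minimal Python sketch of that rule; the three RUL bins, the `dempster_combine` helper, and the example mass assignments are illustrative assumptions, not the paper's actual discretization or code.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset focal elements
    to masses) with Dempster's rule, normalizing out the conflict."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("Sources are totally conflicting")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Toy example: RUL discretized into three bins (illustrative only).
short, medium, long_ = "RUL<10", "10<=RUL<20", "RUL>=20"
m_gpr = {frozenset({short}): 0.2,
         frozenset({short, medium}): 0.5,
         frozenset({short, medium, long_}): 0.3}
m_sbr = {frozenset({medium}): 0.6,
         frozenset({medium, long_}): 0.4}
print(dempster_combine(m_gpr, m_sbr))
```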

    Continuous belief functions and α-stable distributions

    The theory of belief functions has been formalized in the continuous domain for pattern recognition. Some applications rely on the assumption of Gaussian models, but this assumption is restrictive: some data are not symmetric and exhibit heavy tails. These problems can be addressed by using a class of distributions called α-stable distributions. Consequently, in this paper we present a way to calculate pignistic probabilities from plausibility functions when the knowledge of the sources of information is represented by symmetric α-stable distributions. To validate our approach, we compare our results in the special case of Gaussian distributions with existing methods. To illustrate our work, we generate arbitrary distributions representing the speeds of planes and make decisions on them. A comparison with a Bayesian approach is made to show the interest of the theory of belief functions.
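    As a rough illustration of the two ingredients mentioned in the abstract, the sketch below shows a discrete pignistic transform and a heavy-tailed α-stable density evaluated with SciPy's `levy_stable`; the frame, the mass values, and the chosen α are assumptions for illustration, not the authors' continuous formulation.

```python
import numpy as np
from scipy.stats import levy_stable, norm

def pignistic(m):
    """Pignistic transform BetP: spread each focal element's mass
    uniformly over its singletons (discrete frame, no mass on the empty set)."""
    betp = {}
    for focal, mass in m.items():
        for omega in focal:
            betp[omega] = betp.get(omega, 0.0) + mass / len(focal)
    return betp

# Discrete example on a small frame of discernment.
m = {frozenset({"a"}): 0.4, frozenset({"a", "b"}): 0.6}
print(pignistic(m))  # {'a': 0.7, 'b': 0.3}

# Heavy-tailed vs. Gaussian source model: an alpha-stable law with alpha < 2
# has heavier tails than the Gaussian case (alpha = 2).
x = np.linspace(-6, 6, 241)
heavy = levy_stable.pdf(x, 1.5, 0.0)   # shape parameters: alpha=1.5, beta=0
gauss = norm.pdf(x, scale=np.sqrt(2))  # alpha=2 stable with unit scale is N(0, 2)
print(heavy[-1], gauss[-1])            # the stable tail decays much more slowly
```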

    Estimation of a mixture of alpha-stable distributions using the EM algorithm

    The Gaussian model is used in many applications. However, this assumption is restrictive. For example, the data provided by sensors may not be symmetric and/or may show a rapid decay in the tail of the distribution. Moreover, the probability density representing the data is rarely unimodal. Algorithms exist for estimating a mixture of distributions; in particular, the Expectation-Maximization (EM) algorithm can estimate a mixture of Gaussian distributions. In this paper we propose to extend the EM algorithm to estimate a mixture of α-stable distributions. A future objective of this work is to apply the notion of continuous belief functions when the information provided by the sources can be modeled by a mixture of α-stable probability densities.
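    For readers unfamiliar with the baseline being extended, here is a minimal EM sketch for a one-dimensional Gaussian mixture; extending it to α-stable components, as the paper proposes, would replace the closed-form Gaussian density and M-step with numerical α-stable density evaluation and parameter estimation. The function name, initialization, and toy data are illustrative assumptions.

```python
import numpy as np

def em_gaussian_mixture(x, k, n_iter=100, seed=0):
    """Minimal EM for a 1-D Gaussian mixture: the baseline that the
    abstract proposes to extend to alpha-stable components."""
    rng = np.random.default_rng(seed)
    w = np.full(k, 1.0 / k)                      # mixing weights
    mu = rng.choice(x, size=k, replace=False)    # component means
    var = np.full(k, np.var(x))                  # component variances
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        d = x[:, None] - mu[None, :]
        pdf = np.exp(-0.5 * d**2 / var) / np.sqrt(2 * np.pi * var)
        r = w * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        n_j = r.sum(axis=0)
        w = n_j / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n_j
        var = (r * (x[:, None] - mu[None, :])**2).sum(axis=0) / n_j
    return w, mu, var

# Toy bimodal data drawn from two Gaussians.
x = np.concatenate([np.random.default_rng(1).normal(-2.0, 1.0, 500),
                    np.random.default_rng(2).normal(3.0, 0.5, 500)])
print(em_gaussian_mixture(x, k=2))
```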

    Multi-sensing Data Fusion: Target tracking via particle filtering

    In this Master's thesis, Multi-sensing Data Fusion is first introduced with a focus on perception and on the concepts this work builds on, such as the mathematical tools that make it possible. Particle filters are one class of these tools; they allow a computer to fuse numerical information perceived from the real environment by sensors. For this reason they are described, and state-of-the-art mathematical formulations and algorithms for particle filtering are also presented. At the core of this project, a simple piece of software has been developed in order to test these tools in practice. More specifically, a Target Tracking Simulator is presented in which a virtual trackable object can move freely in a 2-dimensional simulated environment, while distributed sensor agents, dispersed in the same environment, perceive the object through a state-dependent measurement affected by additive Gaussian noise. Each sensor employs particle filtering, along with communication with neighboring sensors, to update the perceived state of the object and track it as it moves in the environment. The combination of the Java and AgentSpeak languages is used as the platform for the development of this application.
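    A minimal bootstrap particle filter for the setting described above (a 2-D target observed through position measurements with additive Gaussian noise) might look as follows; this is a Python sketch of the algorithm class the thesis uses, not its Java/AgentSpeak implementation, and the motion model, noise levels, and function name are assumptions.

```python
import numpy as np

def bootstrap_particle_filter(measurements, n_particles=1000,
                              process_std=0.5, meas_std=1.0, seed=0):
    """Minimal bootstrap particle filter for a 2-D random-walk target
    observed through noisy position measurements."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(measurements[0], meas_std, size=(n_particles, 2))
    estimates = []
    for z in measurements:
        # Predict: propagate particles through the (random-walk) motion model.
        particles += rng.normal(0.0, process_std, size=particles.shape)
        # Update: weight particles by the Gaussian measurement likelihood.
        sq_err = ((particles - z) ** 2).sum(axis=1)
        weights = np.exp(-0.5 * sq_err / meas_std**2)
        weights /= weights.sum()
        # Estimate: weighted mean of the particle cloud.
        estimates.append(weights @ particles)
        # Resample (multinomial) to focus particles on likely states.
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return np.array(estimates)

# Simulated track: straight-line motion observed with Gaussian noise.
rng = np.random.default_rng(42)
truth = np.cumsum(np.full((50, 2), 0.3), axis=0)
obs = truth + rng.normal(0.0, 1.0, size=truth.shape)
est = bootstrap_particle_filter(obs)
print(np.mean(np.linalg.norm(est - truth, axis=1)))  # mean tracking error
```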

    PHYSTAT-LHC Workshop on Statistical Issues for LHC Physics

    A PHYSTAT workshop on the topic of Statistical issues for LHC physics was held at CERN. The workshop focused on issues related to discovery that we hope will be relevant to the LHC. These proceedings contain written versions of nearly all the talks, several of which were given by professional statisticians. The talks varied from general overviews to those describing searches for specific particles. The treatment of background uncertainties figured prominently. Many of the talks describing search strategies for new effects should be of interest not only to particle physicists but also to scientists in other fields.

    Dynamic Programming and Bayesian Inference

    Dynamic programming and Bayesian inference have both been intensively and extensively developed during recent years. Because of these developments, interest in dynamic programming and Bayesian inference and their applications has greatly increased at all mathematical levels. The purpose of this book is to provide some applications of Bayesian optimization and dynamic programming.

    Context-dependent fusion with application to landmine detection.

    Traditional machine learning and pattern recognition systems use a feature descriptor to describe the sensor data and a particular classifier (also called an expert or learner) to determine the true class of a given pattern. However, for complex detection and classification problems involving data with large intra-class variations and noisy inputs, no single source of information can provide a satisfactory solution. As a result, the combination of multiple classifiers is playing an increasing role in solving these complex pattern recognition problems and has proven to be a viable alternative to using a single classifier. In this thesis we introduce a new Context-Dependent Fusion (CDF) approach and use it to fuse multiple algorithms that use different types of features and different classification methods on multi-sensor data. The proposed approach is motivated by the observation that no single algorithm can consistently outperform all other algorithms; in fact, the relative performance of different algorithms can vary significantly depending on several factors, such as the extracted features and the characteristics of the target class. CDF is a local approach that adapts the fusion method to different regions of the feature space. The goal is to take advantage of the strengths of a few algorithms in different regions of the feature space without being affected by the weaknesses of the other algorithms, while also avoiding the loss of potentially valuable information provided by a few weak classifiers by considering their output as well. The proposed fusion has three main interacting components. The first component, called Context Extraction, partitions the composite feature space into groups of similar signatures, or contexts. The second component assigns an aggregation weight to each detector's decision in each context based on its relative performance within the context. The third component combines the multiple decisions, using the learned weights, to make a final decision. For the Context Extraction component, a novel algorithm that performs clustering and feature discrimination is used to cluster the composite feature space and identify the relevant features for each cluster. For the fusion component, six different methods were proposed and investigated. The proposed approaches were applied to the problem of landmine detection. Detection and removal of landmines is a serious problem affecting civilians and soldiers worldwide. Several landmine detection algorithms have been proposed, and extensive testing has shown that the relative performance of different detectors can vary significantly depending on the mine type, geographical site, soil and weather conditions, burial depth, etc. Therefore, multi-algorithm and multi-sensor fusion is a critical component in landmine detection. Results on large and diverse real data collections show that the proposed method can identify meaningful and coherent clusters and that different expert algorithms can be identified for the different contexts. Our experiments have also indicated that context-dependent fusion outperforms all individual detectors and several global fusion methods.
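    A compact sketch of the three-component CDF idea (context extraction, per-context weighting, weighted combination) is given below; the concrete choices of k-means clustering, accuracy-based weights, and a weighted vote are illustrative assumptions standing in for the thesis's clustering-and-feature-discrimination algorithm and its six fusion methods.

```python
import numpy as np
from sklearn.cluster import KMeans

class ContextDependentFusion:
    """Illustrative sketch of context-dependent fusion: partition the
    feature space into contexts, score each detector per context, and
    fuse detector outputs with context-specific weights."""

    def __init__(self, n_contexts=5):
        self.kmeans = KMeans(n_clusters=n_contexts, n_init=10, random_state=0)

    def fit(self, features, detector_scores, labels):
        # features: (n, d) composite feature vectors
        # detector_scores: (n, m) confidences in [0, 1] from m detectors
        # labels: (n,) ground truth in {0, 1}
        contexts = self.kmeans.fit_predict(features)
        n_ctx, m = self.kmeans.n_clusters, detector_scores.shape[1]
        self.weights = np.full((n_ctx, m), 1.0 / m)
        for c in range(n_ctx):
            mask = contexts == c
            if not mask.any():
                continue
            # Per-context accuracy of each detector at a 0.5 threshold.
            acc = ((detector_scores[mask] > 0.5) == labels[mask, None]).mean(axis=0)
            self.weights[c] = acc / acc.sum() if acc.sum() > 0 else 1.0 / m
        return self

    def predict(self, features, detector_scores):
        contexts = self.kmeans.predict(features)
        w = self.weights[contexts]                  # (n, m) per-sample weights
        return (w * detector_scores).sum(axis=1)    # fused confidence per sample
```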