
    Distributed estimation from relative measurements of heterogeneous and uncertain quality

    This paper studies the problem of estimation from relative measurements in a graph, in which a vector indexed over the nodes has to be reconstructed from pairwise measurements of the differences between its components associated with nodes connected by an edge. To model the heterogeneity and uncertainty of the measurements, we assume they are affected by additive noise distributed according to a Gaussian mixture. In this original setup, we formulate the problem of computing the Maximum-Likelihood (ML) estimates and design two novel algorithms, based on Least Squares regression and Expectation-Maximization (EM). The first algorithm (LS-EM) is centralized and performs the estimation from relative measurements, the soft classification of the measurements, and the estimation of the noise parameters. The second algorithm (Distributed LS-EM) is distributed and performs estimation and soft classification of the measurements, but requires knowledge of the noise parameters. We provide rigorous proofs of convergence for both algorithms and present numerical experiments to evaluate and compare their performance with classical solutions. The experiments show the robustness of the proposed methods against different kinds of noise and, for the Distributed LS-EM, against errors in the assumed noise parameters.
    Comment: Submitted to IEEE transaction
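
    Though not the authors' code, the following minimal sketch may help fix ideas: it assumes a zero-mean two-component Gaussian mixture on the edge noise and alternates the soft classification of the measurements (E-step) with a weighted least-squares update of the node vector (M-step), in the spirit of the LS-EM iteration described above.

```python
import numpy as np

def ls_em(edges, z, variances, weights, n_nodes, n_iter=50):
    """Toy LS-EM loop (sketch): z[k] ~ x[i] - x[j] + noise for edge (i, j),
    with the noise drawn from a zero-mean Gaussian mixture (e.g. a "good"
    low-variance and a "bad" high-variance component). Alternates soft
    classification of the measurements with a weighted least-squares update
    of x. Not the paper's algorithm, just an illustration of the idea."""
    A = np.zeros((len(edges), n_nodes))                  # edge-node incidence matrix
    for k, (i, j) in enumerate(edges):
        A[k, i], A[k, j] = 1.0, -1.0
    var = np.asarray(variances, dtype=float)[:, None]    # shape (components, 1)
    w = np.asarray(weights, dtype=float)[:, None]
    x = np.zeros(n_nodes)
    for _ in range(n_iter):
        r = z - A @ x                                    # residual per edge
        lik = w * np.exp(-r**2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = lik / lik.sum(axis=0)                     # E-step: soft classification
        prec = (resp / var).sum(axis=0)                  # per-edge effective precision
        sw = np.sqrt(prec)
        x, *_ = np.linalg.lstsq(sw[:, None] * A, sw * z, rcond=None)  # M-step
        x -= x[0]                # only differences are observed: anchor node 0
    return x
```

    Anchoring one node is necessary because only differences are measured, so the node vector is identifiable only up to a common additive constant.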

    Bias estimation in sensor networks

    This paper investigates the problem of estimating biases affecting relative state measurements in a sensor network. Each sensor measures the relative states of its neighbors, and this measurement is corrupted by a constant bias. We analyse under what conditions on the network topology and on the maximum number of biased sensors the biases can be correctly estimated. We show that for non-bipartite graphs the biases can always be determined even when all the sensors are corrupted, while for bipartite graphs more than half of the sensors should be unbiased to ensure the correctness of the bias estimation. If the biases are heterogeneous, then the number of unbiased sensors can be reduced to two. Based on these conditions, we propose some algorithms to estimate the biases.
    Comment: 12 pages, 8 figures
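
    Since the identifiability conditions above hinge on whether the measurement graph is bipartite, a standard check may be useful context. The helper below (not code from the paper) tests bipartiteness by BFS 2-coloring, i.e. by searching for an odd cycle.

```python
from collections import deque

def is_bipartite(n_nodes, edges):
    """Return True if the undirected graph admits a 2-coloring, i.e. contains
    no odd cycle. Generic helper illustrating the topological condition
    discussed above, not code from the paper."""
    adj = [[] for _ in range(n_nodes)]
    for i, j in edges:
        adj[i].append(j)
        adj[j].append(i)
    color = [None] * n_nodes
    for start in range(n_nodes):
        if color[start] is not None:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if color[v] is None:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:
                    return False        # odd cycle found: graph is not bipartite
    return True
```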

    Probabilistic Inference from Arbitrary Uncertainty using Mixtures of Factorized Generalized Gaussians

    This paper presents a general and efficient framework for probabilistic inference and learning from arbitrary uncertain information. It exploits the calculation properties of finite mixture models, conjugate families and factorization. Both the joint probability density of the variables and the likelihood function of the (objective or subjective) observation are approximated by a special mixture model, in such a way that any desired conditional distribution can be directly obtained without numerical integration. We have developed an extended version of the expectation maximization (EM) algorithm to estimate the parameters of mixture models from uncertain training examples (indirect observations). As a consequence, any piece of exact or uncertain information about both input and output values is consistently handled in the inference and learning stages. This ability, extremely useful in certain situations, is not found in most alternative methods. The proposed framework is formally justified from standard probabilistic principles and illustrative examples are provided in the fields of nonparametric pattern classification, nonlinear regression and pattern completion. Finally, experiments on a real application and comparative results over standard databases provide empirical evidence of the utility of the method in a wide range of applications.
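
    As a rough illustration of why conditionals of such mixtures require no numerical integration, the sketch below conditions a mixture of axis-aligned (factorized) Gaussian components on exact observations of a subset of coordinates: each component factorizes, so the conditional is again a mixture in which only the mixing weights change. It uses ordinary Gaussians rather than the paper's generalized Gaussians, and exact rather than uncertain observations.

```python
import numpy as np

def conditional_mixture(x_obs, obs_idx, query_idx, weights, means, variances):
    """Condition a mixture of factorized (diagonal-covariance) Gaussians on
    exact values of the coordinates in obs_idx. Because each component
    factorizes over coordinates, the conditional over query_idx keeps the
    component marginals and only re-scales the mixing weights by the
    likelihood of the observed block. Sketch only, not the paper's model."""
    x_obs = np.asarray(x_obs, dtype=float)
    means = np.asarray(means, dtype=float)          # shape (K, D)
    variances = np.asarray(variances, dtype=float)  # shape (K, D)
    weights = np.asarray(weights, dtype=float)      # shape (K,)
    diff = x_obs - means[:, obs_idx]
    log_lik = -0.5 * np.sum(diff**2 / variances[:, obs_idx]
                            + np.log(2 * np.pi * variances[:, obs_idx]), axis=1)
    post = weights * np.exp(log_lik - log_lik.max())
    post /= post.sum()                              # new mixing weights
    return post, means[:, query_idx], variances[:, query_idx]
```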

    Debates—Stochastic subsurface hydrology from theory to practice: why stochastic modeling has not yet permeated into practitioners?

    This is the peer reviewed version of the following article: [Sanchez-Vila, X., and D. Fernàndez-Garcia (2016), Debates—Stochastic subsurface hydrology from theory to practice: Why stochastic modeling has not yet permeated into practitioners?, Water Resour. Res., 52, 9246–9258, doi:10.1002/2016WR019302], which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1002/2016WR019302/abstract. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
    We address modern topics of stochastic hydrogeology from the standpoint of their potential relevance to real modeling efforts at the field scale. While stochastic hydrogeology and numerical modeling have become routine in hydrogeological studies, nondeterministic models have not yet permeated into practice. We point out a number of limitations of stochastic modeling when applied to real problems and comment on the reasons why stochastic models fail to become an attractive alternative for practitioners. We specifically separate issues corresponding to flow, conservative transport, and reactive transport. The topics addressed are the emphasis on process modeling, the need for upscaling parameters and governing equations, the relevance of properly accounting for detailed geological architecture in hydrogeological modeling, and the specific challenges of reactive transport. We conclude that the main responsibility for nondeterministic models not yet having permeated into industry can be fully attributed to researchers in stochastic hydrogeology.
    Peer Reviewed. Postprint (author's final draft)

    Landslide risk management through spatial analysis and stochastic prediction for territorial resilience evaluation

    Natural materials, such as soils, are influenced by many factors acting during their formation and evolution: atmospheric agents, erosion and transport phenomena, and sedimentation conditions give soil properties a randomness that cannot be removed even with sophisticated survey techniques and technologies. This character is reflected not only in the spatial variability of properties, which differ from point to point, but also in their multivariate correlation as a function of mutual distance. The additional knowledge offered by the response of soils, together with their intrinsic spatial variability, improves the ability to evaluate the contributing causes and potential effects of failure phenomena. Stability analysis of natural slopes is therefore well suited to a stochastic treatment of the uncertainty that characterizes landslide risk. In particular, this study applies a back-analysis procedure to a slope located in Southern Italy that has been subject to repeated episodes of hydrogeological instability (extending over several kilometres in recent years). The back-analysis has been carried out by applying spatial analysis to the controlling factors as well as by quantifying the hydrogeological hazard through unbiased estimators. Treating the natural phenomenon as a stochastic process characterized by mutually interacting spatial variables has made it possible to identify the most critical areas, lending reliability to the scenarios and improving their forecasting content. Moreover, the phenomenological characterization allows risk levels to be optimized over the wide territory involved, supporting the decision-making process for intervention priorities as well as the effective allocation of available resources in social, environmental and economic contexts.
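
    As background for the spatial-correlation modelling mentioned above, the following is a generic empirical-variogram sketch (a standard geostatistical tool, not the authors' procedure): it summarizes how half the squared difference of a soil property grows with the separation distance between sample points.

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Empirical semivariogram: half the mean squared difference of a spatial
    variable, binned by the separation distance between sample points.
    Textbook sketch of how correlation-with-distance can be quantified."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    i, j = np.triu_indices(len(values), k=1)         # all distinct point pairs
    dist = np.linalg.norm(coords[i] - coords[j], axis=1)
    sqdiff = 0.5 * (values[i] - values[j]) ** 2
    lags, gamma = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (dist >= lo) & (dist < hi)
        if mask.any():
            lags.append(dist[mask].mean())
            gamma.append(sqdiff[mask].mean())
    return np.array(lags), np.array(gamma)
```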