
    Evidential-EM Algorithm Applied to Progressively Censored Observations

    The Evidential-EM (E2M) algorithm is an effective approach for computing maximum likelihood estimates under finite mixture models, especially when there is uncertain information about the data. In this paper we present an extension of the E2M method to a particular case of incomplete data, where the loss of information is due both to the mixture model and to censored observations. The prior uncertain information is expressed by belief functions, while the pseudo-likelihood function is derived from the imprecise observations and the prior knowledge. The E2M method is then invoked to maximize the generalized likelihood function and obtain the optimal parameter estimates. Numerical examples show that the proposed method can effectively integrate uncertain prior information with the imprecise knowledge conveyed by the observed data.
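The E2M algorithm generalizes the classical EM E-step to belief functions and censored data; as background for the abstract above, here is a minimal sketch of the classical, fully observed case it extends: EM for a two-component 1-D Gaussian mixture. All names and the synthetic data are illustrative, and the belief-function and censoring machinery of the paper is deliberately omitted.

```python
import numpy as np

def em_gaussian_mixture(x, n_iter=50):
    """Classical EM for a two-component 1-D Gaussian mixture.

    The paper's E2M method replaces the E-step below with one based on
    belief functions over imprecise / censored observations; this sketch
    covers only the standard fully observed setting.
    """
    # Initialise: mixing weight, well-separated means, common variance.
    pi = 0.5
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step: posterior responsibility of component 0 for each point.
        p0 = pi * np.exp(-(x - mu[0]) ** 2 / (2 * var[0])) / np.sqrt(var[0])
        p1 = (1 - pi) * np.exp(-(x - mu[1]) ** 2 / (2 * var[1])) / np.sqrt(var[1])
        r = p0 / (p0 + p1)
        # M-step: re-estimate parameters from the responsibilities.
        pi = r.mean()
        mu = np.array([np.average(x, weights=r),
                       np.average(x, weights=1 - r)])
        var = np.array([np.average((x - mu[0]) ** 2, weights=r),
                        np.average((x - mu[1]) ** 2, weights=1 - r)])
    return pi, mu, var

# Synthetic data: two clusters centred near 0 and 5.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(5.0, 1.0, 500)])
pi, mu, var = em_gaussian_mixture(x)
```

With this initialisation the estimated means converge near the true cluster centres and the mixing weight near 0.5; the pseudo-likelihood maximized by E2M reduces to the ordinary likelihood used here when the observations are precise.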

    Certainty Closure: Reliable Constraint Reasoning with Incomplete or Erroneous Data

    Constraint Programming (CP) has proved an effective paradigm to model and solve difficult combinatorial satisfaction and optimisation problems from disparate domains. Many such problems arising from the commercial world are permeated by data uncertainty. Existing CP approaches that accommodate uncertainty are less suited to uncertainty arising from incomplete and erroneous data, because they do not build reliable models and solutions guaranteed to address the user's genuine problem as she perceives it. Other fields such as reliable computation offer combinations of models and associated methods to handle these types of uncertain data, but lack an expressive framework characterising the resolution methodology independently of the model. We present a unifying framework that extends the CP formalism in both model and solutions, to tackle ill-defined combinatorial problems with incomplete or erroneous data. The certainty closure framework brings together modelling and solving methodologies from different fields into the CP paradigm to provide reliable and efficient approaches for uncertain constraint problems. We demonstrate the applicability of the framework on a case study in network diagnosis. We define resolution forms that give generic templates, and their associated operational semantics, to derive practical solution methods for reliable solutions.
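The core idea above, a solution is reliable only if it satisfies the constraints under every plausible realisation of the uncertain data, can be reduced to finite enumeration. The following sketch is an illustrative simplification of the certainty-closure idea, not the paper's resolution forms; the constraint and domains are hypothetical.

```python
from itertools import product

def certain_solutions(candidates, uncertain_data, constraint):
    """Return the candidate assignments that satisfy the constraint under
    EVERY realisation of the uncertain data.

    candidates     -- iterable of decision values to test
    uncertain_data -- list of finite domains, one per uncertain datum
    constraint     -- predicate constraint(x, d1, d2, ...) -> bool

    This brute-force closure is only feasible for small finite domains;
    the paper derives practical methods via resolution forms instead.
    """
    return [x for x in candidates
            if all(constraint(x, *realisation)
                   for realisation in product(*uncertain_data))]

# Hypothetical example: enforce x + d <= 10, where the erroneous
# measurement d is only known to lie in {2, 3, 4}.
sols = certain_solutions(range(11), [[2, 3, 4]],
                         lambda x, d: x + d <= 10)
```

Here only x <= 6 is reliable: larger values satisfy the constraint for some readings of d but are not guaranteed to address the true problem.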

    Fuzzy heterogeneous neurons for imprecise classification problems

    In the classical neuron model, inputs are continuous real-valued quantities. However, in many important real-world domains, objects are described by a mixture of continuous and discrete variables, usually containing missing information and uncertainty. In this paper, a general class of neuron models is presented that accepts heterogeneous inputs in the form of mixtures of continuous (crisp and/or fuzzy) and discrete quantities, admitting missing data. From these, several particular models can be derived as instances and different neural architectures constructed with them. Such models deal in a natural way with problems for which information is imprecise or even missing. Their possibilities in classification and diagnostic problems are illustrated here by experiments with data from a real-world domain in the field of environmental studies. These experiments show that such neurons can both learn and classify complex data very effectively in the presence of uncertain information.
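A minimal sketch of such a heterogeneous neuron, under simplifying assumptions not taken from the paper: similarity to a weight pattern is aggregated per input, categorical inputs are compared by equality, numeric inputs (assumed scaled to [0, 1]) by absolute difference, and missing components are simply skipped. The function name, inputs, and activation are all illustrative.

```python
import math

def heterogeneous_neuron(x, w, kinds):
    """One similarity-based neuron over mixed continuous/discrete inputs.

    kinds[i] is 'num' (continuous, assumed scaled to [0, 1]) or 'cat'
    (discrete).  None marks a missing value; skipping it is a crude
    stand-in for the paper's principled treatment of missing data.
    """
    total, count = 0.0, 0
    for xi, wi, kind in zip(x, w, kinds):
        if xi is None or wi is None:
            continue  # ignore missing components entirely
        if kind == 'num':
            total += 1.0 - abs(xi - wi)      # closer values -> higher similarity
        else:
            total += 1.0 if xi == wi else 0.0  # exact categorical match
        count += 1
    s = total / count if count else 0.0       # mean partial similarity
    # Squashing activation centred on similarity 0.5.
    return 1.0 / (1.0 + math.exp(-6.0 * (s - 0.5)))

# Mixed input with a missing numeric value and a matching category.
out = heterogeneous_neuron([0.8, None, 'forest'],
                           [0.7, 0.2, 'forest'],
                           ['num', 'num', 'cat'])
```

Because the neuron outputs a degree of match rather than a dot product, imprecise or absent inputs degrade the activation gracefully instead of forcing imputation.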