3 research outputs found

    Probabilistic Inference from Arbitrary Uncertainty using Mixtures of Factorized Generalized Gaussians

    Full text link
    This paper presents a general and efficient framework for probabilistic inference and learning from arbitrary uncertain information. It exploits the calculation properties of finite mixture models, conjugate families and factorization. Both the joint probability density of the variables and the likelihood function of the (objective or subjective) observation are approximated by a special mixture model, in such a way that any desired conditional distribution can be directly obtained without numerical integration. We have developed an extended version of the expectation-maximization (EM) algorithm to estimate the parameters of mixture models from uncertain training examples (indirect observations). As a consequence, any piece of exact or uncertain information about both input and output values is consistently handled in the inference and learning stages. This ability, extremely useful in certain situations, is not found in most alternative methods. The proposed framework is formally justified from standard probabilistic principles, and illustrative examples are provided in the fields of nonparametric pattern classification, nonlinear regression and pattern completion. Finally, experiments on a real application and comparative results over standard databases provide empirical evidence of the utility of the method in a wide range of applications.
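    The closed-form conditioning that the abstract relies on is easy to see in code: in a mixture whose components factorize over the variables, observing some variables only reweights the components, and the unobserved variables keep their per-component marginals, so no numerical integration is needed. Below is a minimal Python sketch of this step using plain diagonal-covariance Gaussians instead of the paper's generalized Gaussians; all function and variable names are illustrative, not the authors' implementation.

        import numpy as np

        def gauss_pdf(x, mu, var):
            """Univariate Gaussian density."""
            return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

        def condition_mixture(weights, means, variances, obs_idx, obs_val):
            """Condition a diagonal-Gaussian mixture on x[obs_idx] = obs_val.

            weights   : (K,)   mixing proportions
            means     : (K, D) per-component means
            variances : (K, D) per-component variances (diagonal covariances)
            Returns the reweighted mixture over the remaining dimensions.
            """
            obs_idx = np.asarray(obs_idx)
            keep = np.setdiff1d(np.arange(means.shape[1]), obs_idx)
            # Likelihood of the observation under each component: a product
            # over the observed dimensions, thanks to factorization.
            lik = np.prod(gauss_pdf(obs_val, means[:, obs_idx],
                                    variances[:, obs_idx]), axis=1)
            new_weights = weights * lik
            new_weights /= new_weights.sum()
            return new_weights, means[:, keep], variances[:, keep]

        # Toy usage: a 2-component mixture over (x, y); observe x = 0.2 and
        # read off the conditional mixture over y directly.
        w = np.array([0.5, 0.5])
        mu = np.array([[0.0, 1.0], [2.0, -1.0]])
        var = np.ones((2, 2))
        w_c, mu_c, var_c = condition_mixture(w, mu, var, [0], np.array([0.2]))
        print("conditional weights:", w_c)
        print("conditional mean of y:", np.dot(w_c, mu_c[:, 0]))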

    Heuristic assignment of CPDs for probabilistic inference in junction trees

    Get PDF
    Much research has been done on the efficient computation of probabilistic queries posed to Bayesian networks (BNs). One of the popular architectures for exact inference on BNs is the junction tree (JT) based architecture. Among the JT-based architectures developed, HUGIN is the most efficient, and the Global Propagation (GP) method used in the HUGIN architecture is arguably one of the best methods for probabilistic inference in BNs. Before propagation, an initialization step computes the potential for each cluster in the JT. Then, under the GP method, each cluster potential becomes the cluster marginal through message passing with its neighboring clusters. Many researchers have proposed improvements to make this message propagation more efficient, yet the GP method can still be very slow for dense networks. As BNs are applied to larger, more complex, and more realistic applications, developing more efficient inference algorithms has become increasingly important. Towards this goal, in this paper we present heuristics for initialization that avoid unnecessary message passing among the clusters of the JT and thereby improve the performance of the architecture by passing fewer messages.
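    To make the initialization step concrete, the sketch below shows one plausible assignment heuristic in Python: multiply each CPD into the smallest cluster that covers its family (the child variable plus its parents), so the initialized potentials stay small. This is an illustrative stand-in for the paper's heuristics, not a reproduction of them, and all names are invented for the example.

        from collections import defaultdict

        def assign_cpds(clusters, families):
            """Map each CPD to one covering cluster of the junction tree.

            clusters : list of frozensets of variable names (the JT clusters)
            families : dict mapping a CPD label to the set of variables in
                       its family (the child variable plus its parents)
            Returns a dict: cluster -> list of CPD labels multiplied into it.
            """
            assignment = defaultdict(list)
            for cpd, family in families.items():
                covering = [c for c in clusters if family <= c]
                if not covering:
                    raise ValueError(f"no cluster covers the family of {cpd}")
                # Heuristic: the smallest covering cluster keeps the
                # initialized potential (and later messages) compact.
                assignment[min(covering, key=len)].append(cpd)
            return assignment

        # Toy chain A -> B -> C, compiled into clusters {A,B} and {B,C}.
        clusters = [frozenset("AB"), frozenset("BC")]
        families = {"P(A)": {"A"}, "P(B|A)": {"A", "B"}, "P(C|B)": {"B", "C"}}
        for cluster, cpds in assign_cpds(clusters, families).items():
            print(sorted(cluster), "<-", cpds)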

    CernoCAMAL : a probabilistic computational cognitive architecture

    Get PDF
    This thesis presents one possible way to develop a computational cognitive architecture, dubbed CernoCAMAL, that can be used to govern artificial minds probabilistically. The primary aim of the CernoCAMAL research project is to investigate how its predecessor architecture CAMAL can be extended to reason probabilistically about domain model objects through perception, and how the probability formalism can be integrated into its BDI (Belief-Desire-Intention) model to coalesce a number of mechanisms and processes.

    The motivation and impetus for extending CAMAL and developing CernoCAMAL is the considerable evidence that probabilistic thinking and reasoning is linked to cognitive development and plays a role in cognitive functions such as decision making and learning. This leads us to believe that a probabilistic reasoning capability is an essential part of human intelligence; thus, it should be a vital part of any system that attempts to emulate human intelligence computationally.

    The extensions and augmentations to CAMAL, which are the main contributions of the CernoCAMAL research project, are as follows:

    - The integration of the EBS (Extended Belief Structure), which associates a probability value with every belief statement in order to represent degrees of belief numerically (see the sketch after this list).
    - The inclusion of the CPR (CernoCAMAL Probabilistic Reasoner), which reasons probabilistically over the goal- and task-oriented perceptual feedback generated by reactive sub-systems.
    - The compatibility of the probabilistic BDI model with the affect and motivational models and the affective and motivational valences used throughout CernoCAMAL.

    A succession of experiments in simulation and robotic testbeds is carried out to demonstrate improvements and increased efficacy in CernoCAMAL’s overall cognitive performance. A discussion and critical appraisal of the experimental results, together with a summary, a number of potential future research directions, and some closing remarks, conclude the thesis.
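    As a rough illustration of the EBS idea in the first bullet above, the Python sketch below pairs a belief statement with a numeric degree of belief and revises it by a simple Bayes update. The field names and the update rule are assumptions made for illustration only; the thesis's actual representation may differ.

        from dataclasses import dataclass

        @dataclass
        class Belief:
            statement: str       # e.g. "door_1 is open" (hypothetical)
            probability: float   # degree of belief in [0, 1]

            def update(self, lik_true, lik_false):
                """Bayes update given the likelihood of new evidence under
                the belief being true vs. false."""
                p = self.probability * lik_true
                q = (1.0 - self.probability) * lik_false
                self.probability = p / (p + q)

        b = Belief("door_1 is open", 0.5)
        b.update(lik_true=0.9, lik_false=0.2)  # evidence favors "open"
        print(b)  # probability is now about 0.818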
