
    Bayesian Markov Logic Networks - Bayesian Inference for Statistical Relational Learning

    One of the most important foundational challenges of statistical relational learning is the development of a uniform framework in which learning and logical reasoning are seamlessly integrated. State-of-the-art approaches propose to modify well-known machine learning methods based on parameter optimization (e.g., neural networks and graphical models) in order to take into account structural knowledge expressed by logical constraints. In this paper, we follow an alternative direction, considering the Bayesian approach to machine learning. In particular, given partial knowledge in hybrid domains (i.e., domains that contain relational structure and continuous features) as a set 𝒜 of axioms and a stochastic (in)dependence hypothesis ℱ encoded in a first-order language ℒ, we propose to model it by a probability distribution function (PDF) p(ω ∣ 𝒜, ℱ) over the ℒ-interpretations ω. The stochastic (in)dependence ℱ is represented as a Bayesian Markov Logic Network w.r.t. a parametric undirected graph, interpreted as the PDF. We propose to approximate p(ω ∣ 𝒜, ℱ) by variational inference and show that such an approximation is possible if and only if ℱ satisfies a property called orthogonality. This property can also be achieved by extending ℒ and adjusting 𝒜 and ℱ.
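    The abstract does not give the variational scheme in detail, but the general idea of approximating a PDF defined by a parametric undirected graph can be illustrated on a toy case. The sketch below (all parameter names and values are illustrative assumptions, not taken from the paper) applies mean-field variational inference to a two-variable binary Markov network p(x1, x2) ∝ exp(θ1·x1 + θ2·x2 + w·x1·x2), comparing the factorized approximation against the exact marginal obtained by enumeration:

    ```python
    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    # Illustrative pairwise binary Markov network (parameters are assumptions):
    # p(x1, x2) proportional to exp(theta1*x1 + theta2*x2 + w*x1*x2)
    theta1, theta2, w = 0.5, -0.3, 1.0

    # Exact marginal P(x1 = 1) by brute-force enumeration of the 4 states.
    weights = {
        (x1, x2): math.exp(theta1 * x1 + theta2 * x2 + w * x1 * x2)
        for x1 in (0, 1) for x2 in (0, 1)
    }
    Z = sum(weights.values())
    exact_p1 = (weights[(1, 0)] + weights[(1, 1)]) / Z

    # Mean-field variational approximation: q(x1, x2) = q1(x1) * q2(x2).
    # Coordinate-ascent updates: each factor absorbs the expected
    # contribution of the coupling term under the other factor.
    q1, q2 = 0.5, 0.5
    for _ in range(50):
        q1 = sigmoid(theta1 + w * q2)
        q2 = sigmoid(theta2 + w * q1)

    print(f"exact P(x1=1) = {exact_p1:.4f}, mean-field q1 = {q1:.4f}")
    ```

    With a positive coupling w the factorized family cannot represent the dependence between x1 and x2 exactly, so q1 carries a small bias relative to the exact marginal; this is the usual trade-off of mean-field approximations, and the orthogonality condition mentioned in the abstract concerns when such a factorized approximation is attainable for ℱ.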