Introduction to the special issue on probability, logic and learning
Recently, the combination of probability, logic and learning has received considerable attention in the artificial intelligence and machine learning communities; see e.g. Getoor and Taskar (2007); De Raedt et al. (2008). Computational logic often plays a major role in these developments, since it forms the theoretical backbone for much of the work in probabilistic programming and logical and relational learning. Contemporary work in this area is often application- and experiment-driven, but is also concerned with the theoretical foundations of formalisms and inference procedures and with advanced implementation technology that scales well.
Hierarchical classification for multiple, distributed web databases
The proliferation of online information resources increases the importance of effective and efficient distributed searching. Our research aims to provide an alternative hierarchical categorization and search capability based on a Bayesian network learning algorithm. Our proposed approach, which is grounded on automatic textual analysis of the subject content of online web databases, addresses the database selection problem by first classifying web databases into a hierarchy of topic categories. The experimental results reported demonstrate that such a classification approach not only effectively reduces the class search space, but also significantly improves classification accuracy.
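The search-space reduction described above can be illustrated with a minimal sketch: route a text down a two-level topic hierarchy, scoring only the children of the winning parent. The hierarchy, training snippets, and the simple smoothed unigram scorer are all illustrative assumptions, not the paper's Bayesian network learner.

```python
from collections import Counter
import math

# Hypothetical two-level topic hierarchy; categories and training
# snippets are illustrative, not from the paper's collection.
HIERARCHY = {
    "science": ["biology", "physics"],
    "arts": ["music", "painting"],
}

TRAIN = {
    "science": "experiment theory data measurement cell energy",
    "arts": "melody rhythm canvas colour concerto brush",
    "biology": "cell gene organism protein evolution",
    "physics": "energy quantum particle force relativity",
    "music": "melody rhythm concerto harmony tempo",
    "painting": "canvas colour brush portrait landscape",
}

def log_score(text, category):
    """Additive-smoothed log-likelihood of text under a unigram model."""
    counts = Counter(TRAIN[category].split())
    total = sum(counts.values())
    vocab = len(counts)
    return sum(
        math.log((counts[w] + 1) / (total + vocab))
        for w in text.split()
    )

def classify(text):
    """Route a database description down the hierarchy, scoring only
    the children of the winning top-level category: this is the
    reduced class search space."""
    top = max(HIERARCHY, key=lambda c: log_score(text, c))
    leaf = max(HIERARCHY[top], key=lambda c: log_score(text, c))
    return top, leaf

print(classify("quantum particle energy measurement"))
```

With a deeper hierarchy the saving compounds: each level prunes every subtree except the selected one, so only a root-to-leaf path of categories is ever scored.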
Probabilistic Dynamic Logic of Phenomena and Cognition
The purpose of this paper is to develop further the main concepts of Phenomena Dynamic Logic (P-DL) and Cognitive Dynamic Logic (C-DL), presented in the previous paper. The specific character of these logics lies in matching the vagueness or fuzziness of similarity measures to the uncertainty of models. These logics are based on the following fundamental notions: the generality relation, the uncertainty relation, the simplicity relation, the similarity maximization problem with empirical content, and the enhancement (learning) operator. We develop these notions in terms of logic and probability and construct a Probabilistic Dynamic Logic of Phenomena and Cognition (P-DL-PC) that relates to the scope of probabilistic models of the brain. The effectiveness of the suggested formalization is demonstrated by approximating an expert model of breast cancer diagnostic decisions. The P-DL-PC logic has previously been applied successfully to many practical tasks and to modelling some cognitive processes.
Comment: 6 pages, WCCI 2010 IEEE World Congress on Computational Intelligence, July 18-23, 2010, CCIB, Barcelona, Spain, IJCNN, IEEE Catalog Number: CFP1OUS-DVD, ISBN: 978-1-4244-6917-8, pp. 3361-336
Introduction to the 28th International Conference on Logic Programming Special Issue
We are proud to introduce this special issue of the journal Theory and Practice of Logic Programming (TPLP), dedicated to the full papers accepted for the 28th International Conference on Logic Programming (ICLP). The ICLP meetings started in Marseille in 1982 and have since constituted the main venue for presenting and discussing work in the area of logic programming.
Probabilistic and fuzzy reasoning in simple learning classifier systems
This paper is concerned with the general stimulus-response problem as addressed by a variety of simple learning classifier systems (CSs). We suggest a theoretical model from which the assessment of uncertainty emerges as the primary concern. A number of representation schemes borrowing from fuzzy logic theory are reviewed, and some connections with a well-known neural architecture are revisited. In pursuit of the uncertainty-measuring goal, the use of explicit probability distributions in the action part of classifiers is advocated. Some ideas supporting the design of a hybrid system incorporating Bayesian learning on top of the CS basic algorithm are sketched.
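The idea of an explicit probability distribution in a classifier's action part can be sketched as follows. This is a minimal illustration under assumed names, not the paper's hybrid system: the "Bayesian learning" here is reduced to a crude pseudo-count (Laplace-style) update of the action distribution.

```python
import random

# Minimal sketch: a classifier whose action part is an explicit
# probability distribution over actions, maintained as pseudo-counts
# and updated by reward. All names here are illustrative.
class ProbabilisticClassifier:
    def __init__(self, condition, actions):
        self.condition = condition             # e.g. "1#0", '#' = wildcard
        self.counts = {a: 1 for a in actions}  # uniform Laplace prior

    def matches(self, state):
        return all(c in ("#", s) for c, s in zip(self.condition, state))

    def distribution(self):
        total = sum(self.counts.values())
        return {a: n / total for a, n in self.counts.items()}

    def act(self, rng=random):
        # Sample an action from the explicit distribution.
        actions, weights = zip(*self.counts.items())
        return rng.choices(actions, weights=weights)[0]

    def reinforce(self, action, reward):
        # Pseudo-count update: rewarded actions gain probability mass.
        self.counts[action] += reward

cl = ProbabilisticClassifier("1#0", ["left", "right"])
for _ in range(10):
    cl.reinforce("right", 1)
print(cl.distribution())  # mass has shifted toward the rewarded action
```

Keeping a distribution rather than a single best action is what lets the system report its own uncertainty: a near-uniform distribution signals that the classifier has little evidence either way.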
Bayesian Logic Programs
Bayesian networks provide an elegant formalism for representing and reasoning about uncertainty using probability theory. They are a probabilistic extension of propositional logic and hence inherit some of its limitations, such as the difficulty of representing objects and relations. We introduce a generalization of Bayesian networks, called Bayesian logic programs, to overcome these limitations. To represent objects and relations, it combines Bayesian networks with definite clause logic by establishing a one-to-one mapping between ground atoms and random variables. We show that Bayesian logic programs combine the advantages of both definite clause logic and Bayesian networks, including the separation of the quantitative and qualitative aspects of the model. Furthermore, Bayesian logic programs generalize both Bayesian networks and logic programs. So, many ideas developed …
Comment: 52 pages
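The one-to-one mapping between ground atoms and random variables can be sketched concretely. Assuming a tiny illustrative program (the predicates, constants, and clause encoding below are made up, not the paper's syntax), grounding each clause over the domain yields one network node per ground atom, with the ground body atoms as its parents:

```python
# Sketch of the ground-atom / random-variable correspondence.
# Each clause is (head_predicate, [body_predicates]); unary
# predicates over one shared variable keep the grounding trivial.
constants = ["ann", "bob"]
clauses = [
    ("burglary", []),           # burglary(X).
    ("alarm", ["burglary"]),    # alarm(X) :- burglary(X).
]

# Ground every predicate over the domain: each ground atom becomes
# one random variable; its clause's ground body atoms become parents.
network = {}
for head, body in clauses:
    for c in constants:
        atom = f"{head}({c})"
        network[atom] = [f"{b}({c})" for b in body]

for atom, parents in sorted(network.items()):
    print(atom, "<-", parents)
```

The qualitative/quantitative separation mentioned in the abstract shows up here as the split between this induced graph structure and the conditional probability tables (omitted in the sketch) that would be attached to each clause.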
A Probabilistic Logic Programming Event Calculus
We present a system for recognising human activity given a symbolic representation of video content. The input of our system is a set of time-stamped short-term activities (STA) detected on video frames. The output is a set of recognised long-term activities (LTA), which are pre-defined temporal combinations of STA. The constraints on the STA that, if satisfied, lead to the recognition of an LTA have been expressed using a dialect of the Event Calculus. In order to handle the uncertainty that naturally occurs in human activity recognition, we adapted this dialect to a state-of-the-art probabilistic logic programming framework. We present a detailed evaluation and comparison of the crisp and probabilistic approaches through experimentation on a benchmark dataset of human surveillance videos.
Comment: Accepted for publication in the Theory and Practice of Logic Programming (TPLP) journal
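The STA-to-LTA recognition scheme can be sketched in a few lines. This is an illustrative toy, not the paper's Event Calculus dialect: an invented LTA "fighting" is defined as two "abrupt" STAs by the same person within a short window, and its probability is taken as the product of the STA detection probabilities (an independence assumption in the style of probabilistic logic programming systems).

```python
# Time-stamped short-term activities: (time, activity, person, prob).
# The data and the "fighting" definition are illustrative assumptions.
stas = [
    (1, "abrupt", "p1", 0.7),
    (2, "abrupt", "p1", 0.8),
    (9, "walking", "p1", 0.9),
]

def recognise_fighting(stas, window=3):
    """Recognise the toy LTA 'fighting': two 'abrupt' STAs by the same
    person at most `window` time points apart. The LTA probability is
    the product of the STA probabilities (independence assumed)."""
    hits = []
    for i, (t1, a1, p1, pr1) in enumerate(stas):
        for t2, a2, p2, pr2 in stas[i + 1:]:
            if a1 == a2 == "abrupt" and p1 == p2 and t2 - t1 <= window:
                hits.append(((t1, t2), p1, pr1 * pr2))
    return hits

print(recognise_fighting(stas))  # one window, probability 0.7 * 0.8
```

The crisp variant of the comparison corresponds to thresholding: keep an LTA only if its probability is 1 (or above some cutoff), whereas the probabilistic variant propagates the STA uncertainty into a graded confidence for each recognised LTA.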