1,558 research outputs found

    Evaluation of the Performance of the Markov Blanket Bayesian Classifier Algorithm

    Full text link
    The Markov Blanket Bayesian Classifier is a recently proposed algorithm for the construction of probabilistic classifiers. This paper presents an empirical comparison of the MBBC algorithm with three other Bayesian classifiers: Naive Bayes, Tree-Augmented Naive Bayes and a general Bayesian network. All of these are implemented using the K2 framework of Cooper and Herskovits. The classifiers are compared in terms of their performance (using simple accuracy measures and ROC curves) and speed, on a range of standard benchmark data sets. It is concluded that MBBC is competitive in terms of speed and accuracy with the other algorithms considered.
    Comment: 9 pages. Technical Report No. NUIG-IT-011002, Department of Information Technology, National University of Ireland, Galway (2002).
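As a concrete illustration of the simplest of the compared classifiers, a minimal Naive Bayes implementation for categorical features might look as follows. This is a generic sketch with Laplace-style smoothing; the function and variable names are illustrative and are not taken from the paper's K2-based implementation.

```python
from collections import Counter, defaultdict
import math

def train_naive_bayes(examples):
    """Estimate class priors and per-feature likelihoods, then return a
    classifier. `examples` is a list of (feature_tuple, label) pairs with
    categorical features."""
    priors = Counter(label for _, label in examples)
    counts = defaultdict(Counter)  # (feature_index, label) -> value counts
    for feats, label in examples:
        for i, v in enumerate(feats):
            counts[(i, label)][v] += 1

    def classify(feats):
        best, best_lp = None, float("-inf")
        for label, n in priors.items():
            lp = math.log(n / len(examples))  # log prior
            for i, v in enumerate(feats):
                c = counts[(i, label)]
                # smoothed log likelihood, reserving mass for unseen values
                lp += math.log((c[v] + 1) / (sum(c.values()) + len(c) + 1))
            if lp > best_lp:
                best, best_lp = label, lp
        return best

    return classify
```

The naive independence assumption shows up in the inner loop: each feature contributes its log likelihood independently, conditioned only on the class.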

    A Survey of Monte Carlo Tree Search Methods

    Get PDF
    Monte Carlo tree search (MCTS) is a recently proposed search method that combines the precision of tree search with the generality of random sampling. It has received considerable interest due to its spectacular success in the difficult problem of computer Go, but has also proved beneficial in a range of other domains. This paper is a survey of the literature to date, intended to provide a snapshot of the state of the art after the first five years of MCTS research. We outline the core algorithm's derivation, impart some structure on the many variations and enhancements that have been proposed, and summarize the results from the key game and non-game domains to which MCTS methods have been applied. A number of open research questions indicate that the field is ripe for future work.
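The balance between tree-search precision and sampling generality is usually realized in the selection step via the UCB1 formula (the UCT variant of MCTS). A minimal sketch of that selection rule is shown below; the data layout and names are illustrative assumptions, not code from the survey.

```python
import math

def uct_select(children, exploration=1.41):
    """UCB1 selection: pick the move maximizing mean reward plus an
    exploration bonus. `children` maps move -> (visits, total_reward);
    the parent's visit count is taken as the sum of child visits."""
    parent_visits = sum(v for v, _ in children.values())

    def ucb(stats):
        visits, reward = stats
        if visits == 0:
            return float("inf")  # always expand unvisited moves first
        return reward / visits + exploration * math.sqrt(
            math.log(parent_visits) / visits)

    return max(children, key=lambda m: ucb(children[m]))
```

In a full MCTS loop this rule is applied at every node on the path from the root, followed by a random playout and backpropagation of the result.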

    CGAMES'2009

    Get PDF

    The SEC-system : reuse support for scheduling system development

    Get PDF
    Recently, a Ph.D. study regarding Apot(he)ek, Organization and Management (APOM) was started as a joint cooperation of Stichting VNA, SAL Apotheken, the Faculty of Management and Organization, and the University Centre for Pharmacy, University of Groningen in the Netherlands. The APOM project deals with the structuring and steering of pharmacy organization. Its subject matter is the manageability of the internal pharmacy organization and of its direct environment. The theoretical background of the APOM project is described. A literature study was made to find mixes of objectives. Three mixes of objectives in pharmacy organization are postulated: the product mix, the process mix, and the customer mix. The typology will be used as a basic starting point for the empirical study in the next phase of the APOM project.

    Collaborative computer personalities in the game of chess

    Get PDF
    Computer chess has played a crucial role in Artificial Intelligence research since the creation of the modern computer. It has gained this prominent position due to the large domain that it encompasses, including psychology, philosophy and computer science. The new and innovative techniques initially created for computer chess have often been successfully transferred to other divergent research areas such as theorem provers and economic models. The progress achieved by computers in the game of chess has been illustrated by Deep Blue’s famous victory over Garry Kasparov in 1997. However, further improvements are required if more complex problems are to be solved. In 1999 the Kasparov versus the World match took place over the Internet. The match allowed chess players from around the world to collaborate in a single game of chess against the then world champion, Garry Kasparov. The game was closely fought, with Kasparov coming out on top. One of the most surprising aspects of the contest was the high quality of play achieved by the World team. The World team consisted of players with varying skill and style of play; despite this, they achieved a level of play that was considered better than that of any of its individual members. The purpose of this research is to investigate whether collaboration by different players can be successfully transferred to the domain of computer chess.

    Development and Implementation of a Direct Evaluation Solution for Fault Tree Analyses Competing With Traditional Minimal Cut Sets Methods

    Get PDF
    Fault tree analysis (FTA) is a well-established technique to analyze the safety risks of a system. Two prominent FTA methods, widely applied in the aerospace field, are the so-called minimal cut sets (MCS) method, which uses an approximate evaluation of the problem, and the direct evaluation (DE) of the fault tree, which uses a top-down recursive algorithm. The first approach is only valid for small values of the basic event probabilities and has historically yielded faster results than exact solutions for complex fault trees. The second yields exact solutions at a higher computational cost. This article presents several improvements applied to both approaches in order to upgrade their computing performance. First, improvements to the MCS approach have been made, where the main idea has been to optimize the number of required permutations and to take advantage of the information available from previously solved subsets. Second, improvements to the DE approach have been applied, which reduce the number of recursive calls through a deep search for independent events in the fault tree. This can dramatically reduce the computation time for industrial fault trees with a high number of repeated events. Additional implementation improvements have also been applied regarding hash tables and memory access and usage, as well as the so-called “virtual gates”, which allow an unlimited number of children on each gate. The results presented hereafter are promising, not only because they show that both approaches have been highly optimized compared to the literature, but also because a DE solution has been achieved that can compete in time resources (and obviously in precision) with the MCS approach. These improvements are relevant when considering the industrial, and more specifically the aeronautical, implementation and application of both techniques.
    The author Jordi Pons‐Prats acknowledges the support from the Serra Hunter programme, Generalitat de Catalunya, as well as the support through the Severo Ochoa Centre of Excellence (2019‐2023) under the grant CEX2018‐000797‐S funded by MCIN/AEI/10.13039/501100011033.
    Postprint (author's final draft).
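The approximation underlying the MCS approach is the rare-event bound: the top-event probability is approximated by summing, over all minimal cut sets, the product of the basic-event probabilities. A minimal sketch of that evaluation (names are illustrative; this is not the article's optimized implementation):

```python
def mcs_top_probability(cut_sets, p):
    """Rare-event approximation of the top-event probability.

    `cut_sets` is a list of sets of basic-event names; `p` maps each
    event name to its probability. The sum-of-products is only a good
    approximation when all probabilities are small, which is exactly
    the validity limit of the MCS method noted in the abstract."""
    total = 0.0
    for cs in cut_sets:
        prod = 1.0
        for event in cs:
            prod *= p[event]  # events within a cut set assumed independent
        total += prod
    return total
```

For example, with cut sets {A} and {B, C} and probabilities 0.01, 0.1 and 0.1, the approximation gives 0.01 + 0.01 = 0.02, whereas an exact DE-style evaluation would subtract the (here negligible) intersection term.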

    Accelerated focused crawling through online relevance feedback

    Get PDF
    The organization of HTML into a tag tree structure, which is rendered by browsers as roughly rectangular regions with embedded text and HREF links, greatly helps surfers locate and click on links that best satisfy their information need. Can an automatic program emulate this human behavior and thereby learn to predict the relevance of an unseen HREF target page w.r.t. an information need, based on information limited to the HREF source page? Such a capability would be of great interest in focused crawling and resource discovery, because it can fine-tune the priority of unvisited URLs in the crawl frontier, and reduce the number of irrelevant pages which are fetched and discarded.
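The frontier-prioritization idea can be sketched as a priority queue in which URLs predicted to be more relevant are fetched first. The sketch below stubs out the learned relevance predictor (which, per the abstract, would score an unseen target from its HREF source context); class and method names are assumptions for illustration only.

```python
import heapq

class CrawlFrontier:
    """Priority frontier for a focused crawler: pop the URL with the
    highest predicted relevance first, never enqueueing a URL twice."""

    def __init__(self):
        self._heap = []
        self._seen = set()

    def push(self, url, predicted_relevance):
        if url not in self._seen:
            self._seen.add(url)
            # heapq is a min-heap, so negate to pop highest relevance first
            heapq.heappush(self._heap, (-predicted_relevance, url))

    def pop(self):
        _, url = heapq.heappop(self._heap)
        return url
```

Online relevance feedback would then update the predictor as fetched pages are judged, re-scoring future pushes rather than the whole frontier.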
