    Subsumption Algorithms for Three-Valued Geometric Resolution

    In our implementation of geometric resolution, the most costly operation is subsumption testing (or matching): one has to decide whether a three-valued geometric formula is false in a given interpretation. The formula contains only atoms with variables, equality, and existential quantifiers. The interpretation contains only atoms with constants. Because the atoms have no term structure, matching for geometric resolution is hard. We translate the matching problem into a generalized constraint satisfaction problem and discuss several approaches for solving it efficiently: one direct algorithm and two translations to propositional SAT. After that, we study filtering techniques based on local consistency checking. Such filtering techniques can a priori refute a large percentage of generalized constraint satisfaction problems. Finally, we adapt the matching algorithms in such a way that they find solutions that use a minimal subset of the interpretation. The adaptation can be combined with every matching algorithm. The techniques presented in this paper may have applications in constraint solving independent of geometric resolution.
    Comment: This version was revised on 18.05.201
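
    The matching task described above can be viewed as a small constraint search: every variable of the geometric formula must be mapped to a constant of the interpretation so that each formula atom becomes an atom of the interpretation. The sketch below is not the paper's algorithm (which translates the problem into a generalized constraint satisfaction problem and into SAT); it is only a minimal backtracking matcher over assumed data structures, with atoms written as predicate/argument tuples and the interpretation given as a list of ground atoms.

        # Minimal backtracking matcher (illustrative only): find a substitution
        # that maps the variable atoms of a formula onto ground atoms.
        # Hypothetical representation: an atom is (predicate, arg1, ..., argn);
        # terms starting with an upper-case letter are variables.

        def is_var(term):
            return isinstance(term, str) and term[:1].isupper()

        def match_atom(atom, ground, subst):
            """Try to extend subst so that atom becomes the ground atom."""
            if len(atom) != len(ground) or atom[0] != ground[0]:
                return None
            extended = dict(subst)
            for term, const in zip(atom[1:], ground[1:]):
                if is_var(term):
                    if extended.setdefault(term, const) != const:
                        return None
                elif term != const:
                    return None
            return extended

        def match(atoms, interpretation, subst=None):
            """Backtracking search for a substitution matching all atoms."""
            subst = {} if subst is None else subst
            if not atoms:
                return subst
            for ground in interpretation:
                extended = match_atom(atoms[0], ground, subst)
                if extended is not None:
                    result = match(atoms[1:], interpretation, extended)
                    if result is not None:
                        return result
            return None

        # Example: does p(X, Y), q(Y) match into {p(a, b), q(b), q(c)}?
        print(match([("p", "X", "Y"), ("q", "Y")],
                    [("p", "a", "b"), ("q", "b"), ("q", "c")]))
        # -> {'X': 'a', 'Y': 'b'}

    Filtering based on local consistency, as studied in the paper, aims to refute many such matching problems a priori, before any search of this kind is attempted.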

    Rational physical agent reasoning beyond logic

    The paper addresses the problem of defining a theoretical physical agent framework that satisfies practical requirements of programmability by non-programmer engineers while at the same time permitting fast real-time operation of agents on digital computer networks. The objective of the new framework is to enable the satisfaction of performance requirements on autonomous vehicles and robots in space exploration, deep underwater exploration, defense reconnaissance, automated manufacturing and household automation.

    COLAB : a hybrid knowledge representation and compilation laboratory

    Knowledge bases for real-world domains such as mechanical engineering require expressive and efficient representation and processing tools. We pursue a declarative-compilative approach to knowledge engineering. While Horn logic (as implemented in PROLOG) is well-suited for representing relational clauses, other kinds of declarative knowledge call for hybrid extensions: functional dependencies and higher-order knowledge should be modeled directly. Forward (bottom-up) reasoning should be integrated with backward (top-down) reasoning. Constraint propagation should be used wherever possible instead of search-intensive resolution. Taxonomic knowledge should be classified into an intuitive subsumption hierarchy. Our LISP-based tools provide direct translators of these declarative representations into abstract machines such as an extended Warren Abstract Machine (WAM) and specialized inference engines that are interfaced to each other. More importantly, we provide source-to-source transformers between various knowledge types, both for user convenience and machine efficiency. These formalisms with their translators and transformers have been developed as part of COLAB, a compilation laboratory for studying what we call, respectively, "vertical" and "horizontal" compilation of knowledge, as well as for exploring the synergetic collaboration of the knowledge representation formalisms. A case study in the realm of mechanical engineering has been an important driving force behind the development of COLAB. It will be used as the source of examples throughout the paper when discussing the enhanced formalisms, the hybrid representation architecture, and the compilers.
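
    As a purely illustrative aside to the integration of forward (bottom-up) and backward (top-down) reasoning mentioned above, the following toy sketch runs both directions over propositional Horn clauses. It is not COLAB's WAM-based machinery, and the rule names are invented.

        # Toy forward and backward reasoning over propositional Horn clauses,
        # each written as (head, [body atoms]); facts have an empty body.

        rules = [
            ("gear", []),                               # fact
            ("has_teeth", []),                          # fact
            ("part_of_gearbox", ["gear"]),
            ("needs_milling", ["gear", "has_teeth"]),
        ]

        def forward(rules):
            """Bottom-up: derive all consequences until a fixpoint is reached."""
            known, changed = set(), True
            while changed:
                changed = False
                for head, body in rules:
                    if head not in known and all(b in known for b in body):
                        known.add(head)
                        changed = True
            return known

        def backward(goal, rules, seen=frozenset()):
            """Top-down: prove a single goal, guarding against cyclic rules."""
            if goal in seen:
                return False
            return any(head == goal and
                       all(backward(b, rules, seen | {goal}) for b in body)
                       for head, body in rules)

        print(forward(rules))                     # every derivable atom
        print(backward("needs_milling", rules))   # True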

    Progress Report : 1991 - 1994


    Learning visual docking for non-holonomic autonomous vehicles

    This paper presents a new method of learning visual docking skills for non-holonomic vehicles by direct interaction with the environment. The method is based on a reinforcement learning algorithm, which speeds up Q-learning by applying memory-based sweeping and enforcing the "adjoining property", a filtering mechanism that only allows transitions between states that lie within a fixed distance of each other. The method overcomes some limitations of reinforcement learning techniques when they are employed in applications with continuous non-linear systems, such as car-like vehicles. In particular, a good approximation to the optimal behaviour is obtained with a small look-up table. The algorithm is tested within an image-based visual servoing framework on a docking task. The training time was less than 1 hour on the real vehicle. In experiments, we show the satisfactory performance of the algorithm.
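
    As a rough illustration of the kind of update described above (not the paper's actual controller), the sketch below combines a tabular Q-learning step with a transition filter in the spirit of the "adjoining property" and a replay memory used for memory-based sweeping. The state encoding, distance measure, reward and toy environment are invented placeholders.

        import random
        from collections import defaultdict

        ALPHA, GAMMA, MAX_DIST = 0.5, 0.9, 1        # assumed parameters
        ACTIONS = [-1, +1]

        Q = defaultdict(float)                      # Q[(state, action)]
        memory = []                                 # stored admissible transitions

        def distance(s, s2):
            return abs(s - s2)                      # placeholder state distance

        def q_update(s, a, r, s2):
            best_next = max(Q[(s2, a2)] for a2 in ACTIONS)
            Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])

        def observe(s, a, r, s2):
            if distance(s, s2) > MAX_DIST:
                return                              # reject non-adjoining transition
            memory.append((s, a, r, s2))
            q_update(s, a, r, s2)

        def sweep(passes=5):
            """Memory-based sweeping: replay stored transitions so that
            values propagate through the look-up table faster."""
            for _ in range(passes):
                for s, a, r, s2 in reversed(memory):
                    q_update(s, a, r, s2)

        # Toy usage on a 1-D chain of states 0..3, where state 3 is the goal.
        for episode in range(50):
            s = 0
            for _ in range(10):
                a = random.choice(ACTIONS)
                s2 = max(0, min(3, s + a))
                observe(s, a, 1.0 if s2 == 3 else 0.0, s2)
                s = s2
            sweep()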

    ARC-TEC : acquisition, representation and compilation of technical knowledge

    A global description of an expert system shell for the domain of mechanical engineering is presented. The ARC-TEC project constitutes an AI approach to realizing the CIM (computer-integrated manufacturing) idea. Along with conceptual solutions, it provides a continuous sequence of software tools for the acquisition, representation and compilation of technical knowledge. The shell combines the KADS knowledge-acquisition methodology, the KL-ONE representation theory and the WAM compilation technology. For its evaluation, a prototypical expert system for production planning is developed. A central part of the system is a knowledge base formalizing the relevant aspects of common sense in mechanical engineering. Thus, ARC-TEC is less general than the CYC project but broader than specific expert systems for planning or diagnosis.

    Fuzzy Description Logics with General Concept Inclusions

    Description logics (DLs) are used to represent knowledge of an application domain and provide standard reasoning services to infer consequences of this knowledge. However, classical DLs are not suited to represent vagueness in the description of the knowledge. We consider a combination of DLs and Fuzzy Logics to address this task. In particular, we consider the t-norm-based semantics for fuzzy DLs introduced by Hájek in 2005. Since then, many tableau algorithms have been developed for reasoning in fuzzy DLs. Another popular approach is to reduce fuzzy ontologies to classical ones and use existing highly optimized classical reasoners to deal with them. However, a systematic study of the computational complexity of the different reasoning problems is so far missing from the literature on fuzzy DLs. Recently, some of the developed tableau algorithms have been shown to be incorrect in the presence of general concept inclusion axioms (GCIs). In some fuzzy DLs, reasoning with GCIs has even turned out to be undecidable. This work provides a rigorous analysis of the boundary between decidable and undecidable reasoning problems in t-norm-based fuzzy DLs, in particular for GCIs. Existing undecidability proofs are extended to cover large classes of fuzzy DLs, and decidability is shown for most of the remaining logics considered here. Additionally, the computational complexity of reasoning in fuzzy DLs with semantics based on finite lattices is analyzed. For most decidability results, tight complexity bounds can be derived.
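
    The t-norm-based semantics referred to above can be made concrete with a small example (an illustration of the semantics only, not of the thesis's decision procedures): a GCI C ⊑ D holds to the degree given by the infimum, over all individuals x, of C(x) ⇒ D(x), where ⇒ is the residuum of the chosen t-norm. The membership degrees below are invented.

        # Three common t-norms and their residua on the interval [0, 1].
        def goedel(x, y):       return min(x, y)
        def product(x, y):      return x * y
        def lukasiewicz(x, y):  return max(0.0, x + y - 1.0)

        def residuum(tnorm, x, y):
            """Greatest z with tnorm(x, z) <= y, for the three t-norms above."""
            if x <= y:
                return 1.0
            if tnorm is goedel:
                return y
            if tnorm is product:
                return y / x
            return min(1.0, 1.0 - x + y)            # Lukasiewicz residuum

        # Invented fuzzy membership degrees for two concepts C and D.
        C = {"ann": 0.8, "bob": 0.4}
        D = {"ann": 0.6, "bob": 0.9}

        def gci_degree(tnorm):
            """Degree to which the GCI C ⊑ D holds under the given t-norm."""
            return min(residuum(tnorm, C[x], D[x]) for x in C)

        for t in (goedel, product, lukasiewicz):
            print(t.__name__, gci_degree(t))        # roughly 0.6, 0.75 and 0.8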

    Capture and Maintenance of Constraints in Engineering Design

    The thesis investigates two domains, initially the kite domain and then part of a more demanding Rolls-Royce domain (jet engine design). Four main types of refinement rules that use the associated application conditions and domain ontology to support the maintenance of constraints are proposed. The refinement rules have been implemented in ConEditor and the extended system is known as ConEditor+. With the help of ConEditor+, the thesis demonstrates that an explicit representation of application conditions together with the corresponding constraints and the domain ontology can be used to detect inconsistencies, redundancy, subsumption and fusion, reduce the number of spurious inconsistencies and prevent the identification of inappropriate refinements of redundancy, subsumption and fusion between pairs of constraints.

    Towards Next Generation Sequential and Parallel SAT Solvers

    This thesis focuses on improving SAT solving technology. The improvements address two major subjects: sequential SAT solving and parallel SAT solving. To better understand sequential SAT algorithms, the abstract reduction system Generic CDCL is introduced. With Generic CDCL, the soundness of solving techniques can be modeled. Next, the conflict-driven clause learning algorithm is extended with three techniques, local look-ahead, local probing and all-UIP learning, that allow more global reasoning during search. These techniques improve the performance of the sequential SAT solver Riss. Then, the formula simplification techniques bounded variable addition, covered literal elimination and an advanced cardinality constraint extraction are introduced. By using these techniques, the reasoning of the overall SAT solving tool chain becomes stronger than plain resolution. When these three techniques are applied in the formula simplification tool Coprocessor before Riss is used to solve a formula, the performance improves further. Due to the increasing number of cores in CPUs, the scalable parallel SAT solving approach iterative partitioning has been implemented in Pcasso for the multi-core architecture. Related work on parallel SAT solving has been studied to extract the main ideas that can improve Pcasso. Besides parallel formula simplification with bounded variable elimination, the major extension is the level-based clause tagging used for extended clause sharing, which builds the basis for conflict-driven node killing. The latter makes it possible to better identify unsatisfiable search space partitions. Another improvement is to combine scattering and look-ahead into a superior search space partitioning function. In combination with Coprocessor, the introduced extensions increase the performance of the parallel solver Pcasso. The implemented system turns out to be scalable for the multi-core architecture; hence iterative partitioning is interesting for future parallel SAT solvers. The implemented solvers participated in international SAT competitions. In 2013 and 2014, Pcasso showed good performance. Riss in combination with Coprocessor won several first, second and third prizes, including two Kurt Gödel Medals. Hence, the introduced algorithms improved modern SAT solving technology.
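
    To make one of the preprocessing techniques named above concrete, the sketch below shows plain bounded variable elimination on a CNF formula: a variable is removed by replacing the clauses that contain it with their resolvents, but only if this does not increase the number of clauses. This is a generic textbook rendering under an assumed clause representation (frozensets of non-zero integer literals, DIMACS style), not the code of Coprocessor or Pcasso.

        # Bounded variable elimination sketch (generic, not Coprocessor's code).
        # A clause is a frozenset of non-zero integers; -v negates variable v.

        def resolve(c1, c2, var):
            """Resolvent of c1 (containing var) and c2 (containing -var);
            returns None if the resolvent is a tautology."""
            res = (c1 - {var}) | (c2 - {-var})
            return None if any(-lit in res for lit in res) else res

        def eliminate(clauses, var):
            """Remove var by resolution, but only if the formula does not grow."""
            pos = [c for c in clauses if var in c]
            neg = [c for c in clauses if -var in c]
            rest = [c for c in clauses if var not in c and -var not in c]
            resolvents = [r for c1 in pos for c2 in neg
                          if (r := resolve(c1, c2, var)) is not None]
            if len(resolvents) <= len(pos) + len(neg):   # the "bounded" condition
                return rest + resolvents
            return clauses

        # (x1 or x2) and (-x1 or x3): eliminating x1 leaves the single
        # equisatisfiable clause (x2 or x3).
        cnf = [frozenset({1, 2}), frozenset({-1, 3})]
        print(eliminate(cnf, 1))                         # [frozenset({2, 3})]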