
    Solving Symbolic Equations with PRESS

    We outline a program, PRESS (PRolog Equation Solving System), for solving symbolic, transcendental, non-differential equations. The methods used for solving equations are described, together with the service facilities. The principal technique, meta-level inference, appears to have applications in the broader field of symbolic and algebraic manipulation.
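The abstract's central technique, solving an equation by reasoning about which method applies, can be hinted at with a small sketch. The following Python is purely illustrative: the term representation, the `isolate` function and the rule table are our own inventions, not the actual PRESS code, which works at the meta-level in Prolog.

```python
import math

# Hypothetical sketch of PRESS-style "isolation": when the unknown occurs
# exactly once, repeatedly strip the outermost function applied to it by
# applying that function's inverse to the other side of the equation.

# An equation lhs = rhs; lhs is a nested term over the unknown 'x',
# e.g. ('log', ('plus_const', 3.0, 'x')) meaning log(x + 3).
INVERSES = {
    'log':  math.exp,                # log(u) = v  =>  u = exp(v)
    'exp':  math.log,                # exp(u) = v  =>  u = log(v)
    'sqrt': lambda v: v * v,        # sqrt(u) = v =>  u = v^2
}

def isolate(lhs, rhs):
    """Solve lhs = rhs for 'x' by peeling off outer functions."""
    while lhs != 'x':
        if isinstance(lhs, tuple) and lhs[0] in INVERSES:
            op, arg = lhs
            lhs, rhs = arg, INVERSES[op](rhs)
        elif isinstance(lhs, tuple) and lhs[0] == 'plus_const':
            _, c, arg = lhs
            lhs, rhs = arg, rhs - c  # u + c = v  =>  u = v - c
        else:
            raise ValueError('no isolation rule applies')
    return rhs

# log(x + 3) = 2  =>  x = exp(2) - 3
x = isolate(('log', ('plus_const', 3.0, 'x')), 2.0)
```

In PRESS itself, the meta-level inference layer first decides that isolation (rather than, say, collection or attraction) is the appropriate method; the sketch shows only the object-level rewriting that such a decision would trigger.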

    The use of proof plans to sum series

    We describe a program for finding closed form solutions to finite sums. The program was built to test the applicability of the proof planning search control technique in a domain of mathematics outwith induction. This experiment was successful. The series summing program extends previous work in this area and was built in a short time just by providing new series summing methods to our existing inductive theorem proving system CLAM. One surprising discovery was the usefulness of the ripple tactic in summing series. Rippling is the key tactic for controlling inductive proofs, and was previously thought to be specialised to such proofs. However, it turns out to be the key sub-tactic used by all the main tactics for summing series. The only change required was that it had to be supplemented by a difference matching algorithm to set up some initial meta-level annotations to guide the rippling process. In inductive proofs these annotations are provided by the application of mathematical induction. This evidence suggests that rippling, supplemented by difference matching, will find wide application in controlling mathematical proofs. The research reported in this paper was supported by SERC grant GR/F/71799, a SERC Postdoctoral Fellowship to the first author and a SERC Senior Fellowship to the third author. We would like to thank the other members of the mathematical reasoning group for their feedback on this project.
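The induction-flavoured reasoning behind summing a series can be illustrated with a toy verifier. This is not the CLAM machinery: the checker below, the bound, and the example series are illustrative choices. It captures only the proof obligation that a candidate closed form must satisfy, namely the base case and the step equation that rippling would be used to prove.

```python
# Toy check in the spirit of proof plans for summing series: a candidate
# closed form F(n) for sum_{k=0}^{n} f(k) is accepted if the base case
# F(0) = f(0) holds and the step F(n) - F(n - 1) = f(n) holds.
# (A real prover establishes the step symbolically; we test it at points.)

def verify_closed_form(f, F, upto=50):
    if F(0) != f(0):
        return False
    return all(F(n) - F(n - 1) == f(n) for n in range(1, upto + 1))

# sum_{k=0}^{n} k has the closed form n(n + 1)/2
ok = verify_closed_form(lambda k: k, lambda n: n * (n + 1) // 2)
```

The step equation `F(n) - F(n - 1) = f(n)` is exactly where difference matching earns its keep: it annotates the difference between the two sides so that rippling can move the mismatch outward and cancel it.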

    A Subsumption Architecture for Theorem Proving?

    Brooks has criticized traditional approaches to artificial intelligence as too inefficient. In particular, he has singled out techniques involving search as inadequate to achieve the fast reaction times required by robots and other AI products that need to work in the real world. Instead he proposes the subsumption architecture as an overall organizing principle. This consists of layers of behavioural modules, each of which is capable of carrying out a complete (usually simple) task. He has employed this architecture to build a series of simple mobile robots, but he claims that it is appropriate for all AI products. Brooks's proposal is usually seen as an example of nouvelle AI, in contrast to good old-fashioned AI (GOFAI). Automatic theorem proving is the archetypal example of GOFAI. The resolution theorem proving technique once served as the engine of AI. Of all areas of AI it seems the most difficult to implement using Brooks's ideas. It would thus serve as a keen test of Brooks's proposal to explore to what extent the task of theorem proving can be achieved by a subsumption architecture. Tactics are programs for guiding a theorem prover. They were introduced as an efficient alternative to search-based techniques. In this paper I compare recent work on tactic-based theorem proving with Brooks's proposals and show that, surprisingly, there is a similarity between them. It thus seems that the distinction between nouvelle AI and GOFAI is not so great as is sometimes claimed. However, this exercise also identifies some criticisms of Brooks's proposal.
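The claimed similarity can be made concrete with a deliberately tiny sketch: tactics arranged like subsumption layers, each a behavioural module that either handles a goal or defers to the layer below. Everything here (the layer names, the string goals, the toy rewrites) is invented for illustration; it is not drawn from the paper.

```python
# Illustrative only: a "subsumption stack" of tactics. Each layer either
# disposes of the goal itself or hands a transformed goal downward.

def simplify(goal):
    # lowest layer: recognise the trivially true goal
    return 'proved' if goal == '0 = 0' else None

def rewrite(goal):
    # middle layer: one hard-coded rewrite, x + 0 -> x, then defer down
    return simplify(goal.replace(' + 0', ''))

def induction(goal):
    # top layer: strip a toy quantifier marker, then defer downward
    if goal.startswith('forall n. '):
        goal = goal[len('forall n. '):]
    return rewrite(goal)

LAYERS = [induction, rewrite, simplify]  # highest priority first

def prove(goal):
    for layer in LAYERS:
        result = layer(goal)
        if result is not None:
            return result
    return 'stuck'

outcome = prove('forall n. 0 + 0 = 0')
```

Each layer carries out a complete (simple) task and the stack degrades gracefully when upper layers do not apply, which is the structural parallel to Brooks's behavioural modules.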

    The design and implementation of a relational programming system.

    The declarative class of computer languages consists mainly of two paradigms - the logic and the functional. Much research has been devoted in recent years to the integration of the two, with the aim of securing the advantages of both without retaining their disadvantages. To date this research has, arguably, been less fruitful than initially hoped. A large number of composite functional/logical languages have been proposed but have generally been marred by the lack of a firm, cohesive, mathematical basis. More recently new declarative paradigms, equational and constraint languages, have been advocated. These, however, do not fully encompass those features we perceive as being central to functional and logic languages. The crucial functional features are higher-order definitions, static polymorphic typing, applicative expressions and laziness. The crucial logic features are the ability to reason about both functional and non-functional relationships and to handle computations involving search. This thesis advocates a new declarative paradigm which lies midway between functional and logic languages - the so-called relational paradigm. In a relational language, program and data alike are denoted by relations. All expressions are relations constructed from simpler expressions using operators which form a relational algebra. The impetus for the use of relations in a declarative language comes from observations concerning their connection to functional and logic programming. Relations are mathematically more general than functions, modelling non-functional as well as functional relationships. They also form the basis of many logic languages, for example, Prolog. This thesis proposes a new relational language based entirely on binary relations, named Drusilla. We demonstrate the functional and logic aspects of Drusilla. It retains the higher-order objects and polymorphism found in modern functional languages but handles non-determinism and models relationships between objects in the manner of a logic language, with the notion of algorithm being composed of logic and control elements. Different programming styles - functional, logic and relational - are illustrated. However, such expressive power does not come for free; it has associated with it a high cost of implementation. Two main techniques are used in the necessarily complex language interpreter. A type inference system checks programs to ensure they are meaningful and simultaneously performs automatic representation selection for relations. A symbolic manipulation system transforms programs to improve the efficiency of expressions and to increase the number of possible representations for relations while preserving program meaning.
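The idea of building everything from binary relations and a small algebra of operators can be sketched very simply. The operators and examples below are our own minimal choices, not Drusilla's actual operator set or syntax.

```python
# Minimal sketch of a binary-relation algebra: a relation is a set of
# pairs, and programs are built by combining relations with operators.

def compose(r, s):
    """Relational composition: {(a, c) | (a, b) in r and (b, c) in s}."""
    return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

def converse(r):
    """Swap each pair, modelling running a relation 'backwards'."""
    return {(b, a) for (a, b) in r}

# Data and "program" alike are relations:
parent = {('alice', 'bob'), ('bob', 'carol')}
grandparent = compose(parent, parent)   # derived relation
child = converse(parent)                # non-functional direction for free
```

Note how `converse` gives the non-functional direction of a relation at no extra cost, which is precisely the generality over functions that the abstract appeals to.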

    Predicting the approximate functional behaviour of physical systems

    This dissertation addresses the problem of the computer prediction of the approximate behaviour of physical systems describable by ordinary differential equations. Previous approaches to behavioural prediction have either focused on an exact mathematical description or on a qualitative account. We advocate a middle ground: a representation more coarse than an exact mathematical solution yet more specific than a qualitative one. What is required is a mathematical expression, simpler than the exact solution, whose qualitative features mirror those of the actual solution and whose functional form captures the principal parameter relationships underlying the behaviour of the real system. We term such a representation an approximate functional solution. Approximate functional solutions are superior to qualitative descriptions because they reveal specific functional relationships, restore a quantitative time scale to a process and support more sophisticated comparative analysis queries. Moreover, they can be superior to exact mathematical solutions by emphasizing comprehensibility, adequacy and practical utility over precision. Two strategies for constructing approximate functional solutions are proposed. The first abstracts the original equation, predicts behaviour in the abstraction space and maps this back to the approximate functional level. Specifically, analytic abduction exploits qualitative simulation to predict the qualitative properties of the solution and uses this knowledge to guide the selection of a parameterized trial function, which is then tuned with respect to the differential equation. In order to limit the complexity of a proposed approximate functional solution, and hence maintain its comprehensibility, back-of-the-envelope reasoning is used to simplify overly complex expressions in a magnitude extreme. If no function is recognised which matches the predicted behaviour, segment calculus is called upon to find a composite function built from known primitives and a set of operators. At the very least, segment calculus identifies a plausible structure for the form of the solution (e.g. that it is a composition of two unknown functions). Equation parsing capitalizes on this partial information to look for a set of termwise interactions which, when interpreted, expose a particular solution of the equation. The second, and more direct, strategy for constructing an approximate functional solution is embodied in the closed form approximation technique. This extends approximation methods to equations which lack a closed form solution. It involves solving the differential equation exactly, as an infinite series, and obtaining an approximate functional solution by constructing a closed form function whose Taylor series is close to that of the exact solution. The above techniques dovetail together to achieve a style of reasoning closer to that of an engineer or physicist than to that of a mathematician, the key difference being the sacrifice of the goal of finding the correct solution of the differential equation in favour of an approximation which is adequate for the purpose to which the knowledge will be put. Applications to Intelligent Tutoring and Design Support Systems are suggested.
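The closed form approximation step, matching the Taylor series of a candidate closed form against a series solution of the equation, can be hinted at with a small sketch. The ODE (y' = y, y(0) = 1), the candidate exp(t), and the comparison function are illustrative choices of ours; the dissertation's machinery is far more general.

```python
import math

# Sketch of the closed-form-approximation idea: derive the power-series
# solution of an ODE, then accept a candidate closed form if its Taylor
# coefficients at 0 track those of the series.

def series_coeffs_y_prime_eq_y(n):
    # y' = y with y(0) = 1 gives the recurrence a_{k+1} = a_k / (k + 1),
    # hence a_k = 1 / k!.
    coeffs = [1.0]
    for k in range(n - 1):
        coeffs.append(coeffs[-1] / (k + 1))
    return coeffs

def taylor_coeffs_exp(n):
    # Taylor coefficients of the candidate closed form exp(t) at t = 0
    return [1.0 / math.factorial(k) for k in range(n)]

def coeffs_close(a, b, tol=1e-9):
    return all(abs(x - y) <= tol for x, y in zip(a, b))

match = coeffs_close(series_coeffs_y_prime_eq_y(8), taylor_coeffs_exp(8))
```

Here the candidate happens to be exact; the interesting cases in the dissertation are those where no exact closed form exists and the match is only required to be close.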

    Proof Planning for Automating Hardware Verification

    Centre for Intelligent Systems and their Applications. In this thesis we investigate the applicability of proof planning to automate the verification of hardware systems. Proof planning is a meta-level reasoning technique which captures patterns of proof common to a family of theorems. It contributes to the automation of proof by incorporating and extending heuristics found in the Nqthm theorem prover and using them to guide a tactic-based theorem prover in the search for a proof. We have addressed the automation of proof for hardware verification from a proof planning perspective, and have applied the strategies and search control mechanisms of proof planning to generate automatically customised tactics which prove conjectures about the correctness of many types of circuits. The contributions of this research can be summarised as follows: (1) we show by experimentation the applicability of the proof planning ideas to verify hardware designs automatically; (2) we develop and use a methodology based on the concept of proof engineering, using proof planning to verify various combinational and sequential circuits, which include arithmetic circuits (adders, subtracters, multipliers, dividers, factorials), data-path components (arithmetic logic units, shifters, processing units) and a simple microprocessor system; and (3) we contribute to the profiling of the Clam proof planning system by improving its robustness and efficiency in handling large terms and proofs. In verifying hardware, the user formalises a problem by writing the specification, the implementation and the conjecture, using a logic language, and asks Clam to compose a tactic to prove the conjecture. This tactic is then executed by the Oyster prover. To compose a tactic, Clam uses AI planning mechanisms and a set of methods which implement the heuristics that specify general-purpose tactics. Search is controlled by a type of annotated rewriting called rippling, which controls the selective application of rewrites called wave rules. We have extended some of Clam's methods to verify circuits. The proofs were orders of magnitude larger than those that had been attempted before with proof planning, and are comparable with similar verification proofs obtained by other systems, but use fewer lemmas and less interaction. Proof engineering refers to the application of formal proof to system design and verification. We propose a proof engineering methodology which consists of partitioning the automation of formal proof into three different kinds of tasks: user, proof and systems tasks. User tasks have to do with formalising a particular verification problem and using a formal tool to obtain a proof. Proof tasks refer to the tuning of proof techniques (e.g. methods and tactics) to help obtain a proof. Systems tasks have to do with the modification of a formal tool system. By making this distinction explicit, proof development is more manageable. We conjecture that our approach is widely applicable and can be integrated into formal verification environments to improve automation facilities, and be utilised to verify commercial and safety-critical hardware systems in industrial settings.
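Rippling, the search control the abstract leans on, can be caricatured in a few lines: wave rules are rewrites applied only while they move a marked "wave front" outward. The string encoding, the single rule, and the annotation scheme below are invented simplifications, nothing like Clam's real term structures.

```python
# Crude sketch of rippling as selective rewriting on strings. A wave
# rule fires only where its left-hand side occurs; rewriting stops when
# no rule applies, i.e. the goal is fully "rippled out".

WAVE_RULES = [
    # s(X) + Y -> s(X + Y): moves the successor (the wave front) outward
    ('s(a) + b', 's(a + b)'),
]

def ripple(goal, rules, max_steps=10):
    for _ in range(max_steps):
        for lhs, rhs in rules:
            if lhs in goal:
                goal = goal.replace(lhs, rhs, 1)
                break
        else:
            return goal        # no rule applies: fully rippled
    return goal

result = ripple('s(a) + b = s(c)', WAVE_RULES)
```

The point of the annotation in real rippling is termination and direction: only rewrites that provably move the difference toward the induction hypothesis are allowed, which is what keeps the search so tightly controlled.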

    Investigation of design and execution alternatives for the committed choice non-deterministic logic languages

    The general area of developing, applying and studying new and parallel models of computation is motivated by a need to overcome the limits of current von Neumann based architectures. A key area of research in understanding how new technology can be applied to AI problem solving is the use of logic languages. Logic programming languages provide a procedural interpretation for sentences of first order logic, mainly using a class of sentence called Horn clauses. Horn clauses are open to a wide variety of parallel evaluation models, giving possible speed-ups and alternative parallel models of execution. The research in this thesis is concerned with investigating one class of parallel logic language known as the Committed Choice Non-Deterministic (CCND) languages. The investigation considers the inherent parallel behaviour of AI programs implemented in the CCND languages and the effect of various alternatives open to language implementors and designers. This is achieved by considering how various AI programming techniques map to alternative language designs, and the behaviour of these AI programs on alternative implementations of these languages. The aim of this work is to investigate how AI programming techniques are affected (qualitatively and quantitatively) by particular language features. The qualitative evaluation is a consideration of how AI programs can be mapped to the various CCND languages. The applications considered are general search algorithms (which focus on the committed choice nature of the languages); chart parsing (which focuses on the differences between safe and unsafe languages); and meta-level inference (which focuses on the difference between deep and flat languages). The quantitative evaluation considers the inherent parallel behaviour of the resulting programs and the effect of possible implementation alternatives on this inherent behaviour. To carry out this quantitative evaluation we have implemented a system which improves on the current interpreter based evaluation systems. The new system has an improved model of execution and allows severa
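The defining feature of the CCND languages, committing to one clause whose guard succeeds and never backtracking, can be sketched sequentially. The clause encoding and the `max/3`-style example are our own illustration; real CCND languages evaluate guards concurrently.

```python
# Sketch of committed-choice clause selection: among clauses whose guard
# succeeds, the evaluator commits to one and discards the alternatives.
# (Sequential stand-in for what CCND languages do with parallel guards.)

def solve(goal, clauses):
    for guard, body in clauses:
        if guard(goal):           # guard evaluation
            return body(goal)     # commit: no backtracking afterwards
    return None                   # no guard succeeds: suspend/fail

# A max(X, Y, Z)-style predicate written in committed-choice style:
max_clauses = [
    (lambda g: g[0] >= g[1], lambda g: g[0]),
    (lambda g: g[1] >= g[0], lambda g: g[1]),
]

z = solve((3, 7), max_clauses)
```

When both guards succeed (X = Y) either clause may be chosen, and the commitment is harmless; the thesis's interest is in programs, such as search, where giving up the discarded alternatives actually matters.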