
    A Divergence Critic for Inductive Proof

    Inductive theorem provers often diverge. This paper describes a simple critic, a computer program which monitors the construction of inductive proofs, attempting to identify diverging proof attempts. Divergence is recognized by means of a "difference matching" procedure. The critic then proposes lemmas and generalizations which "ripple" these differences away so that the proof can go through without divergence. The critic enables the theorem prover Spike to prove many theorems completely automatically from the definitions alone. Comment: See http://www.jair.org/ for any accompanying file.
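
    The abstract's central idea, detecting divergence by difference matching, can be caricatured in a few lines of code. The sketch below is only a toy illustration, not Walsh's actual critic (which works on annotated terms and rippling): it checks whether each subgoal in a sequence embeds its predecessor inside a non-trivial context, the accumulating difference that signals divergence. The term encoding, the helper names and the example subgoal sequence are all hypothetical.

        # Terms are nested tuples: ('s', 'x') stands for s(x); 'x' is a variable.

        def contexts(term):
            """Yield (rebuild, subterm) pairs: rebuild(hole) reconstructs `term`
            with `subterm` replaced by `hole`."""
            yield (lambda hole: hole), term
            if isinstance(term, tuple):
                head, *args = term
                for i, arg in enumerate(args):
                    for ctx, sub in contexts(arg):
                        def rebuild(hole, i=i, ctx=ctx):
                            new_args = list(args)
                            new_args[i] = ctx(hole)
                            return (head, *new_args)
                        yield rebuild, sub

        def difference(small, big):
            """Return a context C with C[small] == big, or None if none exists."""
            for ctx, sub in contexts(big):
                if sub == small:
                    return ctx
            return None

        def diverging(subgoals):
            """Heuristic divergence test: every subgoal wraps its predecessor
            in some non-trivial context (the accumulating difference)."""
            return all(
                (d := difference(a, b)) is not None and d('*') != '*'
                for a, b in zip(subgoals, subgoals[1:])
            )

        # Hypothetical subgoal sequence produced by repeatedly unfolding a
        # definition: x, s(x), s(s(x)), ... -- the successor context accumulates.
        goals = ['x', ('s', 'x'), ('s', ('s', 'x')), ('s', ('s', ('s', 'x')))]
        print(diverging(goals))                          # True
        print(diverging(['x', ('s', 'x'), ('p', 'x')]))  # False: no common embedding

    In the real critic the repeated context found this way is then used to speculate the lemma or generalization that ripples the difference away.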

    Higher Order Unification via Explicit Substitutions

    Higher order unification is equational unification for βη-conversion. But it is not first order equational unification, as substitution has to avoid capture. Thus, the methods for equational unification (such as narrowing) built upon grafting (i.e., substitution without renaming) cannot be used for higher order unification, which needs specific algorithms. Our goal in this paper is to reduce higher order unification to first order equational unification in a suitable theory. This is achieved by replacing substitution by grafting, but this replacement is not straightforward, as it raises two major problems. First, some unification problems have solutions with grafting but no solution with substitution. Second, equational unification algorithms rest upon the fact that grafting and reduction commute, but grafting and βη-reduction do not commute in λ-calculus, and reducing an equation may change the set of its solutions. This difficulty comes from the interaction between the substitutions initiated by βη-reduction and the ones initiated by the unification process. Two kinds of variables are involved: those of βη-conversion and those of unification. So we need to set up a calculus which distinguishes between these two kinds of variables and in which reduction and grafting commute. For this purpose, the application of a substitution of a reduction variable to a unification one must be delayed until this variable is instantiated. Such a separation and delay are provided by a calculus of explicit substitutions. Unification in such a calculus can be performed by well-known algorithms such as narrowing, but we present a specialised algorithm for greater efficiency. Finally, we show how to relate unification in λ-calculus and in a calculus with explicit substitutions. Thus we come up with a new higher order unification algorithm which eliminates some burdens of the previous algorithms, in particular the functional handling of scopes. Huet's algorithm can be seen as a specific strategy for our algorithm, since each of its steps can be decomposed into elementary ones, leading to a more atomic description of the unification process. Also, solved forms in λ-calculus can easily be computed from solved forms in λσ-calculus.
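
    The distinction the abstract turns on, that grafting (substitution without renaming) and capture-avoiding substitution behave differently, can be illustrated with a small sketch. This is not the λσ-calculus of the paper, only an assumed toy encoding of named lambda terms in which metavariables are ordinary variable nodes; the names graft and substitute are illustrative.

        # Toy named lambda terms: ('var', x), ('lam', x, body), ('app', f, a).
        # "Grafting" plugs a term in for a (meta)variable with no renaming, so a
        # free variable of the replacement can be captured by an enclosing binder;
        # capture-avoiding substitution renames the binder first.
        import itertools

        fresh = (f"v{i}" for i in itertools.count())

        def free_vars(t):
            tag = t[0]
            if tag == 'var':
                return {t[1]}
            if tag == 'lam':
                return free_vars(t[2]) - {t[1]}
            return free_vars(t[1]) | free_vars(t[2])

        def graft(t, meta, replacement):
            """Replace `meta` by `replacement` with no renaming at all."""
            tag = t[0]
            if tag == 'var':
                return replacement if t[1] == meta else t
            if tag == 'lam':
                return ('lam', t[1], graft(t[2], meta, replacement))
            return ('app', graft(t[1], meta, replacement),
                           graft(t[2], meta, replacement))

        def substitute(t, meta, replacement):
            """Capture-avoiding substitution: rename binders that would capture."""
            tag = t[0]
            if tag == 'var':
                return replacement if t[1] == meta else t
            if tag == 'app':
                return ('app', substitute(t[1], meta, replacement),
                               substitute(t[2], meta, replacement))
            x, body = t[1], t[2]
            if x == meta:                          # binder shadows the variable
                return t
            if x in free_vars(replacement):        # binder would capture: rename it
                y = next(fresh)
                body = substitute(body, x, ('var', y))
                x = y
            return ('lam', x, substitute(body, meta, replacement))

        # In lam x. X, instantiating the metavariable X with the object variable x:
        term = ('lam', 'x', ('var', 'X'))
        print(graft(term, 'X', ('var', 'x')))       # ('lam', 'x', ('var', 'x'))  -- x captured
        print(substitute(term, 'X', ('var', 'x')))  # ('lam', 'v0', ('var', 'x')) -- x stays free

    The paper resolves this tension differently: both kinds of variables live in a calculus of explicit substitutions, where reduction and grafting commute and first order equational unification techniques such as narrowing apply.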

    Natural Deduction as Higher-Order Resolution

    An interactive theorem prover, Isabelle, is under development. In LCF, each inference rule is represented by one function for forwards proof and another (a tactic) for backwards proof. In Isabelle, each inference rule is represented by a Horn clause. Resolution gives both forwards and backwards proof, supporting a large class of logics. Isabelle has been used to prove theorems in Martin-Löf's Constructive Type Theory. Quantifiers pose several difficulties: substitution, bound variables, Skolemization. Isabelle's representation of logical syntax is the typed lambda-calculus, requiring higher-order unification. It may have potential for logic programming. Depth-first subgoaling along inference rules constitutes a higher-order Prolog.
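
    As a rough, assumed illustration of "each inference rule is represented by a Horn clause" and of depth-first subgoaling along inference rules, the sketch below does backward chaining over propositional rules using plain equality rather than the higher-order unification over typed lambda terms that Isabelle actually needs; the rule set is hypothetical.

        # Each rule is (name, premises, conclusion): from the premises, infer the
        # conclusion.  Backward proof matches a goal against a conclusion and
        # recursively proves the premises, depth first -- a propositional
        # caricature of subgoaling along inference rules.

        RULES = [
            ("conjI", ["A", "B"], "A & B"),   # hypothetical toy rule set
            ("axA",   [],         "A"),
            ("axB",   [],         "B"),
        ]

        def prove(goal, depth=0):
            """Depth-first backward chaining; returns a proof tree or None."""
            for name, premises, conclusion in RULES:
                if conclusion != goal:        # no unification here, just equality
                    continue
                subproofs = []
                for premise in premises:
                    sub = prove(premise, depth + 1)
                    if sub is None:
                        break
                    subproofs.append(sub)
                else:
                    return (name, goal, subproofs)
            return None

        print(prove("A & B"))
        # ('conjI', 'A & B', [('axA', 'A', []), ('axB', 'B', [])])

    Replacing the equality test with unification over rules containing schematic variables is what turns this caricature into the resolution the abstract describes.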

    Attribute value phonology

    SIGLE. Available from the British Library Document Supply Centre (DSC:D93955 / BLDSC). United Kingdom.

    On the Localized superluminal Solutions to the Maxwell Equations

    In the first part of this article the various experimental sectors of physics in which Superluminal motions seem to appear are briefly mentioned, after a sketchy theoretical introduction. In particular, a panoramic view is presented of the experiments with evanescent waves (and/or tunneling photons), and with the "Localized superluminal Solutions" (SLS) to the wave equation, like the so-called X-shaped waves. In the second part of this paper we present a series of new SLSs to the Maxwell equations, suitable for arbitrary frequencies and arbitrary bandwidths, some of them endowed with finite total energy. Among them, we set forth an infinite family of generalizations of the classic X-shaped wave, and show how to deal with the case of a dispersive medium. Results of this kind may find application in other fields in which an essential role is played by a wave equation (like acoustics, seismology, geophysics, gravitation, elementary particle physics, etc.). This e-print, in large part a review, was prepared for the special issue on "Nontraditional Forms of Light" of the IEEE JSTQE (2003); a preliminary version of it appeared as Report NSF-ITP-02-93 (KITP, UCSB; 2002). Further material can be found in the recent e-prints arXiv:0708.1655v2 [physics.gen-ph] and arXiv:0708.1209v1 [physics.gen-ph]. The case of the very interesting (and, in a sense, more orthodox) subluminal Localized Waves, solutions to the wave equations, will be dealt with in a forthcoming paper. [Keywords: Wave equation; Wave propagation; Localized solutions to Maxwell equations; Superluminal waves; Bessel beams; Limited-dispersion beams; Electromagnetic wavelets; X-shaped waves; Finite-energy beams; Optics; Electromagnetism; Microwaves; Special relativity] Comment: LaTeX paper of 37 pages, with 20 figures in jpg [to be processed by PDFLaTeX].
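
    As a small numerical aside (an assumption for illustration, not taken from the paper, which constructs new finite-energy SLSs), the sketch below evaluates the simplest localized-wave building block named in the keywords, an ideal zeroth-order Bessel beam J0(k_rho*rho)*exp(i(k_z*z - omega*t)), which solves the scalar wave equation whenever k_rho^2 + k_z^2 = (omega/c)^2 and whose phase velocity omega/k_z = c/cos(theta) exceeds c; the frequency and cone angle chosen below are arbitrary.

        # Ideal zeroth-order Bessel beam: transverse profile J0(k_rho * rho).
        # Parameters (frequency, axicon angle) are arbitrary illustrative choices.
        import numpy as np
        from scipy.special import j0

        c = 299_792_458.0                 # speed of light, m/s
        omega = 2 * np.pi * 10e9          # 10 GHz angular frequency (arbitrary)
        theta = np.deg2rad(5.0)           # axicon (cone) angle (arbitrary)

        k = omega / c
        k_z = k * np.cos(theta)           # longitudinal wavenumber
        k_rho = k * np.sin(theta)         # transverse wavenumber
        v_phase = omega / k_z             # superluminal phase velocity along z

        rho = np.linspace(0.0, 0.5, 1000)         # radial coordinate, metres
        profile = j0(k_rho * rho)                 # transverse field at z = t = 0

        print(f"phase velocity / c      = {v_phase / c:.6f}")     # > 1
        print(f"central spot radius     ~ {2.405 / k_rho:.4f} m") # first zero of J0
        print(f"on-axis amplitude       = {profile[0]:.3f}")

    Superpositions of such Bessel beams with a fixed cone angle over a range of frequencies are a standard route to the X-shaped waves discussed in the abstract.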

    A Field Guide to Genetic Programming

    xiv, 233 p. : ill. ; 23 cm. Electronic book. A Field Guide to Genetic Programming (ISBN 978-1-4092-0073-4) is an introduction to genetic programming (GP). GP is a systematic, domain-independent method for getting computers to solve problems automatically, starting from a high-level statement of what needs to be done. Using ideas from natural evolution, GP starts from an ooze of random computer programs and progressively refines them through processes of mutation and sexual recombination until solutions emerge, all without the user having to know or specify the form or structure of solutions in advance. GP has generated a plethora of human-competitive results and applications, including novel scientific discoveries and patentable inventions. Contents: Introduction -- Representation, initialisation and operators in Tree-based GP -- Getting ready to run genetic programming -- Example genetic programming run -- Alternative initialisations and operators in Tree-based GP -- Modular, grammatical and developmental Tree-based GP -- Linear and graph genetic programming -- Probabilistic genetic programming -- Multi-objective genetic programming -- Fast and distributed genetic programming -- GP theory and its applications -- Applications -- Troubleshooting GP -- Conclusions -- Appendix A: Resources -- Appendix B: TinyGP -- Bibliography -- Index.
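
    Because the description above lays out the whole GP loop (random initial programs, fitness evaluation, selection, crossover and mutation), a compact sketch of tree-based GP on a toy symbolic-regression task may help fix the ideas. It is an assumed illustration, not TinyGP nor any code from the book; every parameter (population size, depths, rates, the target function) is an arbitrary choice.

        # Minimal tree-based GP for symbolic regression of f(x) = x*x + x.
        # Programs are nested tuples over {+, -, *} with terminals {'x', constants}.
        import random, operator

        random.seed(1)
        FUNCS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
        TERMS = ['x', 1.0, 2.0]
        CASES = [-1.0 + 0.2 * i for i in range(11)]     # fitness cases on [-1, 1]

        def target(x):
            return x * x + x                            # toy target (arbitrary)

        def random_tree(depth):
            """Grow-style random tree up to the given depth."""
            if depth == 0 or random.random() < 0.3:
                return random.choice(TERMS)
            op = random.choice(list(FUNCS))
            return (op, random_tree(depth - 1), random_tree(depth - 1))

        def evaluate(tree, x):
            if tree == 'x':
                return x
            if isinstance(tree, float):
                return tree
            op, left, right = tree
            return FUNCS[op](evaluate(left, x), evaluate(right, x))

        def fitness(tree):
            """Sum of absolute errors over the fitness cases (lower is better)."""
            return sum(abs(evaluate(tree, x) - target(x)) for x in CASES)

        def nodes(tree, path=()):
            """All node positions (paths of child indices) in the tree."""
            yield path
            if isinstance(tree, tuple):
                for i, child in enumerate(tree[1:], start=1):
                    yield from nodes(child, path + (i,))

        def get(tree, path):
            return tree if not path else get(tree[path[0]], path[1:])

        def put(tree, path, sub):
            if not path:
                return sub
            t = list(tree)
            t[path[0]] = put(tree[path[0]], path[1:], sub)
            return tuple(t)

        def crossover(a, b):
            """Subtree crossover: copy a random subtree of b into a random point of a."""
            return put(a, random.choice(list(nodes(a))),
                       get(b, random.choice(list(nodes(b)))))

        def mutate(tree):
            """Subtree mutation: replace a random subtree with a fresh random tree."""
            return put(tree, random.choice(list(nodes(tree))), random_tree(2))

        def depth(tree):
            return 0 if not isinstance(tree, tuple) else 1 + max(depth(tree[1]), depth(tree[2]))

        def tournament(pop, k=3):
            return min(random.sample(pop, k), key=fitness)

        pop = [random_tree(3) for _ in range(200)]
        for gen in range(30):
            best = min(pop, key=fitness)
            if fitness(best) < 1e-6:
                break
            nxt = [best]                                # elitism
            while len(nxt) < len(pop):
                child = crossover(tournament(pop), tournament(pop))
                if random.random() < 0.2:
                    child = mutate(child)
                if depth(child) > 8:                    # crude bloat control
                    child = random_tree(3)
                nxt.append(child)
            pop = nxt

        best = min(pop, key=fitness)
        print("best program:", best, "error:", fitness(best))

    The pieces map directly onto the steps listed in the contents above: random_tree is the initialisation, fitness and tournament the selection step, crossover and mutate the variation operators, and the depth cap stands in for the bloat-control techniques discussed later in the book.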

    Fourth NASA Langley Formal Methods Workshop

    This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.