
    The cycle contraction mapping theorem

    This report lays the foundation for a theory of total correctness for programs that is not based upon termination. The Cycle Contraction Mapping Theorem is both an extension of Wadge's cycle sum theorem for Kahn dataflow and a generalisation of Banach's contraction mapping theorem to a class of quasi-metric spaces definable using the symmetric partial metric distance function. This work provides considerable evidence that it is, after all, possible to construct a metric theory for Scott-style partial order domains.
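
    For orientation, Banach's contraction mapping theorem guarantees that a contraction on a complete metric space has a unique fixed point, reached by iterating the map. A minimal sketch of that classical iteration (the example map and tolerance are illustrative assumptions, not taken from the report):

```python
import math

# Sketch of Banach fixed-point iteration: for a contraction f (Lipschitz
# constant q < 1), the iterates x, f(x), f(f(x)), ... converge to the
# unique fixed point x* with f(x*) == x*.
def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    x = x0
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    raise RuntimeError("did not converge")

# Example: f(x) = cos(x) is a contraction on [0, 1] (|f'| <= sin 1 < 1),
# so the iteration converges to the unique solution of cos(x) = x.
root = fixed_point(math.cos, 1.0)
```

    The cycle contraction mapping theorem described above generalises this style of fixed-point argument from metric spaces to the quasi-metric setting of Scott domains.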

    The topology of partial metric spaces

    The T0 world of Scott's topological models, used in the denotational semantics of programming languages, may at first sight appear to have nothing whatever in common with the Hausdorff world of metric space theory. Can this be true, though, when the notion of "distance" is so important in the application of inductive proof theory to recursive definitions? This paper shows that existing work on the application of quasi-metrics to denotational semantics can be taken much further than merely describing Scott topologies. Using our "partial metric" we introduce a new approach, constructing each semantic domain as an Alexandrov topology "sandwiched" between two metric topologies. To be presented at the Eighth Summer Conference on General Topology and Applications, June 18-20, 1992, Queens College, New York City.
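
    As a concrete illustration of a partial metric (this is the standard textbook example on words, not a construction claimed by the paper): let p(x, y) = 2^(-k), where k is the length of the longest common prefix of x and y. A finite word then has non-zero self-distance p(x, x) = 2^(-|x|), quantifying how partially defined it is.

```python
# Illustrative partial metric on words: p(x, y) = 2**(-k), k = length of
# the longest common prefix.  Unlike an ordinary metric, p(x, x) need not
# be 0: a finite (partially defined) word x has self-distance 2**(-len(x)),
# and only a fully defined (infinite) word would have self-distance 0.
def common_prefix_len(x, y):
    k = 0
    for a, b in zip(x, y):
        if a != b:
            break
        k += 1
    return k

def p(x, y):
    return 2.0 ** (-common_prefix_len(x, y))

# A characteristic partial-metric axiom, p(x, x) <= p(x, y), holds here:
# "ab" is at no extra distance from its extension "abc".
assert p("ab", "ab") == 0.25
assert p("ab", "ab") <= p("ab", "abc")
```

    The self-distance p(x, x) plays the role of a "weight" measuring definedness, which is what lets such distances describe T0 order structure that an ordinary metric (where every self-distance is 0) cannot see.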

    Abstract interpretation and indeterminacy

    We present a semantic theory that allows us to discuss the semantics of indeterminate operators in a dataflow network. The assumption is made that the language in which the indeterminate operators are written has a construct that allows for the testing of availability of data on input lines. We then show that indeterminacy arises through the use of such an operator together with the fact that communication channels produce unpredictable delays in the transmission of data. Our scheme is to use special tokens called hiatons, which are then filtered out to obtain ordinary streams; this filtering process produces indeterminate behavior at the level of ordinary streams. We indicate how this can be justified using the formalism of abstract interpretation. We show that a particular fairness anomaly does not arise.
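
    The hiaton idea can be sketched in a few lines (the token representation below is an illustrative assumption, not the paper's formalism): a hiaton marks a "tick" on which no datum arrived, so two runs with different channel delays differ only in hiaton placement, and filtering the hiatons out collapses them to the same ordinary stream.

```python
# Sketch of hiaton filtering.  HIATON is a sentinel marking a tick where
# no data arrived on a channel; representation is illustrative only.
HIATON = object()

def filter_hiatons(stream):
    # Removing hiatons recovers the ordinary data stream; the timing
    # information (channel delays) is deliberately forgotten, which is
    # where indeterminacy appears at the ordinary-stream level.
    return [tok for tok in stream if tok is not HIATON]

run1 = [1, HIATON, 2, HIATON, HIATON, 3]   # slow channel
run2 = [HIATON, 1, 2, 3]                   # fast channel
assert filter_hiatons(run1) == filter_hiatons(run2) == [1, 2, 3]
```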

    Tameness in generalized metric structures

    We broaden the framework of metric abstract elementary classes (mAECs) in several essential ways, chiefly by allowing the metric to take values in a well-behaved quantale. As a proof of concept we show that the result of Boney and Zambrano on (metric) tameness under a large cardinal assumption holds in this more general context. We briefly consider a further generalization to partial metric spaces, and hint at connections to classes of fuzzy structures and structures on sheaves.

    Partial metrizability in value quantales

    Partial metrics are metrics except that the distance from a point to itself need not be 0. These are useful in modelling partially defined information, which often appears in computer science. We generalize this notion to study "partial metrics" whose values lie in a value quantale, which may be other than the reals. Then each topology arises from such a generalized metric, and for each continuous poset there is such a generalized metric whose topology is the Scott topology and whose dual topology is the lower topology. These are both corollaries to our result that a bitopological space is pairwise completely regular if and only if there is such a generalized metric whose topology is the first topology and whose dual topology is the second.
    This author wishes to acknowledge support for this research from the EPSRC of the United Kingdom (grant GR/S07117/01) and from the City University of New York (PSC-CUNY grant 64472-00 33).
    Kopperman, R. D.; Matthews, S.; Pajoohesh, H. (2004). Partial metrizability in value quantales. Applied General Topology 5(1): 115-127. https://doi.org/10.4995/agt.2004.2000

    A formal model of non-determinate dataflow computation

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1983. By Jarvis Dean Brock. Microfiche copy available in Archives and Engineering. Bibliography: leaves 75-78.

    A specification-based design tool for artificial neural networks.

    Wong Wai. Thesis (M.Phil.), Chinese University of Hong Kong, 1992. Includes bibliographical references (leaves 78-80).
    Chapter 1. Introduction
        1.1. Specification Environment
        1.2. Specification Analysis
        1.3. Outline
    Chapter 2. Survey
        2.1. Concurrence Specification
            2.1.1. Sequential Approach
            2.1.2. Mapping onto Concurrent Architecture
            2.1.3. Automatic Concurrence Introduction
        2.2. Specification Analysis
            2.2.1. Motivation
            2.2.2. Cyclic Dependency
    Chapter 3. The Design Tool
        3.1. Specification Environment
            3.1.1. Framework
                3.1.1.1. Formal Neurons
                3.1.1.2. Configuration
                3.1.1.3. Control Neuron
            3.1.2. Dataflow Specification
                3.1.2.1. Absence of Control Information
                3.1.2.2. Single-Valued Variables & Explicit Time Indices
                3.1.2.3. Explicit Notations
            3.1.3. User Interface
        3.2. Specification Analysis
            3.2.1. Data Dependency Analysis
            3.2.2. Attribute Analysis
    Chapter 4. BP-Net Specification
        4.1. BP-Net Paradigm
            4.1.1. Neurons of a BP-Net
            4.1.2. Configuration of BP-Net
        4.2. Constant Declarations
        4.3. Formal Neuron Specification
            4.3.1. Mapping the Paradigm
                4.3.1.1. Mapping Symbols onto Parameter Names
                4.3.1.2. Mapping Neuron Equations onto Internal Functions
            4.3.2. Form Entries
                4.3.2.1. Neuron Type Entry
                4.3.2.2. Input, Output and Internal Parameter Entries
                4.3.2.3. Initial Value Entry
                4.3.2.4. Internal Function Entry
        4.4. Configuration Specification
            4.4.1. Form Entries
                4.4.1.1. Neuron Label Entry
                4.4.1.2. Neuron Character Entry
                4.4.1.3. Connection Pattern Entry
            4.4.2. Characteristics of the Syntax
        4.5. Control Neuron Specification
            4.5.1. Form Entries
                4.5.1.1. Global Input, Output, Parameter & Initial Value Entries
                4.5.1.2. Input & Output File Entries
                4.5.1.3. Global Function Entry
    Chapter 5. Data Dependency Analysis
        5.1. Graph Construction
            5.1.1. Simplification and Normalization
                5.1.1.1. Removing Non-Essential Information
                5.1.1.2. Removing File Record Parameters
                5.1.1.3. Rearranging Temporal Offset
                5.1.1.4. Conservation of Temporal Relationship
                5.1.1.5. Zero/Negative Offset for Determining Parameters
            5.1.2. Internal Dependency Graphs (IDGs)
            5.1.3. IDG of Control Neuron (CnIDG)
            5.1.4. Global Dependency Graphs (GDGs)
        5.2. Cycle Detection
            5.2.1. BP-Net
            5.2.2. Other Examples
                5.2.2.1. The Perceptron
                5.2.2.2. The Boltzmann Machine
            5.2.3. Number of Cycles
                5.2.3.1. Different Number of Layers
                5.2.3.2. Different Network Types
            5.2.4. Cycle Length
                5.2.4.1. Different Number of Layers
                5.2.4.2. Comparison Among Different Networks
            5.2.5. Difficulties in Analysis
        5.3. Dependency Cycle Analysis
            5.3.1. Temporal Index Analysis
            5.3.2. Non-Temporal Index Analysis
                5.3.2.1. A Simple Example
                5.3.2.2. Single Parameter
                5.3.2.3. Multiple Parameters
            5.3.3. Combined Method
            5.3.4. Scheduling
                5.3.4.1. Algorithm
                5.3.4.2. Schedule for the BP-Net
        5.4. Symmetry in Graph Construction
            5.4.1. Basic Approach
            5.4.2. Construction of the BP-Net GDG
            5.4.3. Limitation
    Chapter 6. Attribute Analysis
        6.1. Parameter Analysis
            6.1.1. Internal Dependency Graphs (IDGs)
                6.1.1.1. Correct Properties of Parameters in IDGs
                6.1.1.2. Example
            6.1.2. Combined Internal Dependency Graphs (CIDG)
                6.1.2.1. Tests on Parameters of CIDG
                6.1.2.2. Example
            6.1.3. Finalized Neuron Obtained
            6.1.4. CIDG of the BP-Net
        6.2. Constraint Checking
            6.2.1. Syntactic, Semantic and Simple Checkings
                6.2.1.1. The Syntactic & Semantic Techniques
                6.2.1.2. Simple Matching
            6.2.2. Constraints
                6.2.2.1. Constraints on Formal Neuron
                6.2.2.2. Constraints on Configuration
                6.2.2.3. Constraints on Control Neuron
        6.3. Complete Checking Procedure
    Chapter 7. Conclusions
        7.1. Limitations
            7.1.1. Exclusive Conditional Dependency Cycles
            7.1.2. Maximum Parallelism
    Reference
    Appendix
        I. Form Syntax
            A. Syntax Conventions
            B. Form Definition
                1. Form Structure
                2. Constant Declaration
                3. Formal Neuron Declaration
                4. Configuration Declaration
                5. Control Neuron
                6. Supplementary Definition
        II. Algorithms
        III. Deadlock & Dependency Cycles
            A. Deadlock Prevention
                1. Necessary Conditions for Deadlock
                2. Resource Allocation Graphs
                3. Cycles and Blocked Requests
            B. Deadlock in ANN Systems
                1. Shared Resources
                2. Presence of the Necessary Conditions for Deadlocks
                3. Operation Constraint for Communication
                4. Checkings Required
            C. Data Dependency Graphs
                1. Simplifying Resource Allocation Graphs
                2. Expanding into Parameter Level
                3. Freezing the Request Edges
                4. Reversing the Edge Directions
                5. Mutual Dependency Cycles
        IV. Case Studies
            A. BP-Net
                1. Specification Forms
                2. Results After Simple Checkings
                3. Internal Dependency Graphs Construction
                4. Results From Parameter Analysis
                5. Global Dependency Graphs Construction
                6. Cycles Detection
                7. Time Subscript Analysis
                8. Subscript Analysis
                9. Scheduling
            B. Perceptron
                1. Specification Forms
                2. Results After Simple Checkings
                3. Internal Dependency Graphs Construction
                4. Results From Parameter Analysis
                5. Global Dependency Graph Construction
                6. Cycles Detection
                7. Time Subscript Analysis
                8. Subscript Analysis
                9. Scheduling
            C. Boltzmann Machine
                1. Specification Forms
                2. Results After Simple Checkings
                3. Graphs Construction
                4. Results From Parameter Analysis
                5. Global Dependency Graphs Construction
                6. Cycle Detection
                7. Time Subscript Analysis
                8. Subscript Analysis
                9. Scheduling

    An extensional treatment of dataflow deadlock

    We discuss deadlock with reference to a simple equational dataflow language, and devise a test (the cycle sum test) which is applied to the dependency graph of a program. We use Kahn's extensional semantics of dataflow and give a purely extensional (non-operational) proof that no program passing the cycle sum test can ever deadlock. The proof is based on the notions of size (length) and completeness in the domain of histories, and should extend to a much wider context.
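
    The cycle sum test can be sketched as a check on a weighted dependency graph: label each edge with the token surplus its operator introduces (in the Lucid tradition, a delay operator like `fby` contributes +1 and a look-ahead operator like `next` contributes -1; these weights are an illustrative convention, not quoted from the paper), and require every cycle to have a strictly positive sum. A small, exhaustive version suitable for the modest dependency graphs of a program:

```python
def cycle_sums(graph):
    """Return the weight sum of every simple cycle in a small digraph.
    graph: {node: [(successor, weight), ...]}.  Exhaustive DFS; each
    cycle is enumerated once, from its smallest node."""
    sums = []
    def dfs(start, node, total, visited):
        for succ, w in graph.get(node, []):
            if succ == start:
                sums.append(total + w)
            elif succ not in visited and succ > start:
                dfs(start, succ, total + w, visited | {succ})
    for start in sorted(graph):
        dfs(start, start, 0, {start})
    return sums

def passes_cycle_sum_test(graph):
    # The criterion: every cycle in the dependency graph must have a
    # strictly positive cycle sum.
    return all(s > 0 for s in cycle_sums(graph))

# X = 0 fby X  -> self-loop of weight +1 (one-step delay): passes.
assert passes_cycle_sum_test({"X": [("X", 1)]})
# X = next X   -> self-loop of weight -1 (look-ahead): deadlocks.
assert not passes_cycle_sum_test({"X": [("X", -1)]})
```

    The extensional result summarised above is what justifies this purely graph-level check: positivity of every cycle sum forces each history in the least fixed point to be complete, so no program passing the test can deadlock.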