
    Global semantic typing for inductive and coinductive computing

    Inductive and coinductive types are commonly construed as ontological (Church-style) types, denoting canonical data sets such as natural numbers, lists, and streams. For various purposes, notably the study of programs in the context of global semantics, it is preferable to think of types as semantical properties (Curry-style). Intrinsic theories were introduced in the late 1990s to provide a purely logical framework for reasoning about programs and their semantic types. We extend them here to data given by any combination of inductive and coinductive definitions. This approach is of interest because it fits tightly with the syntactic, semantic, and proof-theoretic fundamentals of formal logic, with potential applications in implicit computational complexity as well as in the extraction of programs from proofs. We prove a Canonicity Theorem, showing that the global definition of program typing, via the usual (Tarskian) semantics of first-order logic, agrees with the operational semantics of programs in the intended model. Finally, we show that every intrinsic theory is interpretable in a conservative extension of first-order arithmetic. This means that quantification over infinite data objects does not lead, on its own, to proof-theoretic strength beyond that of Peano Arithmetic. Intrinsic theories are perfectly amenable to formulas-as-types Curry-Howard morphisms, and were used to characterize major computational complexity classes. Their extensions, described here, have similar potential, which has already been applied.
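
    As a rough illustration of the inductive/coinductive distinction the abstract draws on (a minimal Haskell sketch; the names Nat, Stream, plus, from, and takeS are ours, not the paper's): an inductive type has values built by finitely many constructor applications and is consumed by recursion, while a coinductive type may have infinite values, produced by corecursion and consumed lazily.

```haskell
-- Illustrative sketch only; not the paper's formal framework.

-- Inductive type: every value is a finite stack of constructors.
data Nat = Z | S Nat deriving Show

plus :: Nat -> Nat -> Nat
plus Z     n = n
plus (S m) n = S (plus m n)   -- recursion on the inductive argument

-- Coinductive type: values may be infinite and are consumed lazily.
data Stream a = Cons a (Stream a)

from :: Nat -> Stream Nat
from n = Cons n (from (S n))  -- productive corecursion: one element per step

takeS :: Int -> Stream a -> [a]
takeS 0 _           = []
takeS k (Cons x xs) = x : takeS (k - 1) xs

main :: IO ()
main = print (takeS 3 (from Z))   -- [Z, S Z, S (S Z)]
```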

    Planning for behaviour-based robotic assembly: a logical framework


    Progress Report: 1991–1994


    Extending the Real-Time Maude Semantics of Ptolemy to Hierarchical DE Models

    This paper extends our Real-Time Maude formalization of the semantics of flat Ptolemy II discrete-event (DE) models to hierarchical models, including modal models. This is a challenging task that requires combining synchronous fixed-point computations with hierarchical structure. The synthesis of a Real-Time Maude verification model from a Ptolemy II DE model, and the formal verification of the synthesized model in Real-Time Maude, have been integrated into Ptolemy II, enabling a model-engineering process that combines the convenience of Ptolemy II DE modeling and simulation with formal verification in Real-Time Maude. Comment: In Proceedings RTRTS 2010, arXiv:1009.398
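
    At the core of DE execution is the processing of timestamped events in increasing time order, with reactions posting new future events. The Haskell sketch below shows only that event-queue idea, not the Real-Time Maude formalization or the synchronous fixed-point and hierarchical aspects the paper addresses; the clock actor is purely hypothetical.

```haskell
-- Minimal discrete-event loop: pop the earliest event, let an actor react,
-- and merge any produced events back into the time-ordered queue.
import Data.List (insertBy)
import Data.Ord (comparing)

type Time  = Double
type Event = (Time, String)          -- (timestamp, event label)

step :: (Event -> [Event]) -> [Event] -> Maybe (Event, [Event])
step _     []       = Nothing
step react (e:rest) = Just (e, foldr (insertBy (comparing fst)) rest (react e))

run :: (Event -> [Event]) -> [Event] -> [Event]
run react q = case step react q of
  Nothing        -> []
  Just (e, rest) -> e : run react rest

-- Hypothetical actor: a clock that re-fires every 1.0 time units until time 3.
clock :: Event -> [Event]
clock (t, "tick") | t < 3 = [(t + 1.0, "tick")]
clock _                   = []

main :: IO ()
main = mapM_ print (run clock [(0.0, "tick")])
```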

    Checking Zenon Modulo Proofs in Dedukti

    Dedukti has been proposed as a universal proof checker. It is a logical framework based on the lambda-Pi calculus modulo that is used as a backend to verify proofs coming from theorem provers, especially those implementing some form of rewriting. We present a shallow embedding into Dedukti of proofs produced by Zenon Modulo, an extension of the tableau-based first-order theorem prover Zenon to deduction modulo and typing. Zenon Modulo is applied to the verification of programs in both academic and industrial projects. The purpose of our embedding is to increase the confidence in automatically generated proofs by separating untrusted proof search from trusted proof verification. Comment: In Proceedings PxTP 2015, arXiv:1507.0837
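
    To make the "untrusted proof search, trusted proof checking" split concrete, here is a toy Haskell sketch (this is not Dedukti, the lambda-Pi calculus modulo, or Zenon Modulo's proof format): an external tool may emit arbitrary proof terms for a minimal implicational logic, but only terms accepted by the small checking kernel are believed.

```haskell
-- Toy proof terms for minimal implicational logic, with a tiny trusted kernel.
data Form  = Atom String | Imp Form Form deriving (Eq, Show)
data Proof = Var String
           | Lam String Form Proof
           | App Proof Proof
           deriving Show

type Ctx = [(String, Form)]

-- Trusted kernel: infer the formula proved by a proof term, if any.
infer :: Ctx -> Proof -> Maybe Form
infer ctx (Var x)     = lookup x ctx
infer ctx (Lam x a p) = Imp a <$> infer ((x, a) : ctx) p
infer ctx (App p q)   = case (infer ctx p, infer ctx q) of
  (Just (Imp a b), Just a') | a == a' -> Just b
  _                                   -> Nothing

main :: IO ()
main = print (infer [] (Lam "x" (Atom "A") (Var "x")))
-- Just (Imp (Atom "A") (Atom "A")), i.e. the identity proof of A -> A
```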

    CafeOBJ: Logical Foundations and Methodologies

    CafeOBJ is an executable, industrial-strength, multi-logic algebraic specification language which is a modern successor of OBJ and incorporates several new algebraic specification paradigms. In this paper we survey its logical foundations and present some of its methodologies.

    Rule-based Methodologies for the Specification and Analysis of Complex Computing Systems

    From the origins of hardware and software to the present day, the complexity of computing systems has been a problem that computer scientists, engineers, and programmers have had to confront. Important research areas have emerged and matured as a result of this effort. In this dissertation we address some of the current lines of research related to the analysis and verification of complex computing systems using formal methods and domain-specific languages. The thesis focuses on distributed systems, with particular interest in Web systems and biological systems. The first part of the thesis is devoted to security aspects and related techniques, specifically software certification. We first study resource access control systems and propose a language for specifying access control policies that are tightly coupled to knowledge bases and provide a semantics-aware description of the resources or elements being accessed. We also develop a novel framework for Code-Carrying Theory, a software certification methodology whose goal is to ensure the safe delivery of code in a distributed environment; our framework is based on a system for transforming rewriting theories by means of fold/unfold operations. The second part of the thesis concentrates on the analysis and verification of Web systems and biological systems. We propose an information filtering language that supports retrieving information from large data repositories and uses semantic information obtained from remote ontologies to refine the filtering process. We also study validation methods for checking the consistency of Web content with respect to syntactic and semantic properties. A further contribution is a language for defining and automatically checking semantic and syntactic constraints on the static content of a Web system. Finally, we consider biological systems and focus on a formalism based on rewriting logic for modeling and analyzing quantitative aspects of biological processes. To assess the effectiveness of all the proposed methodologies, special attention has been paid to the development of prototypes, implemented using rule-based languages. Baggi, M. (2010). Rule-based Methodologies for the Specification and Analysis of Complex Computing Systems [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/8964
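
    To give a flavour of the rule-based style the thesis builds on (an illustrative Haskell sketch only, not the thesis's rewriting-logic formalism; the species names and the binding rule are hypothetical), a system state can be modelled as a multiset of species and a rule as something that consumes and produces species.

```haskell
-- Toy rule-based model: the state is a multiset of species, and a rule
-- fires when it can consume its reactants, producing its products.
import qualified Data.Map as M

type State = M.Map String Int                 -- species name -> copy number
data Rule  = Rule { consumes :: [(String, Int)], produces :: [(String, Int)] }

apply :: Rule -> State -> Maybe State
apply (Rule cs ps) st
  | all enough cs = Just (foldr add (foldr sub st cs) ps)
  | otherwise     = Nothing
  where
    enough (x, n) = M.findWithDefault 0 x st >= n
    sub (x, n)    = M.adjust (subtract n) x
    add (x, n)    = M.insertWith (+) x n

-- Hypothetical rule: an enzyme E binds a substrate S to form a complex ES.
bind :: Rule
bind = Rule [("E", 1), ("S", 1)] [("ES", 1)]

main :: IO ()
main = print (apply bind (M.fromList [("E", 2), ("S", 3)]))
-- Just (fromList [("E",1),("ES",1),("S",2)])
```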

    Term rewriting systems: from Church-Rosser to Knuth-Bendix and beyond

    Term rewriting systems are important for the computability theory of abstract data types, for automatic theorem proving, and for the foundations of functional programming. In this short survey we present, starting from first principles, several of the basic notions and facts in the area of term rewriting. Our treatment, which will often be informal, covers abstract rewriting, Combinatory Logic, orthogonal systems, strategies, critical pair completion, and some extended rewriting formats.
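
    Since Combinatory Logic is among the survey's topics, a small Haskell sketch of its two rewrite rules may help fix the basic notions of redex, rewrite step, and normal form (the function names below are ours, not the survey's).

```haskell
-- Combinatory Logic terms and the two standard rewrite rules:
--   K x y   ->  x
--   S x y z ->  x z (y z)
data Term = S | K | App Term Term deriving Show

-- One leftmost-outermost rewrite step, if a redex exists.
step :: Term -> Maybe Term
step (App (App K x) _)         = Just x
step (App (App (App S x) y) z) = Just (App (App x z) (App y z))
step (App f a)                 = case step f of
  Just f' -> Just (App f' a)
  Nothing -> App f <$> step a
step _                         = Nothing

-- Normalize by repeated rewriting (may diverge if no normal form exists).
normalize :: Term -> Term
normalize t = maybe t normalize (step t)

main :: IO ()
main = print (normalize (App (App (App S K) K) K))  -- S K K x rewrites to x
```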