85 research outputs found

    Foundational (co)datatypes and (co)recursion for higher-order logic

    Get PDF
    We describe a line of work that started in 2011 towards enriching Isabelle/HOL's language with coinductive datatypes, which allow infinite values, and with a more expressive notion of inductive datatype than previously supported by any system based on higher-order logic. These (co)datatypes are complemented by definitional principles for (co)recursive functions and reasoning principles for (co)induction. In contrast with other systems offering codatatypes, our approach requires no additional axioms or extensions of the logic.
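    As a language-neutral illustration only (the paper's (co)datatypes are defined inside Isabelle/HOL, which is not shown here), Python generators give a rough analogue of corecursion: each definition below is productive, emitting one observation per step of an infinite stream, and infinite values are consumed only through finite observations. The names `nats` and `fib` are our own.

```python
from itertools import islice

def nats(n=0):
    """Corecursive-style definition of the infinite stream 0, 1, 2, ...:
    each step produces one observation (the head) and defers the rest."""
    while True:
        yield n
        n += 1

def fib():
    """Another productive stream: the Fibonacci numbers."""
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

# Infinite values can only be inspected through finite observations:
print(list(islice(nats(), 5)))  # [0, 1, 2, 3, 4]
print(list(islice(fib(), 7)))   # [0, 1, 1, 2, 3, 5, 8]
```

    In Isabelle/HOL itself, the analogous definitions would use the codatatype and primcorec commands, with coinduction as the associated reasoning principle.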

    TWAM: A Certifying Abstract Machine for Logic Programs

    Full text link
    Type-preserving (or typed) compilation uses typing derivations to certify correctness properties of compilation. We have designed and implemented a type-preserving compiler for a simply-typed dialect of Prolog we call T-Prolog. The crux of our approach is a new certifying abstract machine which we call the Typed Warren Abstract Machine (TWAM). The TWAM has a dependent type system strong enough to specify the semantics of a logic program in the logical framework LF. We present a soundness metatheorem which constitutes a partial correctness guarantee: well-typed programs implement the logic program specified by their type. This metatheorem justifies our design and implementation of a certifying compiler from T-Prolog to TWAM. (41 pages; under submission to ACM Transactions on Computational Logic.)
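    The TWAM itself is not reproduced here; as a hedged sketch of the source-language territory only, the following Python fragment implements first-order syntactic unification, the core operation any WAM-style machine for Prolog must ultimately certify. All names (`unify`, `walk`, `is_var`) are our own illustrative choices, not part of the paper.

```python
def is_var(t):
    """Variables are strings starting with an uppercase letter."""
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Chase variable bindings to their current value."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(t1, t2, subst):
    """Syntactic unification; compound terms are tuples (functor, *args).
    Returns an extended substitution, or None on failure.
    (Occurs check omitted for brevity.)"""
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return {**subst, t1: t2}
    if is_var(t2):
        return {**subst, t2: t1}
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and len(t1) == len(t2) and t1[0] == t2[0]):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

# Unify cons(X, nil) with cons(a, Ys):
s = unify(("cons", "X", "nil"), ("cons", "a", "Ys"), {})
print(s)  # {'X': 'a', 'Ys': 'nil'}
```

    In the TWAM setting, the point is that such operations come with dependent types in LF, so a well-typed compiled program provably implements the logic program its type specifies.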

    Refinement Types for Logical Frameworks and Their Interpretation as Proof Irrelevance

    Full text link
    Refinement types sharpen systems of simple and dependent types by offering expressive means to more precisely classify well-typed terms. We present a system of refinement types for LF in the style of recent formulations where only canonical forms are well-typed. Both the usual LF rules and the rules for type refinements are bidirectional, leading to a straightforward proof of decidability of typechecking even in the presence of intersection types. Because we insist on canonical forms, structural rules for subtyping can now be derived rather than being assumed as primitive. We illustrate the expressive power of our system with examples and validate its design by demonstrating a precise correspondence with traditional presentations of subtyping. Proof irrelevance provides a mechanism for selectively hiding the identities of terms in type theories. We show that LF refinement types can be interpreted as predicates using proof irrelevance, establishing a uniform relationship between two previously studied concepts in type theory. The interpretation and its correctness proof are surprisingly complex, lending support to the claim that refinement types are a fundamental construct rather than just a convenient surface syntax for certain uses of proof irrelevance.
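    As an informal analogy only (the paper works in LF with canonical forms and bidirectional rules, none of which Python models faithfully), refinements can be pictured as predicates over the terms of a simple type, with intersection as conjunction and membership checking yielding no proof object — the "proof irrelevance" reading. All names below are hypothetical.

```python
# Refinements of a simple type, modeled as predicates over its terms.
def even(n): return n % 2 == 0
def pos(n): return n > 0

def intersect(*refinements):
    """Intersection refinement: a term checks against r1 /\ r2 /\ ...
    iff it checks against every component."""
    return lambda t: all(r(t) for r in refinements)

def checks(term, refinement):
    """'Proof-irrelevant' membership: we learn only *that* the term
    inhabits the refinement, never a distinguished proof witness."""
    return bool(refinement(term))

even_and_pos = intersect(even, pos)
print(checks(4, even_and_pos))   # True
print(checks(-2, even_and_pos))  # False
```

    The paper's contribution is to make this picture precise inside LF and to prove the interpretation correct.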

    A Narrative Study of Emotions Associated with Negative Childhood Experiences Reported in the Adult Attachment Interview

    Get PDF
    Attachment patterns, which tend to be stable over time, are passed from one generation to the next. Secure attachment has been linked to adaptive social functioning and has been identified as a protective factor against mental illness. The parents’ state of mind with regard to attachment—as measured with the Adult Attachment Interview (AAI) (Main, Goldwyn, & Hesse, 2002)—predicts the attachment classification for the infant in Ainsworth’s Strange Situation Procedure (Ainsworth, Blehar, Waters, & Wall, 1978). Earned-secure individuals have overcome negative childhood experiences to achieve a secure state of mind in adulthood. Earned security, like continuous security, strongly predicts infant security in the next generation. Preoccupied anger is one of the main constructs measured in the AAI that may lead to classification of an insecure, preoccupied state of mind. The current study was an analysis of the narratives of eight individuals whose AAIs indicated mild to high scores for preoccupied anger. All of these individuals have spent considerable energy and resources in grappling with negative childhood experiences. Participants were interviewed regarding how their feelings changed over time and what, if any, events contributed to how their feelings changed. For most participants, the emergence of sustained subjective anger was reported in late adolescence, or even adulthood. Those whose transcripts were judged earned-secure at the time of the study were associated with narratives that indicated progressive gains in Hoffman’s (2008) stages of empathy and Perry’s (1968) scheme for intellectual and ethical development. Reappraisal was identified as a key emotional regulation strategy that contributed to security. Supports for executive function also featured as important factors in the attainment of therapeutic goals. 
Attachment researchers may be especially interested to note that Hoffman’s stages emerged as a possible link between metacognitive processes for earned- and continuous-secure individuals alike. In contrast, the study’s findings regarding integrative processes associated with post-formal cognitive development, and mediators for implicit learning as predictors of behavior, suggest that earned security may be a different construct from continuous security. The results of this study hold important implications for treatment and social policy. The electronic version of this dissertation is at OhioLink ETD Center, www.ohiolink.edu/e

    The mathematicization of nature.

    Get PDF
    This thesis defends the Quine-Putnam indispensability argument for mathematical realism and introduces a new indispensability argument for a substantial conception of truth. Chapters 1 and 2 formulate the main components of the Quine-Putnam argument, namely that virtually all scientific laws quantify over mathematical entities and thus logically presuppose the existence thereof. Chapter 2 contains a detailed discussion of the logical structure of some scientific theories that incorporate or apply mathematics. Chapter 3 then reconstructs the central assumptions of Quine's argument, concluding (provocatively) that "science entails platonism". Chapter 4 contains a brief discussion of some major theories of truth, including deflationary views (redundancy, disquotation). Chapter 5 introduces a new argument against such deflationary views, based on certain logical properties of truth theories. Chapter 6 contains a further discussion of mathematical truth, in particular non-standard conceptions such as "if-thenism" and "hermeneuticism". Chapter 7 introduces the programmes of reconstrual and reconstruction proposed by recent nominalism. Chapter 8 discusses modal nominalism, concluding that modalism is implausible as an interpretation of mathematics (if taken seriously, it suffers from exactly those epistemological problems allegedly suffered by realism). Chapter 9 discusses Field's deflationism, whose central motivating idea is that mathematics is (pace Quine and Putnam) dispensable in applications. This turns on a conservativeness claim which, as Shapiro pointed out in 1983, must be incorrect (using Gödel's Theorems). I conclude in Chapter 10 that nominalistic views of mathematics and deflationist views of truth are both inadequate to the overall explanatory needs of science.

    2017-2018 Boise State University Undergraduate Catalog

    Get PDF
    This catalog is directed primarily at students. However, it serves many audiences, such as high school counselors, academic advisors, and the public. In this catalog you will find an overview of Boise State University and information on admission, registration, grades, tuition and fees, financial aid, housing, student services, and other important policies and procedures. However, most of this catalog is devoted to describing the various programs and courses offered at Boise State.

    Safe Compositional Network Sketches: The Formal Framework

    Full text link
    NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system while retaining sufficient information about it to carry out future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis. The compositional analysis is based on a strongly-typed Domain-Specific Language (DSL) for describing and reasoning about constrained-flow networks at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity. In a companion paper [6], we overview NetSketch, highlight its salient features, and illustrate how it could be used in two applications: the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications). National Science Foundation (CNS-0952145, CCF-0820138, CSR-0720604, EFRI-0735974).
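    As a toy sketch only, with all names hypothetical (NetSketch's DSL is strongly typed and considerably richer), the compositional idea can be pictured as interval constraints on flows: a series composition of two components is certified safe by a local containment check rather than by re-analyzing the whole system, trading exactness for scalability.

```python
# Toy model: a component emits flow within an output interval and
# tolerates flow within an input interval. All names are hypothetical.

class Component:
    def __init__(self, name, out_lo, out_hi, in_lo, in_hi):
        self.name = name
        self.out = (out_lo, out_hi)     # flow the component may emit
        self.accepts = (in_lo, in_hi)   # flow the component tolerates

def safe_series(upstream, downstream):
    """Compositional safety check for a series connection: the whole
    upstream output range must fit inside the downstream input range."""
    lo, hi = upstream.out
    alo, ahi = downstream.accepts
    return alo <= lo and hi <= ahi

producer = Component("producer", 2, 8, 0, 0)
shaper   = Component("shaper",   0, 0, 1, 10)
print(safe_series(producer, shaper))  # True: [2, 8] fits in [1, 10]
```

    The containment check is local: swapping in an outsourced subsystem only requires revalidating its interface intervals, not the whole network.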