
    A Type-Theoretic Approach to Structural Resolution

    Structural resolution (or S-resolution) is a newly proposed alternative to SLD-resolution that allows a systematic separation of derivations into term-matching and unification steps. Productive logic programs are those for which term-matching reduction on any query must terminate. For productive programs with coinductive meaning, finite term-rewriting reductions can be seen as measures of observation in an infinite derivation. The ability to handle corecursion in a productive way is an attractive computational feature of S-resolution. In this paper, we take first steps towards a better conceptual understanding of the operational properties of S-resolution as compared to SLD-resolution. To this aim, we propose a type system for the analysis of both SLD-resolution and S-resolution. We formulate S-resolution and SLD-resolution as reduction systems, and show their soundness relative to the type system. One of the central methods of this paper is realizability transformation, which makes logic programs productive and non-overlapping. We show that S-resolution and SLD-resolution are only equivalent for programs with these two properties. Comment: LOPSTR 201
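The distinction the abstract rests on can be made concrete. The following sketch (my own illustration, not the paper's formal system) contrasts the two reduction steps that S-resolution separates: term matching, which may instantiate variables only in the clause head, and full unification, which may instantiate variables on both sides. No occurs check is performed, for brevity.

```python
def is_var(t):
    # convention: variables are capitalized strings, as in Prolog
    return isinstance(t, str) and t[0].isupper()

def walk(t, s):
    # resolve a variable through the substitution s
    while is_var(t) and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    # two-sided unification: variables on either side may be bound
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        return {**s, a: b}
    if is_var(b):
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

def match(pattern, term, s):
    # one-sided unification: only variables of `pattern` may be bound
    pattern = walk(pattern, s)
    if is_var(pattern):
        return {**s, pattern: term}
    if pattern == term:
        return s
    if isinstance(pattern, tuple) and isinstance(term, tuple) and len(pattern) == len(term):
        for p, t in zip(pattern, term):
            s = match(p, t, s)
            if s is None:
                return None
        return s
    return None

# The clause head nat(s(X)) matches the ground query nat(s(0)) ...
print(match(('nat', ('s', 'X')), ('nat', ('s', '0')), {}))  # {'X': '0'}
# ... but matching fails when the query contains the variable,
# while unification still succeeds by instantiating the query:
print(match(('nat', ('s', '0')), ('nat', 'Y'), {}))         # None
print(unify(('nat', ('s', '0')), ('nat', 'Y'), {}))         # {'Y': ('s', '0')}
```

The asymmetry shown in the last three lines is exactly what lets S-resolution treat term-matching steps as terminating "observations" even when the full derivation is infinite.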

    Interorganizational Policy Studies: Lessons Drawn from Implementation Research

    Contingency approaches to organizing suggest that policy objectives are more likely to be achieved if the structures employed for implementation mesh with the policy objectives being sought. Interorganizational arrangements are used increasingly in carrying out public programs, and contingency logic can be used to assess the degree of match between policy objective and interunit structure. Such a perspective would seem to offer an approach of practical significance. Here the contingency logic as applied to interorganizational implementation is reviewed and its assumptions are identified. To probe these assumptions, empirical evidence is analyzed from one policy sector that would seem especially promising. The findings suggest that even under highly favorable conditions, a contingency perspective provides only limited help. The research demonstrates the need for additional conceptual clarification and theoretical care in reaching conclusions about the impact of interorganizational structures on policy settings.

    Specifying Logic Programs in Controlled Natural Language

    Writing specifications for computer programs is not easy since one has to take into account the disparate conceptual worlds of the application domain and of software development. To bridge this conceptual gap we propose controlled natural language as a declarative and application-specific specification language. Controlled natural language is a subset of natural language that can be accurately and efficiently processed by a computer, but is expressive enough to allow natural usage by non-specialists. Specifications in controlled natural language are automatically translated into Prolog clauses, hence become formal and executable. The translation uses a definite clause grammar (DCG) enhanced by feature structures. Inter-text references of the specification, e.g. anaphora, are resolved with the help of discourse representation theory (DRT). The generated Prolog clauses are added to a knowledge base. We have implemented a prototypical specification system that successfully processes the specification of a simple automated teller machine. Comment: 16 pages, compressed, uuencoded Postscript, published in Proceedings CLNLP 95, COMPULOGNET/ELSNET/EAGLES Workshop on Computational Logic for Natural Language Processing, Edinburgh, April 3-5, 199
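The core idea of the pipeline can be shown in miniature. The toy translator below (a deliberately crude illustration; the actual system uses a feature-structure DCG with DRT-based anaphora resolution, which this does not attempt) maps one fixed controlled-English sentence shape onto an executable Prolog clause, including a naive stem for the third-person verb.

```python
import re

def translate(sentence):
    # Handles only the hypothetical pattern "Every NOUN VERBs a NOUN.";
    # anything outside this tiny controlled subset is rejected.
    m = re.fullmatch(r"Every (\w+) (\w+)s a (\w+)\.", sentence)
    if not m:
        raise ValueError("sentence outside the controlled subset")
    noun, verb, obj = m.groups()
    # universal quantification becomes a Prolog rule over a fresh variable X
    return f"{verb}(X, {obj}) :- {noun}(X)."

print(translate("Every customer owns a card."))
# own(X, card) :- customer(X).
```

Even this caricature shows why controlled natural language pays off: because the admissible sentence shapes are fixed in advance, each one can be given a single, unambiguous logical reading.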

    Rationalizing Racism: Arizona Representatives' Employment of Euphemisms for an Assault on Mexican American Studies

    This study details the political climate and logic priming the termination of Mexican American Studies in elementary and high school programs within the state of Arizona. The author applies conceptual content analysis and intertextuality to decode euphemisms employed by opponents of the program. Primary sources by the state's Attorney General Tom Horne and Superintendent of Public Instruction John Huppenthal are examined for the rationales used in the elimination of a pedagogically empowering program for Latina/o students within the Tucson Unified School District. Repetitive paradoxes in arguments against Mexican American Studies are found to have implicitly framed the program as a threat to the majority. The reasoning in public statements by the aforementioned politicians, and the frames used in discussion of the program, are concluded to have appealed to mainstream audiences as a decoy for the underlying motive of maintaining current power structures, with Latina/os subjugated to lower socio-economic statuses than their White counterparts.

    Loyalty Programme Applications in Indian Service Industry

    Retaining all customers would not be a good idea for any business. In contrast, allowing the profitable customers to leave would be an even worse idea. Consequently, the real solution lies in knowing the value of each customer and then focusing loyalty efforts on those customers. Customers are more likely to be loyal to a group of brands than to a single brand. This is particularly true if the chosen brand is the category leader and costs more. In contrast to the one-brand-for-life mentality of the past, today's consumers are blatant in their divided loyalties, for their own safety and pleasure. The conceptual framework presented here helps to explain the evolving logic of loyalty programmes and the process of implementing them. Applications in different service industries for building and sustaining loyalty provide an overview of the status of such programmes.

    Using collaborative logic analysis evaluation to test the program theory of an intensive interdisciplinary pain treatment for youth with pain‐related disability

    Abstract: Pediatric pain rehabilitation programs are complex and involve multiple stakeholders. Mapping the program components to its anticipated outcomes (i.e., its theory) can be difficult and requires stakeholder engagement. Evidence is lacking, however, on how best to engage them. Logic analysis, a theory-based evaluation which tests the coherence of a program's theory using scientific evidence and experiential knowledge, may hold some promise. Its use is rare in pediatric pain rehabilitation, and few methodological details are available. This article provides a description of a collaborative logic analysis methodology used as the first step in the evaluation of an intensive interdisciplinary pain treatment program designed for youth with pain-related disability. A three-step direct logic analysis process was used. A 13-member expert panel, composed of clinicians, teachers, managers, and youth with pain-related disability and their parents, was engaged in each step. First, a logic model was constructed through document analysis, expert panel surveys, and focus-group discussions. Then, a scoping review focused on pediatric self-management, building self-efficacy, and fostering participation helped create a conceptual framework. Finally, the expert panel examined the logic model against the conceptual framework, and recommendations were formulated. Overall, the collaborative logic analysis process helped raise awareness of clinicians' assumptions about the program's causal mechanism; identified the program components most valued by youth and their parents; recognized the program features supported by scientific and experiential knowledge; detected gaps; and highlighted emerging trends. In addition to providing a consumer-focused program-evaluation option, collaborative logic analysis methodology holds promise as a novel strategy for engaging stakeholders and translating pediatric pain rehabilitation evaluation research knowledge to key stakeholders.

    Formal verification of mathematical software

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating-point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating-point arithmetic, called the asymptotic paradigm, was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.
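To give a flavor of the kind of property such verification must discharge, the sketch below checks one instance of the conventional "standard model" of floating-point error analysis (note: this is the textbook model, not the paper's asymptotic paradigm): a single IEEE-754 double-precision addition satisfies fl(a + b) = (a + b)(1 + delta) with |delta| <= u, where u is the unit roundoff. Exact rational arithmetic is used as the reference.

```python
from fractions import Fraction

u = 2.0 ** -53  # unit roundoff for IEEE-754 double precision

def rounding_error(a, b):
    # relative error of one floating-point addition, computed exactly
    exact = Fraction(a) + Fraction(b)   # exact rational sum of the inputs
    computed = Fraction(a + b)          # the floating-point result
    if exact == 0:
        return Fraction(0)
    return abs((computed - exact) / exact)

# The relative error of a single addition never exceeds u:
print(rounding_error(0.1, 0.2) <= u)  # True
```

A verification-condition generator for mathematical software would, in effect, thread bounds like this one through every arithmetic operation of a program and ask a prover to confirm the accumulated error stays within specification.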

    The Shadow Knows: Refinement and security in sequential programs

    Abstract: Stepwise refinement is a crucial conceptual tool for system development, encouraging program construction via a number of separate correctness-preserving stages which ideally can be understood in isolation. A crucial conceptual component of security is an adversary's ignorance of concealed information. We suggest a novel method of combining these two ideas. Our suggestion is based on a mathematical definition of "ignorance-preserving" refinement that extends classical refinement by limiting an adversary's access to concealed information: moving from specification to implementation should never increase that access. The novelty is the way we achieve this in the context of sequential programs. Specifically, we give an operational model (and detailed justification for it), a basic sequential programming language and its operational semantics in that model, a "logic of ignorance" interpreted over the same model, then a program-logical semantics bringing those together, and finally we use the logic to establish, via refinement, the correctness of a real (though small) protocol: Rivest's Oblivious Transfer. A previous report treated Chaum's Dining Cryptographers similarly. In passing, we solve the Refinement Paradox for sequential programs.
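The problem the abstract describes can be modeled in a few lines. In the toy model below (my own illustration, not the paper's formalism), an adversary who sees only the visible output learns the set of hidden values consistent with it. A "refinement" that resolves a specification's nondeterminism using the secret shrinks that set, which is exactly the kind of step ignorance-preserving refinement must reject even though the implementation's outputs are all permitted by the specification.

```python
HIDDEN = {0, 1}  # possible values of the concealed variable h

def spec(h):
    # specification: nondeterministically output 0 or 1, regardless of h
    return {0, 1}

def leaky_impl(h):
    # resolves the choice using the secret: every output it produces is
    # allowed by spec, yet it tells the adversary everything about h
    return {h}

def knowledge(program, observed):
    # hidden values the adversary still considers possible after
    # seeing the visible output `observed`
    return {h for h in HIDDEN if observed in program(h)}

print(knowledge(spec, 0))        # {0, 1}: adversary learns nothing
print(knowledge(leaky_impl, 0))  # {0}: the secret is revealed
```

This is one face of the Refinement Paradox: under classical refinement alone, leaky_impl refines spec, so a notion of refinement for security must additionally demand that the adversary's knowledge set never shrinks.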

    The Role of Deontic Logic in the Specification of Information Systems

    In this paper we discuss the role that deontic logic plays in the specification of information systems, either because constraints on the system directly concern norms or, even more importantly, because system constraints are considered ideal but violable (so-called 'soft' constraints). To overcome the traditional problems with deontic logic (the so-called paradoxes), we first state the importance of distinguishing between ought-to-be and ought-to-do constraints, and next focus on the most severe paradox, the so-called Chisholm paradox, involving contrary-to-duty norms. We present a multi-modal extension of standard deontic logic (SDL) to represent the ought-to-be version of the Chisholm set properly. For the ought-to-do variant we employ a reduction to dynamic logic, and show how the Chisholm set can be treated adequately in this setting. Finally, we discuss a way of integrating both ought-to-be and ought-to-do reasoning, enabling one to draw conclusions from ought-to-be constraints to ought-to-do ones, and show the use(fulness) of this approach by means of an example.
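Why the Chisholm set is paradoxical for plain SDL can be checked by brute force. The sketch below (standard SDL semantics only, not the paper's multi-modal extension) evaluates O(phi) as "phi holds at every deontically accessible world" and enumerates all two-world models with a serial accessibility relation: no such model satisfies both O(p) and O(not p). Since the naive SDL reading of the Chisholm set derives exactly such a pair of obligations, the set is inconsistent there, which is what the paper's finer-grained treatment is designed to avoid.

```python
from itertools import product

def O(phi, w, R):
    # SDL box: O(phi) holds at w iff phi holds at every world accessible from w
    return all(phi(v) for v in R[w])

worlds = [0, 1]
# seriality: every world's set of accessible worlds must be nonempty
serial_choices = [frozenset(s) for s in ([0], [1], [0, 1])]

found = False
for bits in product([False, True], repeat=2):          # all valuations of p
    p = lambda v, bits=bits: bits[v]
    not_p = lambda v, p=p: not p(v)
    for r0, r1 in product(serial_choices, repeat=2):   # all serial relations
        R = {0: r0, 1: r1}
        if O(p, 0, R) and O(not_p, 0, R):
            found = True

print(found)  # False: no serial model satisfies O(p) and O(~p) together
```

Seriality is what does the work here: with at least one ideal world accessible, p and not p cannot both hold everywhere in the accessible set, so conflicting obligations collapse into inconsistency rather than coexisting, as contrary-to-duty scenarios require.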