
    Confluence Modulo Equivalence in Constraint Handling Rules

    Previous results on proving confluence for Constraint Handling Rules are extended in two ways in order to allow a larger and more realistic class of CHR programs to be considered confluent. Firstly, we introduce the relaxed notion of confluence modulo equivalence into the context of CHR: while confluence for a terminating program means that all alternative derivations for a query lead to the exact same final state, confluence modulo equivalence only requires the final states to be equivalent with respect to an equivalence relation tailored for the given program. Secondly, we allow non-logical built-in predicates such as var/1 and incomplete ones such as is/2, which are ignored in previous work on confluence. To this end, a new operational semantics for CHR is developed which includes such predicates. In addition, this semantics differs from earlier approaches by its simplicity without loss of generality, and it may also be recommended for future studies of CHR. For the purely logical subset of CHR, proofs can be expressed in first-order logic, which we show is not sufficient in the present case. We have introduced a formal meta-language that allows reasoning about abstract states and derivations with meta-level restrictions that reflect the non-logical and incomplete predicates. This language represents subproofs as diagrams, which facilitates a systematic enumeration of proof cases, pointing forward to mechanical support for such proofs.
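
    The core idea can be illustrated outside CHR itself. The following Python sketch (illustrative only, not CHR syntax) shows a program where two derivation orders for the same query reach different final states that are nonetheless equal modulo a program-tailored equivalence relation, here "same elements regardless of order":

```python
# Sketch: two derivations collect the same constraints into an
# accumulator but consume them in different orders. The final states
# differ as data structures, yet are equal modulo the chosen
# equivalence relation (multiset equality).

def derive(query, pick_order):
    """Consume the query's constraints in the given order."""
    items = list(query)
    return [items[i] for i in pick_order]

query = ["a", "b", "c"]
s1 = derive(query, [0, 1, 2])   # one derivation
s2 = derive(query, [2, 0, 1])   # an alternative derivation

def equivalent(x, y):
    """Equivalence relation tailored to this program: order is irrelevant."""
    return sorted(x) == sorted(y)

assert s1 != s2            # not confluent in the strict sense
assert equivalent(s1, s2)  # but confluent modulo equivalence
```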

    Programming Wireless Sensor Networks with Logical Neighborhoods

    Wireless sensor network (WSN) architectures often feature a (single) base station in charge of coordinating the application functionality. Although this assumption simplified the path to adoption of WSN technology, researchers are now being attracted by more decentralized architectures with multiple sinks and heterogeneous nodes. These scenarios are brought to an extreme in Wireless Sensor and Actor Networks (WSANs), where sensing and acting nodes collaborate in a decentralized fashion to implement complex control loops. In these settings, new programming abstractions are required to manage complexity and heterogeneity without sacrificing efficiency. In this paper we propose and define a logical neighborhood programming abstraction. A logical neighborhood includes nearby nodes that satisfy predicates over their static (e.g., type) or dynamic (e.g., sensed values) characteristics. The span of the neighborhood and the definition of its predicates are specified declaratively, along with requirements about the cost of the communication involved. Logical neighborhoods enable the programmer to “illuminate” different areas of the network according to the application needs, effectively replacing the physical neighborhood provided by wireless broadcast with a higher-level, application-defined notion of proximity. This paper presents the definition of a declarative language for specifying logical neighborhoods, highlighting its expressiveness, flexibility and simplicity. Moreover, although the language constructs are readily implemented using existing communication mechanisms, we briefly report on a novel routing scheme we expressly designed to efficiently support our abstractions.
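
    The abstraction can be sketched as follows. This Python fragment is illustrative only (the node attributes and function names are hypothetical, not the paper's actual declarative language): a neighborhood is a predicate over static and dynamic node characteristics, bounded by a communication cost:

```python
# Sketch of a logical neighborhood: select nodes satisfying a predicate
# over static (type) and dynamic (sensed value) attributes, within a
# communication-cost bound. All names here are illustrative.

from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    node_type: str      # static characteristic
    temperature: float  # dynamic characteristic
    hops: int           # cost of reaching this node

def logical_neighborhood(nodes, predicate, max_cost):
    """Return the nodes that satisfy the predicate within the cost bound."""
    return [n for n in nodes if n.hops <= max_cost and predicate(n)]

nodes = [
    Node(1, "temperature-sensor", 31.0, 1),
    Node(2, "actuator",            0.0, 1),
    Node(3, "temperature-sensor", 18.5, 2),
    Node(4, "temperature-sensor", 35.2, 4),  # outside the cost bound
]

hot_sensors = logical_neighborhood(
    nodes,
    predicate=lambda n: n.node_type == "temperature-sensor" and n.temperature > 30,
    max_cost=3,
)
assert [n.node_id for n in hot_sensors] == [1]
```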

    Eligibility and inscrutability

    The philosophy of intentionality asks questions such as: in virtue of what does a sentence, picture, or mental state represent that the world is a certain way? The subquestion I focus upon here concerns the semantic properties of language: in virtue of what does a name such as ‘London’ refer to something or a predicate such as ‘is large’ apply to some object? This essay examines one kind of answer to this “metasemantic” question: interpretationism, instances of which have been proposed by Donald Davidson, David Lewis, and others. I characterize the “two-step” form common to such approaches and briefly say how two versions described by David Lewis fit this pattern. Then I describe a fundamental challenge to this approach: a “permutation argument” that contends, by interpretationist lights, there can be no fact of the matter about lexical content (e.g., what individual words refer to). Such a thesis cannot be sustained, so the argument threatens a reductio of interpretationism. In the second part of the article, I will give what I take to be the best interpretationist response to the inscrutability paradox: David Lewis’s appeal to the differential “eligibility” of semantic theories. I contend that, given an independently plausible formulation of interpretationism, the eligibility response is an immediate consequence of Lewis’s general analysis of the theoretical virtue of simplicity. In the final sections of the article, I examine the limitations of Lewis’s response. By focusing on an alternative argument for the inscrutability of reference, I am able to describe conditions under which the eligibility result will deliver the wrong results. In particular, if the world is complex enough and our language sufficiently simple, then reference may be determinately secured to the wrong things.
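
    The formal core of the permutation argument can be made concrete in a toy model. The sketch below (a simplification, with an invented miniature language) applies one permutation uniformly to the referents of names and the extensions of predicates; every atomic sentence keeps its truth value, so truth-conditions alone cannot fix what the words refer to:

```python
# Toy permutation argument: permuting reference and extensions together
# preserves the truth value of every atomic sentence, even though each
# name now refers to a different object.

domain = {"London", "Paris", "Rome"}
ref = {"london": "London"}                # name -> referent
ext = {"is_large": {"London", "Paris"}}   # predicate -> extension

sigma = {"London": "Paris", "Paris": "Rome", "Rome": "London"}  # a permutation

# Permuted interpretation: shift referents and extensions by sigma.
ref2 = {name: sigma[obj] for name, obj in ref.items()}
ext2 = {pred: {sigma[o] for o in objs} for pred, objs in ext.items()}

def true_in(name, pred, r, e):
    """'pred(name)' is true iff the name's referent is in the extension."""
    return r[name] in e[pred]

# 'is_large(london)' has the same truth value under both interpretations,
# although 'london' now refers to Paris rather than London.
assert true_in("london", "is_large", ref, ext) == \
       true_in("london", "is_large", ref2, ext2)
assert ref2["london"] != ref["london"]
```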

    The Relation between God and the World in the Pre-Critical Kant: Was Kant a Spinozist?

    Andrew Chignell and Omri Boehm have recently argued that Kant’s pre-Critical proof for the existence of God entails a Spinozistic conception of God and hence substance monism. The basis for this reading is the assumption, common in the literature, that God grounds possibilities by exemplifying them. In this article I take issue with this assumption and argue for an alternative Leibnizian reading, according to which possibilities are grounded in essences united in God’s mind (later also described as Platonic ideas intuited by God). I show that this view about the distinction between God’s cognition of essences as the ground of possibility and the actual world is not only explicitly stated by Kant, but is also consistent with his metaphysical picture of teleology in nature and causality during the pre-Critical period. Finally, I suggest that the distinction between the conceptual order of essences embodied in the idea of God and the order of the objects of experience plays a role in the transition into the Critical system, where it is transformed into the distinction between the intelligible and the sensible worlds.

    Security Policy Consistency

    With the advent of wide security platforms able to express simultaneously all the policies comprising an organization's global security policy, the problem of inconsistencies within security policies becomes harder and more relevant. We have defined a tool based on the CHR language which is able to detect several types of inconsistencies within and between security policies and other specifications, namely workflow specifications. Although the problem of security conflicts has been addressed by several authors, to our knowledge none has addressed the general problem of security inconsistencies, in its several definitions and target specifications.
    Comment: To appear in the first CL2000 workshop on Rule-Based Constraint Reasoning and Programming
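
    One kind of inconsistency such a tool must detect is a modality conflict, where one policy permits and another denies the same request. The Python sketch below illustrates the idea only; it is not the paper's CHR implementation, and the rule representation is invented for the example:

```python
# Sketch of modality-conflict detection: a (subject, action, object)
# triple is inconsistent if it is both permitted and denied somewhere
# in the combined policy set.

policies = [
    ("permit", "alice", "approve", "invoice"),
    ("deny",   "alice", "approve", "invoice"),   # conflicts with the above
    ("permit", "bob",   "read",    "report"),
]

def modality_conflicts(rules):
    """Return every triple that is both permitted and denied."""
    permitted = {r[1:] for r in rules if r[0] == "permit"}
    denied    = {r[1:] for r in rules if r[0] == "deny"}
    return permitted & denied

assert modality_conflicts(policies) == {("alice", "approve", "invoice")}
```

In a CHR formulation, the same check is naturally written as a propagation rule that fires whenever matching permit and deny constraints coexist in the constraint store.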

    RDF-TR: Exploiting structural redundancies to boost RDF compression

    The number and volume of semantic data have grown impressively over the last decade, promoting compression as an essential tool for RDF preservation, sharing and management. In contrast to universal compressors, RDF compression techniques are able to detect and exploit specific forms of redundancy in RDF data. Thus, state-of-the-art RDF compressors excel at exploiting syntactic and semantic redundancies, i.e., repetitions in the serialization format and information that can be inferred implicitly. However, little attention has been paid to the existence of structural patterns within the RDF dataset, i.e., structural redundancy. In this paper, we analyze structural regularities in real-world datasets, and show three schema-based sources of redundancies that underpin the schema-relaxed nature of RDF. Then, we propose RDF-Tr (RDF Triples Reorganizer), a preprocessing technique that discovers and removes this kind of redundancy before the RDF dataset is effectively compressed. In particular, RDF-Tr groups subjects that are described by the same predicates, and locally re-codes the objects related to these predicates. Finally, we integrate RDF-Tr with two RDF compressors, HDT and k2-triples. Our experiments show that using RDF-Tr with these compressors improves their effectiveness by up to 2.3 times, outperforming the most prominent state-of-the-art techniques.
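
    The two steps named in the abstract, grouping subjects by their predicate family and locally re-coding objects, can be sketched as follows. This is a simplified illustration of the idea, not the actual RDF-Tr implementation:

```python
# Sketch of the RDF-Tr reorganization: (1) group subjects described by
# the same set of predicates; (2) per (group, predicate), re-code each
# distinct object with a small local integer ID. The resulting regular
# structure is what a downstream compressor (e.g. HDT) can exploit.

from collections import defaultdict

triples = [
    ("s1", "name", "Ana"),  ("s1", "age", "30"),
    ("s2", "name", "Bob"),  ("s2", "age", "25"),
    ("s3", "title", "RDF-TR"),
]

# Step 1: group subjects by their predicate family.
by_subject = defaultdict(dict)
for s, p, o in triples:
    by_subject[s][p] = o

families = defaultdict(list)
for s, props in by_subject.items():
    families[frozenset(props)].append(s)

# Step 2: locally re-code objects per (family, predicate).
recoded = {}
for fam, subjects in families.items():
    for p in fam:
        local_ids = {}
        for s in subjects:
            o = by_subject[s][p]
            local_ids.setdefault(o, len(local_ids))
            recoded[(s, p)] = local_ids[o]

# s1 and s2 share the family {name, age}; s3 stands alone.
assert families[frozenset({"name", "age"})] == ["s1", "s2"]
assert recoded[("s1", "name")] == 0 and recoded[("s2", "name")] == 1
```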

    Changing a semantics: opportunism or courage?

    The generalized models for higher-order logics introduced by Leon Henkin, and their multiple offspring over the years, have become a standard tool in many areas of logic. Even so, discussion has persisted about their technical status, and perhaps even their conceptual legitimacy. This paper gives a systematic view of generalized model techniques, discusses what they mean in mathematical and philosophical terms, and presents a few technical themes and results about their role in algebraic representation, calibrating provability, lowering complexity, understanding fixed-point logics, and achieving set-theoretic absoluteness. We also show how thinking about Henkin's approach to the semantics of logical systems in this generality can yield new results, dispelling the impression of ad-hocness. This paper is dedicated to Leon Henkin, a deep logician who has changed the way we all work, while also being an always open, modest, and encouraging colleague and friend.
    Comment: 27 pages. To appear in: The life and work of Leon Henkin: Essays on his contributions (Studies in Universal Logic) eds: Manzano, M., Sain, I. and Alonso, E., 201
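
    The contrast between standard and generalized (Henkin) semantics can be shown in a toy computation. In the sketch below, a second-order existential sentence is true when set variables range over the full powerset, yet false in a model whose range of sets is restricted (ignoring, for simplicity, the closure conditions a genuine Henkin model must satisfy):

```python
# Toy contrast: evaluate "there exists a set X with X = {0}", i.e.
# EX Ax (X(x) <-> x = 0), over the full powerset (standard semantics)
# versus a restricted family of sets (a generalized-model range).

from itertools import chain, combinations

domain = {0, 1, 2}

def powerset(s):
    """All subsets of s, as frozensets."""
    s = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def holds(set_range):
    """True iff some X in set_range is exactly the singleton {0}."""
    return any(X == frozenset({0}) for X in set_range)

standard = powerset(domain)                 # full powerset: 8 subsets
general  = [frozenset(), frozenset(domain)] # a restricted range of sets

assert holds(standard) is True   # true under standard semantics
assert holds(general) is False   # fails when the singleton is missing
```

The same mechanism is what lets Henkin semantics recover completeness: by widening the class of models, fewer second-order sentences are valid, and the valid ones become axiomatizable.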