
    Computing FO-Rewritings in EL in Practice: from Atomic to Conjunctive Queries

    A prominent approach to implementing ontology-mediated queries (OMQs) is to rewrite the OMQ into a first-order query, which is then executed using a conventional SQL database system. We consider the case where the ontology is formulated in the description logic EL and the actual query is a conjunctive query, and show that rewritings of such OMQs can be computed efficiently in practice, in a sound and complete way. Our approach combines a reduction with a decomposed backwards chaining algorithm for OMQs based on the simpler atomic queries, also illuminating the relationship between first-order rewritings of OMQs based on conjunctive queries and those based on atomic queries. Experiments with real-world ontologies show promising results.
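    To make the backwards-chaining idea concrete, here is a minimal sketch in Python for the simplest case of atomic queries over a toy EL fragment with inclusions A ⊑ B and ∃r.A ⊑ B. The encoding and all names are assumptions for illustration; this is not the paper's implementation.

```python
# Toy backwards chaining for the atomic query B(x) over an EL fragment.
# Axiom encodings (assumed for this sketch):
#   ("sub", "A", "B")          encodes  A ⊑ B
#   ("exists", "r", "A", "B")  encodes  ∃r.A ⊑ B

def rewrite_atomic(query_concept, axioms):
    """Collect one-atom and one-step existential rewritings of B(x)."""
    rewritings = {("concept", query_concept)}        # the query B(x) itself
    frontier = [query_concept]
    while frontier:
        b = frontier.pop()
        for ax in axioms:
            if ax[0] == "sub" and ax[2] == b:        # A ⊑ B: replace B(x) by A(x)
                if ("concept", ax[1]) not in rewritings:
                    rewritings.add(("concept", ax[1]))
                    frontier.append(ax[1])
            elif ax[0] == "exists" and ax[3] == b:   # ∃r.A ⊑ B: B(x) becomes
                rewritings.add(("exists", ax[1], ax[2]))  # ∃y. r(x,y) ∧ A(y)
    # A complete rewriting would also recursively rewrite A(y) inside the
    # existential disjuncts; omitted to keep the sketch short.
    return rewritings

axioms = [("sub", "Professor", "Teacher"),
          ("exists", "teaches", "Course", "Teacher")]
print(rewrite_atomic("Teacher", axioms))
# {('concept', 'Teacher'), ('concept', 'Professor'), ('exists', 'teaches', 'Course')}
```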

    Insulin Glargine in the Intensive Care Unit: A Model-Based Clinical Trial Design

    Introduction: Current successful AGC (Accurate Glycemic Control) protocols require extra clinical effort and are impractical in less acute wards, where patients are still susceptible to stress-induced hyperglycemia. The long-acting insulin Glargine has the potential to be used in a low-effort controller. However, potential variability in its efficacy and length of action prevents direct in-hospital use in an AGC framework for less acute wards. Method: Clinically validated virtual trials, based on data from stable ICU patients in the SPRINT cohort who would be transferred to such an approach, are used to develop a 24-hour AGC protocol robust to different Glargine potencies (1.0x, 1.5x and 2.0x regular insulin) and initial dose sizes (dose = total insulin over the prior 12, 18 or 24 hours). Glycemic control in this period is provided only by varying nutritional inputs. Performance is assessed as %BG in the 4.0-8.0 mmol/L band and safety by %BG < 4.0 mmol/L. Results: The final protocol used a Glargine bolus size equal to the insulin given over the previous 18 hours. Compared to SPRINT, there was a 6.9%-9.5% absolute decrease in mild hypoglycemia (%BG < 4.0 mmol/L) and up to a 6.2% increase in %BG between 4.0 and 8.0 mmol/L. When the efficacy is known (1.5x assumed), there were reductions of 27% in BG measurements, 59% in insulin boluses, 67% in nutrition changes, and 6.3% absolute in mild hypoglycemia. Conclusion: A robust 24-48 hour clinical trial has been designed to safely investigate the efficacy and kinetics of Glargine as a first step towards developing a Glargine-based protocol for less acute wards. Ensuring robustness to variability in Glargine efficacy significantly affects the performance and safety that can be obtained.
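    As a toy illustration of the dosing rule reported above (bolus equal to total insulin over the prior 18 hours), the following snippet computes the bolus from an hourly insulin log. The data and function are invented for illustration and carry no clinical validity.

```python
# Illustrative only: Glargine bolus = total regular insulin over the
# prior 18 hours, per the protocol described above. Invented data; not
# clinical advice and not the study's actual controller.

def glargine_bolus(hourly_insulin_units, window_hours=18):
    """hourly_insulin_units: most-recent-first hourly doses in units (U)."""
    return sum(hourly_insulin_units[:window_hours])

recent_doses = [3, 2, 3, 3, 2, 2, 3, 3, 2, 3, 3, 2, 3, 2, 3, 3, 2, 3, 3, 2]
print(f"Glargine bolus: {glargine_bolus(recent_doses)} U")  # sums the first 18 entries
```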

    Nominal Unification of Higher Order Expressions with Recursive Let

    A sound and complete algorithm for nominal unification of higher-order expressions with a recursive let is described and shown to run in nondeterministic polynomial time. We also explore specializations like nominal letrec-matching for plain expressions and for DAGs, and determine the complexity of the corresponding unification problems. Comment: Pre-proceedings paper presented at the 26th International Symposium on Logic-Based Program Synthesis and Transformation (LOPSTR 2016), Edinburgh, Scotland, UK, 6-8 September 2016 (arXiv:1608.02534).
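    For flavor, here is a tiny Python sketch of the atom-swapping (transposition) operation at the heart of nominal techniques. The term encoding is an assumption, and the paper's letrec algorithm is far more involved than this primitive.

```python
# Atom swapping (a b)·t, the basic operation of nominal unification.
# Assumed encoding: atoms are strings, ("app", f, args) is application,
# ("lam", atom, body) is a binder. Illustrative only.

def swap(a, b, term):
    """Apply the transposition (a b) to every atom in the term."""
    if isinstance(term, str):                       # an atom
        return b if term == a else a if term == b else term
    tag = term[0]
    if tag == "app":
        return ("app", term[1], [swap(a, b, t) for t in term[2]])
    if tag == "lam":                                # swapping goes under binders
        return ("lam", swap(a, b, term[1]), swap(a, b, term[2]))
    return term

# Renaming via swapping: lam x. f x  becomes  lam y. f y
t = ("lam", "x", ("app", "f", ["x"]))
print(swap("x", "y", t))   # ('lam', 'y', ('app', 'f', ['y']))
```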

    New results on rewrite-based satisfiability procedures

    Program analysis and verification require decision procedures to reason about theories of data structures. Many problems can be reduced to the satisfiability of sets of ground literals in a theory T. If a sound and complete inference system for first-order logic is guaranteed to terminate on T-satisfiability problems, any theorem-proving strategy with that system and a fair search plan is a T-satisfiability procedure. We prove termination of a rewrite-based first-order engine on the theories of records, integer offsets, integer offsets modulo, and lists. We give a modularity theorem stating sufficient conditions for termination on a combination of theories, given termination on each. The above theories, as well as others, satisfy these conditions. We introduce several sets of benchmarks on these theories and their combinations, including both parametric synthetic benchmarks to test scalability and real-world problems to test performance on huge sets of literals. We compare the rewrite-based theorem prover E with the validity checkers CVC and CVC Lite. Contrary to the folklore that a general-purpose prover cannot compete with reasoners with built-in theories, the experiments are overall favorable to the theorem prover, showing that the rewriting approach is not only elegant and conceptually simple, but also has important practical implications. Comment: To appear in ACM Transactions on Computational Logic, 49 pages.
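    The following fragment conveys the rewrite-based flavor on the theory of lists: ground terms are normalized with the rules car(cons(x, y)) → x and cdr(cons(x, y)) → y, and (dis)equalities are then checked syntactically. The encoding is an assumed toy; it is not the paper's superposition-based procedure or the E prover.

```python
# Rewrite-based ground reasoning on lists (toy encoding assumed):
#   car(cons(x, y)) -> x        cdr(cons(x, y)) -> y

def normalize(t):
    """Rewrite a ground term to normal form, innermost-first."""
    if isinstance(t, tuple):
        head, args = t[0], [normalize(a) for a in t[1:]]
        if head == "car" and isinstance(args[0], tuple) and args[0][0] == "cons":
            return args[0][1]
        if head == "cdr" and isinstance(args[0], tuple) and args[0][0] == "cons":
            return args[0][2]
        return (head, *args)
    return t

# Is the literal set { car(cons(a, b)) != a } satisfiable?
s = normalize(("car", ("cons", "a", "b")))
t = normalize("a")
# s normalizes to a, so the disequality becomes a != a: contradictory.
print("unsat" if s == t else "satisfiable")
```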

    Quantifier-Free Interpolation of a Theory of Arrays

    The use of interpolants in model checking is becoming an enabling technology for fast and robust verification of hardware and software. The application of encodings based on the theory of arrays, however, is limited by the impossibility of deriving quantifier-free interpolants in general. In this paper, we show that it is possible to obtain quantifier-free interpolants for a Skolemized version of the extensional theory of arrays. We prove this in two ways: (1) non-constructively, using the model-theoretic notion of amalgamation, which is known to be equivalent to quantifier-free interpolation for universal theories; and (2) constructively, by designing an interpolating procedure based on solving equations between array updates. (Interestingly, rewriting techniques are used in the key steps of the solver and its proof of correctness.) To the best of our knowledge, this is the first successful attempt at computing quantifier-free interpolants for a variant of the theory of arrays with extensionality.
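    A key ingredient, solving equations between array reads and updates, rests on the classic read-over-write rules, sketched below in Python under an assumed representation. This is an illustration of the rules, not the paper's interpolating procedure.

```python
# Read-over-write, the standard array rewriting (toy encoding assumed):
#   rd(wr(a, i, v), j) -> v          if i = j
#   rd(wr(a, i, v), j) -> rd(a, j)   if i != j
# Arrays are (base_name, {index: value}); indices here are concrete,
# so later writes simply overwrite earlier ones.

def wr(array, i, v):
    base, updates = array
    new = dict(updates)
    new[i] = v
    return (base, new)

def rd(array, j):
    base, updates = array
    return updates.get(j, ("rd", base, j))   # symbolic read if no update at j

a = ("a", {})
print(rd(wr(wr(a, 1, "x"), 2, "y"), 1))     # 'x'  (read-over-write)
print(rd(a, 3))                             # ('rd', 'a', 3): unresolved read
```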

    A Formalization of the Theorem of Existence of First-Order Most General Unifiers

    This work presents a formalization of the theorem of existence of most general unifiers in first-order signatures in the higher-order proof assistant PVS. The distinguishing feature of this formalization is that it remains close to the textbook proofs, which are based on proving the correctness of the well-known Robinson first-order unification algorithm. The formalization was applied inside a PVS development for term rewriting systems that provides a complete formalization of the Knuth-Bendix Critical Pair theorem, among other relevant theorems of the theory of rewriting. In addition, the formalization methodology has proven to be of practical use for verifying the correctness of unification algorithms in the style of Robinson's original algorithm. Comment: In Proceedings LSFA 2011, arXiv:1203.542
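    Since the formalization follows the textbook correctness proof of Robinson's algorithm, a compact executable rendering of that algorithm may help. Below is a standard Python sketch; the conventions (lowercase strings are variables, tuples are applications) are assumptions for this sketch, not PVS code.

```python
# Textbook Robinson-style first-order unification with occurs check.
# Convention (assumed): lowercase strings are variables; tuples are
# applications ("f", arg1, ..., argN).

def walk(t, subst):
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    t = walk(t, subst)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, subst) for a in t[1:])

def bind(v, t, subst):
    if occurs(v, t, subst):
        return None                      # occurs check fails: no unifier
    out = dict(subst)
    out[v] = t
    return out

def unify(s, t, subst=None):
    subst = {} if subst is None else subst
    s, t = walk(s, subst), walk(t, subst)
    if s == t:
        return subst
    if isinstance(s, str) and s.islower():
        return bind(s, t, subst)
    if isinstance(t, str) and t.islower():
        return bind(t, s, subst)
    if isinstance(s, tuple) and isinstance(t, tuple) \
            and s[0] == t[0] and len(s) == len(t):
        for a, b in zip(s[1:], t[1:]):   # decompose argument by argument
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None                          # symbol clash

# mgu of f(x, g(y)) and f(g(z), g(z)) is {x: g(z), y: z}
print(unify(("f", "x", ("g", "y")), ("f", ("g", "z"), ("g", "z"))))
```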

    Loops under Strategies ... Continued

    While there are many approaches for automatically proving termination of term rewrite systems, up to now only a few techniques exist to disprove their termination automatically. Almost all of these techniques try to find loops, where the existence of a loop implies non-termination of the rewrite system. However, most programming languages use specific evaluation strategies, whereas loop detection techniques usually do not take strategies into account. So even if a rewrite system has a loop, it may still be terminating under certain strategies. Therefore, our goal is to develop decision procedures which can determine whether a given loop is also a loop under the respective evaluation strategy. In earlier work, such procedures were presented for the strategies of innermost, outermost, and context-sensitive evaluation. In the current paper, we build upon this work and develop such decision procedures for important strategies like leftmost-innermost, leftmost-outermost, (max-)parallel-innermost, (max-)parallel-outermost, and forbidden patterns (which generalize innermost, outermost, and context-sensitive strategies). In this way, we obtain the first approach to disprove termination under these strategies automatically. Comment: In Proceedings IWS 2010, arXiv:1012.533
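    To see why loops are strategy-sensitive, consider the rewrite system { inf → cons(0, inf), head(cons(x, y)) → x }: the term head(inf) loops under innermost evaluation but terminates under outermost evaluation. A small Python simulation, with a toy term encoding assumed for illustration:

```python
# Rules: inf -> cons(0, inf)   and   head(cons(x, y)) -> x.
# Terms are strings, ints, or tuples ("f", arg, ...). Illustrative only.

def rewrite_once(t, strategy):
    def root_step(t):
        if t == "inf":
            return ("cons", 0, "inf")
        if isinstance(t, tuple) and t[0] == "head" and \
           isinstance(t[1], tuple) and t[1][0] == "cons":
            return t[1][1]
        return None

    if strategy == "outermost":              # try the root first
        r = root_step(t)
        if r is not None:
            return r
    if isinstance(t, tuple):                 # then left-to-right in subterms
        for i, sub in enumerate(t[1:], 1):
            r = rewrite_once(sub, strategy)
            if r is not None:
                return t[:i] + (r,) + t[i + 1:]
    if strategy == "innermost":              # root only once subterms are normal
        return root_step(t)
    return None

for strategy in ("outermost", "innermost"):
    t, steps = ("head", "inf"), 0
    while steps < 10:
        nxt = rewrite_once(t, strategy)
        if nxt is None:
            break
        t, steps = nxt, steps + 1
    print(strategy, "=>", t if steps < 10 else "still rewriting (looping)")
```

Outermost reaches the normal form 0 in two steps; innermost keeps unfolding inf forever, exactly the strategy-dependence the paper's decision procedures address.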

    Building and Combining Matching Algorithms

    The concept of matching is ubiquitous in declarative programming and in automated reasoning. For instance, it is a key mechanism for running rule-based programs and for simplifying clauses generated by theorem provers. A matching problem can be seen as a particular conjunction of equations in which each equation has a ground side. We give an overview of techniques that can be applied to build and combine matching algorithms. First, we survey mutation-based techniques as a way to build a generic matching algorithm for a large class of equational theories. Second, combination techniques are introduced to obtain combined matching algorithms for disjoint unions of theories. Then we show how these combination algorithms can be extended to handle non-disjoint unions of theories sharing only constructors. These extensions are possible if an appropriate notion of normal form is computable.
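    To ground the definition, here is a plain syntactic matcher in Python that solves a conjunction of equations whose right-hand sides are ground, by decomposing equations and merging variable bindings. The encoding and names are assumptions for illustration; matching modulo an equational theory needs the mutation and combination machinery the survey describes.

```python
# Syntactic matching: solve equations s =? t where every t is ground.
# Convention (assumed): lowercase strings are variables; uppercase
# strings are constants; tuples are applications ("f", arg, ...).

def match(eqs):
    subst = {}
    eqs = list(eqs)
    while eqs:
        s, t = eqs.pop()
        if isinstance(s, str) and s.islower():              # variable
            if s in subst and subst[s] != t:
                return None                                 # merge clash
            subst[s] = t
        elif isinstance(s, tuple) and isinstance(t, tuple) \
                and s[0] == t[0] and len(s) == len(t):
            eqs.extend(zip(s[1:], t[1:]))                   # decompose
        elif s != t:
            return None                                     # symbol clash
    return subst

# f(x, g(y)) matches the ground term f(A, g(B)) with {x: A, y: B}
print(match([(("f", "x", ("g", "y")), ("f", "A", ("g", "B")))]))
```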

    A Survey of Volunteered Open Geo-Knowledge Bases in the Semantic Web

    Over the past decade, rapid advances in web technologies, coupled with innovative models of spatial data collection and consumption, have generated a robust growth in geo-referenced information, resulting in spatial information overload. Increasing 'geographic intelligence' in traditional text-based information retrieval has become a prominent approach to responding to this issue and fulfilling users' spatial information needs. Numerous efforts in the Semantic Geospatial Web, Volunteered Geographic Information (VGI), and the Linking Open Data initiative have converged in a constellation of open knowledge bases, freely available online. In this article, we survey these open knowledge bases, focusing on their geospatial dimension. Particular attention is devoted to the crucial issue of the quality of geo-knowledge bases, as well as of crowdsourced data. A new knowledge base, the OpenStreetMap Semantic Network, is outlined as our contribution to this area. Research directions in information integration and Geographic Information Retrieval (GIR) are then reviewed, with a critical discussion of their current limitations and future prospects.