
    Relative Riemann-Zariski spaces

    In this paper we study relative Riemann-Zariski spaces attached to a morphism of schemes, generalizing the classical Riemann-Zariski space of a field. We prove that, similarly to the classical RZ spaces, the relative ones can be described either as projective limits of schemes in the category of locally ringed spaces or as certain spaces of valuations. We apply these spaces to prove the following two new results: a strong version of the stable modification theorem for relative curves, and a decomposition theorem asserting that any separated morphism between quasi-compact and quasi-separated schemes factors as a composition of an affine morphism and a proper morphism. (In particular, we obtain a new proof of Nagata's compactification theorem.) Comment: 30 pages, the final version, to appear in Israel J. of Math.
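
    For orientation, a brief sketch of the classical picture that the relative spaces generalize (standard material, stated in illustrative notation rather than the paper's own): for a field extension K/k, the Riemann-Zariski space admits exactly the two descriptions mentioned above, as a projective limit of models and as a space of valuation rings.

```latex
% Classical Riemann-Zariski space of a field extension K/k (illustrative
% notation, not taken from the paper):
\[
  \operatorname{RZ}(K/k)\;=\;\varprojlim_{X}\,X
  \;\cong\;\{\,\mathcal{O}\subseteq K \ \text{valuation ring with}\ k\subseteq\mathcal{O}\,\},
\]
% where the limit runs over the projective models X of K over k and is taken
% in the category of locally ringed spaces.
```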

    Building precompiled knowledge in ODeLP

    Argumentation systems have evolved substantially in the past few years, resulting in adequate tools for modeling some forms of common-sense reasoning. This has given rise to a new set of argument-based applications in diverse areas. In previous work, we defined how to use precompiled knowledge to obtain significant speed-ups in the inference process of an argument-based system. This development is based on a logic programming system with an argumentation-driven inference engine, called Observation Based Defeasible Logic Programming (ODeLP). In this setting, the concept of dialectical databases was first presented: data structures for storing precompiled knowledge. These structures provide precompiled information about inferences and can be used to speed up the inference process, much as truth maintenance systems (TMS) do in general problem solvers. In this work, we present detailed algorithms for the creation of dialectical databases in ODeLP and analyze these algorithms in terms of their computational complexity. Red de Universidades con Carreras en Informática (RedUNCI).
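
    To make the idea of precompiled dialectical knowledge concrete, here is a minimal, hypothetical sketch (the class name, rule encoding, and attack criterion are invented for illustration and are not ODeLP's actual algorithms): potential arguments and the attacks between them are computed once, so that query-time analysis reduces to lookups over the cached structure.

```python
from collections import defaultdict

class DialecticalDatabase:
    """Hypothetical sketch of a dialectical database: precompiled potential
    arguments and the attack relation between them, consulted at query time
    much as a TMS consults cached justifications in a problem solver."""

    def __init__(self, rules):
        # rules: list of defeasible rules (conclusion, premises), e.g.
        # ("flies", ("bird",)) for "flies is tentatively supported by bird".
        self.rules = rules
        self.arguments = []                # (conclusion, frozenset of rule ids)
        self.attackers = defaultdict(set)  # argument id -> ids attacking it
        self._precompile()

    def _precompile(self):
        # One-step potential arguments only: a deliberately crude stand-in
        # for real argument construction.
        for i, (concl, _prem) in enumerate(self.rules):
            self.arguments.append((concl, frozenset([i])))
        # Precompile attacks: an argument for "~p" attacks one for "p".
        for i, (c1, _) in enumerate(self.arguments):
            for j, (c2, _) in enumerate(self.arguments):
                if c1 == "~" + c2 or c2 == "~" + c1:
                    self.attackers[j].add(i)

    def supported(self, literal, observations):
        """Is there an argument for `literal` whose premises are observed and
        whose precompiled attackers all miss at least one premise?"""
        for i, (concl, used) in enumerate(self.arguments):
            if concl != literal:
                continue
            premises_hold = all(p in observations
                                for k in used for p in self.rules[k][1])
            if not premises_hold:
                continue
            attacker_live = any(
                all(p in observations
                    for k in self.arguments[a][1] for p in self.rules[k][1])
                for a in self.attackers[i])
            if not attacker_live:
                return True
        return False

# Illustrative use with invented predicates:
db = DialecticalDatabase([("flies", ("bird",)), ("~flies", ("penguin",))])
print(db.supported("flies", {"bird"}))             # True
print(db.supported("flies", {"bird", "penguin"}))  # False: the attacker is live
```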

    The ghosts of forgotten things: A study on size after forgetting

    Forgetting is removing variables from a logical formula while preserving the constraints on the remaining variables. In spite of being a form of reduction, it does not always decrease the size of the formula and may sometimes increase it. This article discusses the implications of such an increase and analyzes the computational properties of the phenomenon. Given a propositional Horn formula, a set of variables and a maximum allowed size, deciding whether forgetting the variables from the formula can be expressed within that size is $D^p$-hard and in $\Sigma^p_2$. The same problem for unrestricted propositional formulae is $D^p_2$-hard and in $\Sigma^p_3$. The hardness results employ superredundancy: a superirredundant clause is present in all formulae of minimal size that are equivalent to a given one. This concept may be useful beyond forgetting.
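
    As a concrete reading of forgetting (the standard one, stated here for illustration rather than taken from the article): forgetting a variable x from F yields a formula equivalent to F[x/true] ∨ F[x/false]; on clausal formulae this is one Davis-Putnam elimination step, which replaces the clauses mentioning x by their resolvents on x, and that replacement can make the formula larger. A minimal sketch:

```python
from itertools import product

def forget(clauses, x):
    """Forget variable x from a CNF formula given as a set of clauses,
    each clause a frozenset of integer literals (-v means "not v").

    Forgetting is Boolean existential quantification: the result is
    equivalent to F[x=true] or F[x=false]. In clausal form this is one
    Davis-Putnam elimination step: keep the clauses not mentioning x and
    add every resolvent of a clause containing x with one containing -x.
    """
    pos = [c for c in clauses if x in c]
    neg = [c for c in clauses if -x in c]
    rest = {c for c in clauses if x not in c and -x not in c}
    for cp, cn in product(pos, neg):
        resolvent = (cp - {x}) | (cn - {-x})
        # Drop tautological resolvents (a literal together with its negation).
        if not any(-lit in resolvent for lit in resolvent):
            rest.add(frozenset(resolvent))
    return rest

# Horn example (1 = x, 2..6 = a..e): a->x, b->x, c->x, x->d, x->e.
F = {frozenset([-2, 1]), frozenset([-3, 1]), frozenset([-4, 1]),
     frozenset([-1, 5]), frozenset([-1, 6])}
G = forget(F, 1)
print(len(F), len(G))  # 5 6 -- forgetting x makes this Horn formula bigger
```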

    On Relativized Minimality, memory and cue-based parsing


    Inconsistency-tolerant Query Answering in Ontology-based Data Access

    Ontology-based data access (OBDA) is receiving great attention as a new paradigm for managing information systems through semantic technologies. According to this paradigm, a Description Logic ontology provides an abstract and formal representation of the domain of interest to the information system, and is used as a sophisticated schema for accessing the data and formulating queries over them. In this paper, we address the problem of dealing with inconsistencies in OBDA. Our general goal is both to study DL semantic frameworks that are inconsistency-tolerant, and to devise techniques for answering unions of conjunctive queries under such inconsistency-tolerant semantics. Our work is inspired by approaches to consistent query answering in databases, which are based on the idea of living with inconsistencies in the database but trying to obtain only consistent information during query answering, by relying on the notion of database repair. We first adapt the notion of database repair to our context and show that, under such a notion, inconsistency-tolerant query answering is intractable, even for very simple DLs. We therefore propose a different repair-based semantics, with the goal of reaching a good compromise between the expressive power of the semantics and the computational complexity of inconsistency-tolerant query answering. Indeed, we show that query answering under the new semantics is first-order rewritable in OBDA, even if the ontology is expressed in one of the most expressive members of the DL-Lite family.
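
    The repair-based idea can be made concrete with a brute-force sketch (illustrative only; the predicate names and the two query-answering variants below are assumptions, and the paper's first-order rewriting technique is not reproduced here): a repair is a maximal subset of the data that is consistent with the ontology, a query is certainly entailed if it holds in every repair, and evaluating it over the intersection of all repairs gives a cheaper, sound approximation.

```python
from itertools import combinations

def repairs(facts, consistent):
    """All maximal subsets of `facts` satisfying the `consistent` predicate.

    This is the database-repair notion the abstract adapts: keep as much of
    the (possibly inconsistent) data as possible while restoring consistency.
    """
    facts = list(facts)
    maximal = []
    # Brute force, largest subsets first; fine for toy examples only.
    for k in range(len(facts), -1, -1):
        for subset in combinations(facts, k):
            s = set(subset)
            if consistent(s) and not any(s < m for m in maximal):
                maximal.append(s)
    return maximal

def certain_under_all_repairs(facts, consistent, query):
    """A Boolean query is entailed iff it holds in every repair."""
    return all(query(r) for r in repairs(facts, consistent))

def certain_under_intersection(facts, consistent, query):
    """Sound approximation: evaluate the query over the intersection of repairs."""
    rs = repairs(facts, consistent)
    core = set.intersection(*rs) if rs else set()
    return query(core)

# Toy example with invented predicates: hasHead is functional, and the data
# gives dept1 two heads, so the data is inconsistent with the ontology.
facts = {("hasHead", "dept1", "alice"), ("hasHead", "dept1", "bob"),
         ("worksIn", "alice", "dept1")}

def consistent(s):
    heads = [o for (p, d, o) in s if p == "hasHead" and d == "dept1"]
    return len(heads) <= 1

query = lambda s: ("worksIn", "alice", "dept1") in s
print(certain_under_all_repairs(facts, consistent, query))   # True
print(certain_under_intersection(facts, consistent, query))  # True
```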
