
    Composition and Inversion of Schema Mappings

    In recent years, a lot of attention has been paid to developing solid foundations for the composition and inversion of schema mappings. In this paper, we review the proposals for the semantics of these crucial operators. For each proposal, we concentrate on the following three problems: the definition of the semantics of the operator, the language needed to express the operator, and the algorithmic issues associated with computing the operator. It should be pointed out that we primarily consider the formalization of schema mappings introduced in the work on data exchange. In particular, when studying the problem of computing the composition and inverse of a schema mapping, we are mostly interested in computing these operators for mappings specified by source-to-target tuple-generating dependencies.
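    The composition semantics the abstract refers to can be illustrated extensionally. As a toy sketch (not the paper's method, which works on st-tgd specifications rather than enumerated instances), treat a mapping as a set of (source-instance, target-instance) pairs; the composition relates I to K whenever some intermediate instance J links them:

```python
def compose(m12, m23):
    """Compose two schema mappings given extensionally as sets of
    (source-instance, target-instance) pairs.  Under the standard
    data-exchange semantics, (I, K) belongs to the composition iff
    there is an intermediate instance J with (I, J) in m12 and
    (J, K) in m23."""
    return {(i, k) for (i, j) in m12 for (j2, k) in m23 if j == j2}
```

    In practice instances cannot be enumerated, which is exactly why the language question, i.e. which class of dependencies can express the composition of two st-tgd mappings, is central in this line of work.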

    An Introduction to Quantum Error Correction and Fault-Tolerant Quantum Computation

    Quantum states are very delicate, so it is likely that some sort of quantum error correction will be necessary to build reliable quantum computers. The theory of quantum error-correcting codes has some close ties to and some striking differences from the theory of classical error-correcting codes. Many quantum codes can be described in terms of the stabilizer of the codewords. The stabilizer is a finite Abelian group, and allows a straightforward characterization of the error-correcting properties of the code. The stabilizer formalism for quantum codes also illustrates the relationships to classical coding theory, particularly classical codes over GF(4), the finite field with four elements. To build a quantum computer which behaves correctly in the presence of errors, we also need a theory of fault-tolerant quantum computation, instructing us how to perform quantum gates on qubits which are encoded in a quantum error-correcting code. The threshold theorem states that it is possible to create a quantum computer to perform an arbitrary quantum computation provided the error rate per physical gate or time step is below some constant threshold value.
    Comment: 46 pages, with large margins. Includes quant-ph/0004072 plus 30 pages of new material, mostly on fault-tolerance
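    For intuition about the stabilizer formalism, here is a minimal sketch of the three-qubit bit-flip code. Because bit-flip (X-type) errors are diagnosed by the Z-type stabilizer generators Z1Z2 and Z2Z3, the syndrome measurement can be simulated on classical bit strings; the helper names below are illustrative, not from the text:

```python
def syndrome(bits):
    """Measure the stabilizer generators Z1Z2 and Z2Z3 of the
    three-qubit bit-flip code.  On a classical bit string this is
    just the parity of each adjacent pair of bits."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Decode one round: each single bit-flip error produces a
    unique syndrome, which identifies the qubit to flip back."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    out = list(bits)
    if flip is not None:
        out[flip] ^= 1
    return tuple(out)
```

    The same pattern, measuring commuting stabilizer generators and inferring the error from the syndrome, carries over to general stabilizer codes, where the generators are tensor products of Pauli operators rather than simple parities.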

    Answer Sets for Consistent Query Answering in Inconsistent Databases

    A relational database is inconsistent if it does not satisfy a given set of integrity constraints. Nevertheless, it is likely that most of the data in it is consistent with the constraints. In this paper we apply logic programming based on answer sets to the problem of retrieving consistent information from a possibly inconsistent database. Since consistent information persists from the original database to each of its minimal repairs, the approach is based on a specification of database repairs using disjunctive logic programs with exceptions, whose answer set semantics can be represented and computed by systems that implement stable model semantics. These programs allow us to declare persistence by defaults and repairing changes by exceptions. We concentrate mainly on logic programs for binary integrity constraints, among which we find most of the integrity constraints that occur in practice.
    Comment: 34 pages
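    The paper's machinery is disjunctive logic programs under answer-set semantics; as a hedged illustration of the underlying repair semantics alone, the following brute-force sketch enumerates the minimal repairs of a relation violating a primary-key constraint and returns the answers that hold in every repair (the relation name and key layout are invented for the example):

```python
from itertools import product

def repairs(facts, key):
    """Minimal repairs under a primary-key constraint on the first
    `key` attributes: keep exactly one tuple from each key group."""
    groups = {}
    for t in facts:
        groups.setdefault(t[:key], []).append(t)
    return [frozenset(choice) for choice in product(*groups.values())]

def consistent_answers(facts, key, query):
    """A tuple is a consistent answer iff it satisfies the query in
    every minimal repair of the database."""
    answer_sets = [{t for t in r if query(t)} for r in repairs(facts, key)]
    return set.intersection(*answer_sets)
```

    Enumeration is exponential in general; the point of the logic-programming encoding in the paper is that answer-set solvers compute these repairs implicitly.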

    MDL Denoising Revisited

    We refine and extend an earlier MDL criterion for wavelet-based denoising. We start by showing that the denoising problem can be reformulated as a clustering problem, where the goal is to obtain separate clusters for informative and non-informative wavelet coefficients, respectively. This suggests two refinements: adding a code-length for the model index, and extending the model to account for subband-dependent coefficient distributions. A third refinement is the derivation of a soft thresholding rule inspired by predictive universal coding with weighted mixtures. We propose a practical method incorporating all three refinements, which is shown to achieve good performance and robustness in denoising both artificial and natural signals.
    Comment: Submitted to IEEE Transactions on Information Theory, June 200
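    The third refinement builds on the standard soft-thresholding (shrinkage) operation; a minimal sketch of that operation follows (the paper's actual threshold is derived from the MDL mixture code and is not reproduced here):

```python
def soft_threshold(coeffs, t):
    """Standard soft thresholding: shrink each wavelet coefficient
    toward zero by the threshold t, so coefficients with magnitude
    below t (the 'non-informative' cluster) become exactly zero."""
    return [max(abs(c) - t, 0.0) * (1.0 if c >= 0 else -1.0)
            for c in coeffs]
```

    Applied to the wavelet coefficients of a noisy signal, this keeps large (informative) coefficients, merely shrunk by t, while zeroing the small ones attributed to noise.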

    Critical Behavior and Fractality in Shallow One-Dimensional Quasiperiodic Potentials

    Quasiperiodic systems offer an appealing intermediate between long-range ordered and genuinely disordered systems, with unusual critical properties. One-dimensional models that break the so-called self-dual symmetry usually display a mobility edge, much as truly disordered systems do in dimension strictly higher than two. Here, we determine the critical localization properties of single particles in shallow, one-dimensional, quasiperiodic models and relate them to the fractal character of the energy spectrum. On the one hand, we determine the mobility edge and show that it separates the localized and extended phases, with no intermediate phase. On the other hand, we determine the critical potential amplitude and find the universal critical exponent ν ≃ 1/3. We also study the spectral Hausdorff dimension and show that it is nonuniversal but always smaller than unity, hence showing that the spectrum is nowhere dense. Finally, applications to ongoing studies of Anderson localization, Bose-glass physics, and many-body localization in ultracold atoms are discussed.

    A universal ontology-based approach to data integration

    One of the main problems in building data integration systems is that of semantic integration. It has been acknowledged that the problem would not exist if all systems were developed using the same global schema, but so far, such a global schema has been considered unfeasible in practice. However, in our previous work, we have argued that given the current state of the art, a global schema may be feasible now, and we have put forward a vision of a Universal Ontology (UO) that may be desirable, feasible, and viable. One of the reasons why the UO may be desirable is that it might solve the semantic integration problem. The objective of this paper is to show that the UO could indeed solve, or at least greatly alleviate, the semantic integration problem. We do so by presenting an approach to semantic integration based on the UO that requires much less effort than other approaches.
    Peer Reviewed. Postprint (published version)