
    Metrics for Measuring Data Quality - Foundations for an Economic Oriented Management of Data Quality

    The article develops metrics for an economically oriented management of data quality, focusing on two data quality dimensions: consistency and timeliness. For deriving adequate metrics, several requirements are stated (e.g. normalisation, cardinality, adaptivity, interpretability). The authors then discuss existing approaches for measuring data quality and illustrate their weaknesses. Based on these considerations, new metrics are developed for the dimensions consistency and timeliness. These metrics are applied in practice, and the results are illustrated in the case of a major German mobile services provider.
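
    A minimal, hypothetical sketch of what normalised metrics of this kind could look like: consistency as the share of records satisfying a set of business rules, and timeliness as an exponential-decay estimate of the probability that a stored value is still up to date. The record fields, the rule, and the decline rate below are illustrative assumptions, not the authors' actual definitions.

```python
# Hypothetical sketch of two data-quality metrics normalised to [0, 1].
# Field names, the business rule, and the decline rate are assumptions
# made for illustration only.
import math

records = [
    {"age_years": 0.5, "tariff": "prepaid", "monthly_fee": 0.0},
    {"age_years": 3.0, "tariff": "contract", "monthly_fee": 0.0},  # violates the rule below
]

# Consistency: fraction of records that satisfy every business rule.
rules = [lambda r: r["tariff"] != "contract" or r["monthly_fee"] > 0.0]

def consistency(records, rules):
    ok = sum(all(rule(r) for rule in rules) for r in records)
    return ok / len(records)

# Timeliness: assumed exponential decay of a value's validity with age,
# governed by an estimated decline rate per year.
def timeliness(age_years, decline_rate=0.2):
    return math.exp(-decline_rate * age_years)

print("consistency:", consistency(records, rules))
print("timeliness:", [round(timeliness(r["age_years"]), 3) for r in records])
```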

    The New Quantum Logic

    It is shown how all the major conceptual difficulties of standard (textbook) quantum mechanics, including the two measurement problems and the (supposed) nonlocality that conflicts with special relativity, are resolved in the consistent or decoherent histories interpretation of quantum mechanics by using a modified form of quantum logic to discuss quantum properties (subspaces of the quantum Hilbert space) and by treating quantum time development as a stochastic process. The histories approach in turn gives rise to some conceptual difficulties of its own, in particular the correct choice of a framework (probabilistic sample space) or family of histories, and these are discussed. The central issue is that the principle of unicity, the idea that there is a unique single true description of the world, is incompatible with our current understanding of quantum mechanics.
    Comments: Minor changes and corrections to bring into conformity with published version

    A Consistent Quantum Ontology

    The (consistent or decoherent) histories interpretation provides a consistent realistic ontology for quantum mechanics, based on two main ideas. First, a logic (system of reasoning) is employed which is compatible with the Hilbert-space structure of quantum mechanics as understood by von Neumann: quantum properties and their negations correspond to subspaces and their orthogonal complements. It employs a special (single-framework) syntactical rule to construct meaningful quantum expressions, quite different from the quantum logic of Birkhoff and von Neumann. Second, quantum time development is treated as an inherently stochastic process under all circumstances, not just when measurements take place. The time-dependent Schrödinger equation provides probabilities, not a deterministic time development of the world. The resulting interpretive framework has no measurement problem and can be used to analyze in quantum terms what is going on before, after, and during physical preparation and measurement processes. In particular, appropriate measurements can reveal quantum properties possessed by the measured system before the measurement took place. There are no mysterious superluminal influences: quantum systems satisfy an appropriate form of Einstein locality. This ontology provides a satisfactory foundation for quantum information theory, since it supplies definite answers as to what the information is about. The formalism of classical (Shannon) information theory applies without change in suitable quantum contexts, and this suggests the way in which quantum information theory extends beyond its classical counterpart.
    Comments: Very minor revisions to previous version

    Decoherence, the measurement problem, and interpretations of quantum mechanics

    Environment-induced decoherence and superselection have been a subject of intensive research over the past two decades, yet their implications for the foundational problems of quantum mechanics, most notably the quantum measurement problem, have remained a matter of great controversy. This paper is intended to clarify key features of the decoherence program, including its more recent results, and to investigate their application and consequences in the context of the main interpretive approaches to quantum mechanics.
    Comments: 41 pages. Final published version
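
    As a purely illustrative sketch (not taken from the paper), a simple dephasing model shows the mechanism the decoherence program builds on: coupling to an environment damps the off-diagonal elements of a system's density matrix in the pointer basis, so a coherent superposition becomes operationally indistinguishable from a classical mixture. The decay rate below is an arbitrary assumption.

```python
# Illustrative dephasing model: off-diagonal terms of a qubit density matrix
# decay exponentially, leaving an (apparently) classical mixture.
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # |+> = (|0> + |1>)/sqrt(2)
rho0 = np.outer(plus, plus.conj())         # pure-state density matrix

def decohere(rho, t, gamma=1.0):
    """Damp off-diagonal elements with an assumed dephasing rate gamma."""
    out = rho.copy()
    damp = np.exp(-gamma * t)
    out[0, 1] *= damp
    out[1, 0] *= damp
    return out

for t in (0.0, 1.0, 5.0):
    print(f"t = {t}:\n{np.round(decohere(rho0, t), 4)}")
```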

    Enhancing Decision Tree based Interpretation of Deep Neural Networks through L1-Orthogonal Regularization

    One obstacle that has so far prevented the introduction of machine learning models, primarily in critical areas, is the lack of explainability. In this work, a practicable approach to gaining explainability of deep artificial neural networks (NNs) using an interpretable surrogate model based on decision trees is presented. Simply fitting a decision tree to a trained NN usually leads to unsatisfactory results in terms of accuracy and fidelity. Using L1-orthogonal regularization during training, however, preserves the accuracy of the NN while allowing it to be closely approximated by small decision trees. Tests with different data sets confirm that L1-orthogonal regularization yields models of lower complexity and, at the same time, higher fidelity compared to other regularizers.
    Comments: 8 pages, 18th IEEE International Conference on Machine Learning and Applications (ICMLA) 2019
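
    A minimal sketch of the general idea, under assumptions: the regularizer below adds an L1-type soft orthogonality penalty on the weight matrices to the training loss (the paper's exact formulation may differ), and a shallow decision tree is then fitted to the network's predictions, with fidelity measured as the tree's agreement with the network.

```python
# Sketch (assumed formulation): train a small MLP with a soft orthogonality
# penalty ||W W^T - I||_1 on each weight matrix, then fit a shallow decision
# tree as a surrogate and report its fidelity to the network.
import torch
import torch.nn as nn
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_t = torch.tensor(X, dtype=torch.float32)
y_t = torch.tensor(y, dtype=torch.long)

model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))

def l1_orthogonal_penalty(model):
    """Sum of ||W W^T - I||_1 over all Linear layers (soft orthogonality)."""
    penalty = 0.0
    for m in model:
        if isinstance(m, nn.Linear):
            gram = m.weight @ m.weight.t()
            penalty = penalty + (gram - torch.eye(gram.shape[0])).abs().sum()
    return penalty

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X_t), y_t) + 1e-3 * l1_orthogonal_penalty(model)
    loss.backward()
    opt.step()

# Surrogate: a small decision tree fitted to the NN's predicted labels.
with torch.no_grad():
    nn_pred = model(X_t).argmax(dim=1).numpy()
tree = DecisionTreeClassifier(max_depth=4).fit(X, nn_pred)
print("fidelity (agreement with NN):", accuracy_score(nn_pred, tree.predict(X)))
print("accuracy (agreement with labels):", accuracy_score(y, tree.predict(X)))
```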

    EPR, Bell, and Quantum Locality

    Maudlin has claimed that no local theory can reproduce the predictions of standard quantum mechanics that violate Bell's inequality for Bohm's version (two spin-half particles in a singlet state) of the Einstein-Podolsky-Rosen problem. It is argued that, on the contrary, standard quantum mechanics itself is a counterexample to Maudlin's claim, because it is local in the appropriate sense (measurements at one place do not influence what occurs elsewhere) when formulated using consistent principles in place of the inconsistent appeals to "measurement" found in current textbooks. This argument sheds light on the claim of Blaylock that counterfactual definiteness is an essential ingredient in derivations of Bell's inequality.
    Comments: Minor revisions to previous version