
    Quantitative Redundancy in Partial Implications

    We survey the different properties of an intuitive notion of redundancy, as a function of the precise semantics given to the notion of partial implication. The final version of this survey will appear in the Proceedings of the Int. Conf. Formal Concept Analysis, 2015.
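
    For reference, the most common semantics studied in this line of work is confidence: a partial implication A -> B holds in a dataset at threshold gamma when supp(A ∪ B) / supp(A) >= gamma. Below is a minimal sketch of that definition; the helper names and toy transactions are illustrative, not taken from the survey.

        # Confidence semantics of a partial implication A -> B over transactions:
        # conf(A -> B) = supp(A ∪ B) / supp(A).
        # Toy data and names are illustrative only.

        def support(itemset, transactions):
            """Fraction of transactions containing every item of `itemset`."""
            return sum(1 for t in transactions if itemset <= t) / len(transactions)

        def confidence(antecedent, consequent, transactions):
            """Confidence of the partial implication antecedent -> consequent."""
            return (support(antecedent | consequent, transactions)
                    / support(antecedent, transactions))

        transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
        print(confidence({"a"}, {"b"}, transactions))  # 2/3: holds at any gamma <= 0.66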

    Construct redundancy in process modelling grammars: Improving the explanatory power of ontological analysis

    Conceptual modelling supports developers and users of information systems in areas such as documentation, analysis and system redesign. The ongoing interest in the modelling of business processes has led to a variety of different grammars, raising the question of the quality of these grammars for modelling. An established way of evaluating the quality of a modelling grammar is by means of an ontological analysis, which can determine the extent to which grammars contain construct deficit, overload, excess or redundancy. While several studies have shown the relevance of most of these criteria, predictions about construct redundancy have yielded inconsistent results in the past, with some studies suggesting that redundancy may even be beneficial for modelling in practice. In this paper we seek to clarify the concept of construct redundancy by introducing a revision to the ontological analysis method. Based on the concept of inheritance, we propose an approach that distinguishes between specialized and distinct construct redundancy. We demonstrate the potential explanatory power of the revised method by reviewing and clarifying previous results found in the literature.

    Relative Entailment Among Probabilistic Implications

    We study a natural variant of the implicational fragment of propositional logic. Its formulas are pairs of conjunctions of positive literals, related by an implication-like connective; the semantics of this sort of implication is defined in terms of a threshold on the conditional probability of the consequent given the antecedent: we are dealing with what the data analysis community calls the confidence of partial implications, or association rules. Existing studies of redundancy among these partial implications have so far characterized only entailment from one premise and entailment from two premises, both in the stand-alone case and in the presence of additional classical implications (this is what we call "relative entailment"). By exploiting a previously noted alternative view of the entailment in terms of linear programming duality, we characterize exactly the cases of entailment from arbitrary numbers of premises, again both in the stand-alone case and in the presence of additional classical implications. As a result, we obtain decision algorithms of better complexity; additionally, for each potential case of entailment, we identify a critical confidence threshold and show that it is, in fact, intrinsic to each set of premises and the antecedent of the conclusion.
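
    As a concrete illustration of why such critical thresholds arise, the toy model below (illustrative data and helper function, not the paper's LP-based method) shows that two premises each holding at confidence 0.9 need not entail their transitive composition at 0.9: the composed rule only reaches 0.9^2 = 0.81.

        # A weighted model witnessing that {x}->{y} and {x,y}->{z} do not
        # entail {x}->{z} at confidence threshold 0.9. Rows are sets of true
        # variables with multiplicities; all numbers are an illustrative toy.

        def confidence(ante, cons, rows):
            """conf(ante -> cons) over weighted truth assignments."""
            den = sum(w for r, w in rows if ante <= r)
            num = sum(w for r, w in rows if ante | cons <= r)
            return num / den

        model = [({"x", "y", "z"}, 81), ({"x", "y"}, 9), ({"x"}, 10)]

        print(confidence({"x"}, {"y"}, model))       # 0.90 -- premise holds
        print(confidence({"x", "y"}, {"z"}, model))  # 0.90 -- premise holds
        print(confidence({"x"}, {"z"}, model))       # 0.81 -- conclusion fails at 0.9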

    A simple example of "Quantum Darwinism": Redundant information storage in many-spin environments

    As quantum information science approaches the goal of constructing quantum computers, understanding loss of information through decoherence becomes increasingly important. The information about a system that can be obtained from its environment can facilitate quantum control and error correction. Moreover, observers gain most of their information indirectly, by monitoring (primarily photon) environments of the "objects of interest." Understanding exactly how this information is inscribed in the environment is essential for explaining the emergence of "the classical" from the quantum substrate. In this paper, we examine how many-qubit (or many-spin) environments can store information about a single system. The information lost to the environment can be stored redundantly, or it can be encoded in entangled modes of the environment. We go on to show that randomly chosen states of the environment almost always encode the information so that an observer must capture a majority of the environment to deduce the system's state. Conversely, in the states produced by a typical decoherence process, information about a particular observable of the system is stored redundantly. This selective proliferation of "the fittest information" (known as Quantum Darwinism) plays a key role in choosing the preferred, effectively classical observables of macroscopic systems. The developing appreciation that the environment functions not just as a garbage dump, but as a communication channel, is extending our understanding of the environment's role in the quantum-classical transition beyond the traditional paradigm of decoherence.
    Comment: 21 pages, 6 figures, RevTeX 4. Submitted to Foundations of Physics (Asher Peres Festschrift).
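
    The contrast between redundant and entangled records can be seen in a small numerical toy (a sketch in the spirit of the setting above, not the paper's calculation; all names and parameters are illustrative): for a GHZ-like branching state, a single environment qubit already supplies the full classical bit about the system, whereas for a random pure state small fragments reveal almost nothing.

        # Mutual information I(S:F) between a system qubit S and environment
        # fragments F, for a branching (GHZ-like) state vs. a random state.
        import numpy as np

        def reduced_density(psi, keep, n):
            """Reduced density matrix of the `keep` qubits of an n-qubit pure state."""
            psi = psi.reshape((2,) * n)
            rest = [q for q in range(n) if q not in keep]
            m = np.transpose(psi, keep + rest).reshape(2 ** len(keep), -1)
            return m @ m.conj().T

        def entropy(rho):
            """Von Neumann entropy in bits."""
            p = np.linalg.eigvalsh(rho)
            p = p[p > 1e-12]
            return float(-(p * np.log2(p)).sum())

        def mutual_info(psi, frag, n):
            """I(S:F) = S(rho_S) + S(rho_F) - S(rho_SF); the system is qubit 0."""
            return (entropy(reduced_density(psi, [0], n))
                    + entropy(reduced_density(psi, frag, n))
                    - entropy(reduced_density(psi, [0] + frag, n)))

        n = 9  # one system qubit plus eight environment qubits
        ghz = np.zeros(2 ** n, dtype=complex)
        ghz[0] = ghz[-1] = 1 / np.sqrt(2)  # (|00...0> + |11...1>)/sqrt(2)

        rng = np.random.default_rng(0)
        rand = rng.normal(size=2 ** n) + 1j * rng.normal(size=2 ** n)
        rand /= np.linalg.norm(rand)

        for f in range(1, 5):
            frag = list(range(1, 1 + f))
            # GHZ: 1.0 bit from any fragment (redundant record);
            # random state: near 0 until the fragment nears half the environment.
            print(f, round(mutual_info(ghz, frag, n), 3),
                  round(mutual_info(rand, frag, n), 3))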

    Measure Functions for Frames

    This paper addresses the natural question: "How should frames be compared?" We answer this question by quantifying the overcompleteness of all frames with the same index set. We introduce the concept of a frame measure function: a function which maps each frame to a continuous function. The comparison of these functions induces an equivalence and partial order that allow for a meaningful comparison of frames indexed by the same set. We define the ultrafilter measure function, an explicit frame measure function that we show is contained both algebraically and topologically inside all frame measure functions. We explore additional properties of frame measure functions, showing that they are additive on a large class of supersets: those that come from so-called non-expansive frames. We apply our results to the Gabor setting, computing the frame measure function of Gabor frames and establishing a new result about supersets of Gabor frames.
    Comment: 54 pages, 1 figure; fixed typos, reformatted references.
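
    For intuition only, overcompleteness is easy to quantify in finite dimensions, where the frame operator already measures redundancy. The sketch below is a finite-dimensional warm-up (illustrative code, far cruder than the frame measure functions of the paper, which compare infinite frames over a common index set).

        # Frame bounds and a naive redundancy ratio for a finite frame.
        import numpy as np

        def frame_bounds(F):
            """Optimal frame bounds A, B of the rows of F: the extreme
            eigenvalues of the frame operator S = F^T F."""
            eig = np.linalg.eigvalsh(F.T @ F)
            return eig[0], eig[-1]

        # Mercedes-Benz frame: 3 unit vectors in R^2, a tight frame (A = B = 3/2).
        mb = np.array([[np.cos(a), np.sin(a)]
                       for a in (np.pi / 2,
                                 np.pi / 2 + 2 * np.pi / 3,
                                 np.pi / 2 + 4 * np.pi / 3)])
        A, B = frame_bounds(mb)
        print(A, B)                   # both ~1.5: a tight frame
        print(len(mb) / mb.shape[1])  # naive overcompleteness ratio: 1.5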

    Sonification, Musification, and Synthesis of Absolute Program Music

    Presented at the 22nd International Conference on Auditory Display (ICAD-2016). When understood as a communication system, a musical work can be interpreted as data existing within three domains. In this interpretation, an absolute domain is interposed as a communication channel between two programmatic domains that act respectively as source and receiver. As a source, a programmatic domain creates, evolves, organizes, and represents a musical work. When acting as a receiver, it reconstitutes acoustic signals into a unique auditory experience. The absolute domain transmits physical vibrations ranging from the stochastic structures of noise to the periodic waveforms of organized sound. Analysis of acoustic signals suggests that recognition as a musical work requires signal periodicity to exceed some minimum. A methodological framework that satisfies recent definitions of sonification is outlined. This framework is proposed to extend to musification through the incorporation of data features that represent more traditional elements of a musical work, such as melody, harmony, and rhythm.
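
    A minimal parameter-mapping sketch of the sonification side of such a framework (the mapping choices, file name, and data below are illustrative assumptions, not the authors' method): each data point becomes a short sine tone whose pitch scales linearly with the value.

        # Sonify a data series by mapping values linearly onto pitch.
        import numpy as np
        import wave

        def sonify(data, sr=44100, note_dur=0.25, fmin=220.0, fmax=880.0):
            """Render each data point as a sine tone between fmin and fmax Hz."""
            lo, hi = min(data), max(data)
            t = np.linspace(0, note_dur, int(sr * note_dur), endpoint=False)
            tones = [np.sin(2 * np.pi * (fmin + (x - lo) / (hi - lo) * (fmax - fmin)) * t)
                     for x in data]
            return np.concatenate(tones)

        signal = sonify([3, 1, 4, 1, 5, 9, 2, 6])
        with wave.open("sonification.wav", "wb") as w:  # hypothetical output file
            w.setnchannels(1)
            w.setsampwidth(2)
            w.setframerate(44100)
            w.writeframes((signal * 32767).astype(np.int16).tobytes())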

    Exploring networks with traceroute-like probes: theory and simulations

    Mapping the Internet generally consists in sampling the network from a limited set of sources by using traceroute-like probes. This methodology, akin to the merging of different spanning trees to a set of destinations, has been argued to introduce uncontrolled sampling biases that might produce statistical properties of the sampled graph which sharply differ from the original ones. In this paper we explore these biases and provide a statistical analysis of their origin. We derive an analytical approximation for the probability of edge and vertex detection that exploits the role of the number of sources and targets and allows us to relate the global topological properties of the underlying network to the statistical accuracy of the sampled graph. In particular, we find that the edge and vertex detection probabilities depend on the betweenness centrality of each element. This allows us to show that shortest-path-routed sampling provides a better characterization of underlying graphs with broad distributions of connectivity. We complement the analytical discussion with a thorough numerical investigation of simulated mapping strategies in network models with different topologies. We show that sampled graphs provide a fair qualitative characterization of the statistical properties of the original networks over a wide range of different strategies and exploration parameters. Moreover, we characterize the level of redundancy and completeness of the exploration process as a function of the topological properties of the network. Finally, we study numerically how the fraction of vertices and edges discovered in the sampled graph depends on the particular deployment of probing sources. These results might hint at the steps toward more efficient mapping strategies.
    Comment: This paper is related to cond-mat/0406404, with explorations of different networks and complementary discussion.
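
    A sketch of the sampling setup described above (using networkx; the graph model, parameters, and helper are illustrative, not the paper's simulation): merge one shortest path per source-target pair, then measure what fraction of the underlying graph the sampled map discovers.

        # Traceroute-like sampling: union of shortest paths from a few sources
        # to many targets, on a synthetic graph with broad degree distribution.
        import random
        import networkx as nx

        def traceroute_sample(G, n_sources, n_targets, seed=0):
            """Union of one shortest path per (source, target) pair."""
            rng = random.Random(seed)
            nodes = list(G)
            sources = rng.sample(nodes, n_sources)
            targets = rng.sample(nodes, n_targets)
            sampled = nx.Graph()
            for s in sources:
                paths = nx.single_source_shortest_path(G, s)  # one BFS tree per source
                for t in targets:
                    if t in paths:
                        nx.add_path(sampled, paths[t])
            return sampled

        G = nx.barabasi_albert_graph(2000, 3, seed=1)
        S = traceroute_sample(G, n_sources=5, n_targets=500)
        print(S.number_of_nodes() / G.number_of_nodes())  # vertex discovery fraction
        print(S.number_of_edges() / G.number_of_edges())  # edge discovery fraction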