
    From an implicational system to its corresponding D-basis

    A closure system is a fundamental concept appearing in several areas such as databases, formal concept analysis, and artificial intelligence. It is well known that there exists a connection between a closure operator on a set and the lattice of its closed sets. Furthermore, a closure system can be replaced by a set of implications, but such a set usually contains considerable redundancy, inducing undesired properties. In the literature, there is a common interest in the minimality of a set of implications because of the importance of bases. The well-known Duquenne-Guigues basis satisfies this minimality condition. However, several authors emphasize the relevance of optimality in order to reduce the size of the implications in the basis. In addition, some bases have been defined to improve the computation of closures by relying on the directness property: computation with a direct basis is efficient because the closure is obtained in a single traversal. In this work, we focus on the D-basis, which is ordered-direct. Obtaining it from an arbitrary implicational system is an open problem, and that is the aim of this paper. We introduce a method to compute the D-basis by means of minimal generators calculated using the Simplification Logic for implications.

    Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech. Supported by Grants TIN2011-28084 and TIN2014-59471-P of the Science and Innovation Ministry of Spain, co-financed by the European Social Fund.
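
    To make the directness point concrete, here is a minimal sketch (in Python; the function and variable names are illustrative, not taken from the paper) of computing the closure of an attribute set under an arbitrary implicational system. The fixed-point loop below may traverse the implications several times; with a direct basis, and with the D-basis when the implications are visited in its prescribed order, a single traversal suffices.

    def closure(attributes, implications):
        """Closure of `attributes` under `implications`,
        given as (premise, conclusion) pairs of frozensets."""
        closed = set(attributes)
        changed = True
        while changed:  # fixed-point iteration; one ordered pass suffices for a direct basis
            changed = False
            for premise, conclusion in implications:
                if premise <= closed and not conclusion <= closed:
                    closed |= conclusion
                    changed = True
        return closed

    # Example: {a} -> {b}, {b, c} -> {d}
    implications = [(frozenset("a"), frozenset("b")),
                    (frozenset("bc"), frozenset("d"))]
    print(closure({"a", "c"}, implications))  # {'a', 'b', 'c', 'd'}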

    Dark matter theory: Implications and future prospects for Fermi

    I give a brief review of some of the implications of Fermi data for theories of the identity of dark matter, and of their combination with data from other complementary probes. I also preview some of the prospects for probing such models with future data.

    Comment: 8 pages, 5 figures. To appear in the proceedings of the 7th International Fermi Symposium, Garmisch, Oct 15-20, 2017

    Towards Autopoietic Computing

    A key challenge in modern computing is to develop systems that address complex, dynamic problems in a scalable and efficient way, because the increasing complexity of software makes designing and maintaining efficient and flexible systems increasingly difficult. Biological systems are thought to possess robust, scalable processing paradigms that can automatically manage complex, dynamic problem spaces, and they possess several properties that may be useful in computer systems. The biological properties of self-organisation, self-replication, self-management, and scalability are addressed in an interesting way by autopoiesis, a descriptive theory of the cell founded on the concept of a system's circular organisation, which defines its boundary with its environment. In this paper, therefore, we review the main concepts of autopoiesis and discuss how they could be related to fundamental concepts and theories of computation. The paper is conceptual in nature, and the emphasis is on reviewing other people's work in this area as part of a longer-term strategy to develop a formal theory of autopoietic computing.

    Comment: 10 pages, 3 figures

    Redundancy, Deduction Schemes, and Minimum-Size Bases for Association Rules

    Association rules are among the most widely employed data analysis methods in the field of Data Mining. An association rule is a form of partial implication between two sets of binary variables. In the most common approach, association rules are parameterized by a lower bound on their confidence, which is the empirical conditional probability of their consequent given the antecedent, and/or by some other parameter bounds such as "support" or deviation from independence. We study here notions of redundancy among association rules from a fundamental perspective. We see each transaction in a dataset as an interpretation (or model) in the propositional logic sense, and consider existing notions of redundancy, that is, of logical entailment, among association rules, of the form "any dataset in which this first rule holds must also obey that second rule, therefore the second is redundant". We discuss several existing alternative definitions of redundancy between association rules and provide new characterizations and relationships among them. We show that the main alternatives we discuss actually correspond to just two variants, which differ in the treatment of full-confidence implications. For each of these two notions of redundancy, we provide a sound and complete deduction calculus, and we show how to construct complete bases (that is, axiomatizations) of absolutely minimum size in terms of the number of rules. Finally, we explore an approach to redundancy with respect to several association rules, and fully characterize its simplest case of two partial premises.

    Comment: LMCS accepted paper
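
    As a small, self-contained illustration of the parameters mentioned in the abstract (the names below are illustrative, not the paper's notation), the support and confidence of a rule X -> Y over a set of transactions, each viewed as a set of items, can be computed as follows:

    def support(itemset, transactions):
        """Fraction of transactions containing every item in `itemset`."""
        return sum(itemset <= t for t in transactions) / len(transactions)

    def confidence(antecedent, consequent, transactions):
        """Empirical conditional probability of the consequent given the
        antecedent: supp(X | Y) / supp(X)."""
        return (support(antecedent | consequent, transactions)
                / support(antecedent, transactions))

    transactions = [frozenset("abc"), frozenset("ab"), frozenset("ac"),
                    frozenset("bc")]
    # Rule {a} -> {b} holds in 2 of the 3 transactions containing a.
    print(confidence(frozenset("a"), frozenset("b"), transactions))  # ~0.667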

    On the Usability of Probably Approximately Correct Implication Bases

    We revisit the notion of probably approximately correct implication bases from the literature and present a first formulation in the language of formal concept analysis, with the goal of investigating whether such bases represent a suitable substitute for exact implication bases in practical use cases. To this end, we quantitatively examine the behavior of probably approximately correct implication bases on artificial and real-world data sets and compare their precision and recall with respect to their corresponding exact implication bases. Using a small example, we also provide qualitative insight that implications from probably approximately correct bases can still represent meaningful knowledge from a given data set.

    Comment: 17 pages, 8 figures; typos added, corrected x-label on graph
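
    As a rough sketch of one ingredient of such an evaluation (a simplified stand-in, not the paper's actual definition, which measures precision and recall against the exact basis), one can score an approximate implication basis by the fraction of its implications that actually hold in a data set of attribute rows:

    def holds(premise, conclusion, rows):
        """True iff every row containing the premise also contains the conclusion."""
        return all(conclusion <= row for row in rows if premise <= row)

    def precision(basis, rows):
        """Fraction of implications in `basis` that are valid in `rows`."""
        return sum(holds(p, c, rows) for p, c in basis) / len(basis)

    rows = [frozenset("ab"), frozenset("abc"), frozenset("c")]
    basis = [(frozenset("a"), frozenset("b")),   # holds in every row
             (frozenset("b"), frozenset("c"))]   # fails on {'a', 'b'}
    print(precision(basis, rows))  # 0.5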

    Simulating full-sky interferometric observations

    Aperture array interferometers, such as that proposed for the Square Kilometre Array (SKA), will see the entire sky; hence the standard approach to simulating visibilities will not be applicable, since it relies on a tangent plane approximation that is valid only for small fields of view. We derive interferometric formulations in real, spherical harmonic, and wavelet space that include contributions over the entire sky and do not rely on any tangent plane approximations. A fast wavelet method is developed to simulate the visibilities observed by an interferometer in the full-sky setting. Computing visibilities with the fast wavelet method adapts to the sparse representation of the primary beam and sky intensity in the wavelet basis. Consequently, the fast wavelet method exhibits superior computational complexity to the real and spherical harmonic space methods and may be performed at substantially lower computational cost, while introducing only negligible error to simulated visibilities. Low-resolution interferometric observations are simulated using all of the methods to compare their performance, demonstrating that the fast wavelet method is approximately three times faster than the other methods for these low-resolution simulations. The computational burden of the real and spherical harmonic space methods renders these techniques computationally infeasible for higher resolution simulations. High-resolution interferometric observations are simulated using the fast wavelet method only, demonstrating and validating the application of this method to realistic simulations. The fast wavelet method is estimated to provide a greater than ten-fold reduction in execution time compared to the other methods for these high-resolution simulations.

    Comment: 16 pages, 9 figures; replaced to match version accepted by MNRAS (major additions to previous version, including new fast wavelet method)
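
    For context, the tangent plane approximation mentioned in the abstract enters through the standard interferometric measurement equation (textbook material, not a formula quoted from the paper), which relates the visibility V to the primary beam A and sky intensity I over direction cosines (l, m):

    % Full-sky measurement equation; the flat-sky (tangent plane) approximation
    % sets n = sqrt(1 - l^2 - m^2) ~ 1, dropping the w(n - 1) term.
    \[
      V(u, v, w) = \iint A(l, m)\, I(l, m)\,
        e^{-2\pi i \left[ u l + v m + w \left( \sqrt{1 - l^2 - m^2} - 1 \right) \right]}
        \, \frac{\mathrm{d}l \, \mathrm{d}m}{\sqrt{1 - l^2 - m^2}}
    \]

    Dropping the w term is what restricts the standard approach to small fields of view; the formulations derived in the paper avoid this restriction altogether.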