
    Symbolic methodology for numeric data mining

    Statistical and artificial neural network methods currently dominate data mining applications. Alternative relational (symbolic) data mining methods have shown their effectiveness in robotics, drug design, and other areas. Neural network and decision tree methods have serious limitations in capturing relations that may take a variety of forms, whereas learning systems based on symbolic first-order logic (FOL) representations capture relations naturally. The learned regularities are understandable directly in domain terms, which helps to build a domain theory. This paper describes relational data mining methodology and develops it further for numeric data such as financial and spatial data. This includes (1) comparing the attribute-value representation with the relational representation, (2) defining a new concept of joint relational representations, and (3) a process for their use, together with the Discovery algorithm. The methodology handles numerical and interval forecasting tasks as well as classification tasks uniformly. It is shown that Relational Data Mining (RDM) can handle multiple constraints, initial rules, and background knowledge very naturally to reduce the search space, in contrast with attribute-based data mining. Theoretical concepts are illustrated with examples from the financial and image processing domains.
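    The contrast between attribute-value and relational representations can be made concrete with a small sketch. The example below is illustrative only and assumes a toy daily price series; the predicates greater and up_trend and the Python encoding are our own stand-ins, not the paper's Discovery algorithm or its notation.

```python
# Illustrative only: a toy contrast between an attribute-value encoding of a
# short price series and a relational (first-order-style) encoding built from
# a "greater" predicate. Predicate and variable names are hypothetical.

prices = {"mon": 101.0, "tue": 103.5, "wed": 104.0, "thu": 104.2}

# Attribute-value view: one fixed-length row, one column per attribute.
attribute_value_row = [prices["mon"], prices["tue"], prices["wed"], prices["thu"]]

# Relational view: facts over objects (days) expressed with a predicate, so a
# regularity such as "the price rose on two consecutive days" can be stated
# once over variables instead of being tied to fixed column positions.
days = ["mon", "tue", "wed", "thu"]
facts = {("greater", b, a) for a, b in zip(days, days[1:]) if prices[b] > prices[a]}

# A hand-written rule in the spirit of an FOL regularity:
# up_trend(x, y, z) <- greater(y, x) AND greater(z, y)
def up_trend(x, y, z):
    return ("greater", y, x) in facts and ("greater", z, y) in facts

print(attribute_value_row)
print(sorted(facts))
print([(x, y, z) for x, y, z in zip(days, days[1:], days[2:]) if up_trend(x, y, z)])
```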

    An Interpretable Knowledge Transfer Model for Knowledge Base Completion

    Knowledge bases are important resources for a variety of natural language processing tasks but suffer from incompleteness. We propose a novel embedding model, \emph{ITransF}, to perform knowledge base completion. Equipped with a sparse attention mechanism, ITransF discovers hidden concepts of relations and transfers statistical strength through the sharing of concepts. Moreover, the learned associations between relations and concepts, which are represented by sparse attention vectors, can be interpreted easily. We evaluate ITransF on two benchmark datasets, WN18 and FB15k, for knowledge base completion and obtain improvements on both the mean rank and Hits@10 metrics over all baselines that do not use additional information. Comment: Accepted by ACL 2017. Minor update.
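    The abstract's sparse-attention idea can be sketched roughly as follows: each relation mixes a shared dictionary of concept (projection) matrices with a sparse attention vector, and triples are scored translation-style. The dimensions, the top-k attention stand-in, and all variable names below are illustrative assumptions, not the paper's exact model or training procedure.

```python
import numpy as np

# A rough sketch of the scoring idea described in the abstract: each relation
# attends sparsely over a shared dictionary of concept (projection) matrices,
# so statistical strength is shared across relations. All sizes and the
# top-k attention below are illustrative choices, not the paper's model.

rng = np.random.default_rng(0)
dim, n_concepts = 8, 4

D = rng.normal(size=(n_concepts, dim, dim))  # shared concept matrices
h = rng.normal(size=dim)                     # head entity embedding
t = rng.normal(size=dim)                     # tail entity embedding
r = rng.normal(size=dim)                     # relation translation vector

def sparse_attention(logits, k=2):
    """Keep the top-k logits and renormalize; a simple stand-in for the
    paper's sparsity mechanism."""
    idx = np.argsort(logits)[-k:]
    alpha = np.zeros_like(logits)
    alpha[idx] = np.exp(logits[idx])
    return alpha / alpha.sum()

alpha_head = sparse_attention(rng.normal(size=n_concepts))
alpha_tail = sparse_attention(rng.normal(size=n_concepts))

# Relation-specific projections built as attention-weighted mixtures of the
# shared concept matrices.
P_head = np.tensordot(alpha_head, D, axes=1)  # (dim, dim)
P_tail = np.tensordot(alpha_tail, D, axes=1)

# Translation-style score: lower means the triple (h, r, t) is more plausible.
score = np.linalg.norm(P_head @ h + r - P_tail @ t)
print(score)
```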

    Substructure Discovery Using Minimum Description Length and Background Knowledge

    The ability to identify interesting and repetitive substructures is an essential component of discovering knowledge in structural data. We describe a new version of our SUBDUE substructure discovery system based on the minimum description length principle. The SUBDUE system discovers substructures that compress the original data and represent structural concepts in the data. By replacing previously discovered substructures in the data, multiple passes of SUBDUE produce a hierarchical description of the structural regularities in the data. SUBDUE uses a computationally bounded inexact graph match that identifies similar, but not identical, instances of a substructure and finds an approximate measure of closeness of two substructures under computational constraints. In addition to the minimum description length principle, other background knowledge can be used by SUBDUE to guide the search towards more appropriate substructures. Experiments in a variety of domains demonstrate SUBDUE's ability to find substructures capable of compressing the original data and to discover structural concepts important to the domain. Description of Online Appendix: This is a compressed tar file containing the SUBDUE discovery system, written in C. The program accepts as input databases represented in graph form and outputs discovered substructures with their corresponding value. Comment: See http://www.jair.org/ for an online appendix and other files accompanying this article.
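    The minimum description length criterion described above can be sketched with a toy calculation: a candidate substructure S is valued by how much DL(S) + DL(G|S) undercuts DL(G). The bit-cost function and the toy graph sizes below are crude stand-ins for SUBDUE's actual graph encoding, used only to show the shape of the computation.

```python
# A simplified sketch of the MDL idea behind SUBDUE as described above: prefer
# the substructure S that minimizes DL(S) + DL(G | S), i.e. the bits to
# describe S plus the bits to describe the graph after every instance of S is
# collapsed to a single vertex. The crude bits-per-vertex/edge cost below is a
# stand-in for the paper's actual graph encoding.

def description_length(n_vertices, n_edges, bits_per_vertex=4, bits_per_edge=6):
    return n_vertices * bits_per_vertex + n_edges * bits_per_edge

# Toy graph: 12 vertices and 16 edges, containing 3 instances of a triangle
# substructure (3 vertices, 3 edges each).
graph_v, graph_e = 12, 16
sub_v, sub_e, n_instances = 3, 3, 3

dl_graph = description_length(graph_v, graph_e)
dl_sub = description_length(sub_v, sub_e)

# Collapsing each instance replaces its 3 vertices with 1 and removes its 3
# internal edges (edges to the rest of the graph are kept and re-attached).
compressed_v = graph_v - n_instances * (sub_v - 1)
compressed_e = graph_e - n_instances * sub_e
dl_compressed = description_length(compressed_v, compressed_e)

value = dl_graph / (dl_sub + dl_compressed)  # > 1 means S compresses G
print(dl_graph, dl_sub + dl_compressed, round(value, 2))
```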

    LOGICAL AND PSYCHOLOGICAL PARTITIONING OF MIND: DEPICTING THE SAME MAP?

    The aim of this paper is to demonstrate that empirically delimited structures of mind are also differentiable by means of systematic logical analysis. To this end, the paper first summarizes Demetriou's theory of cognitive organization and growth. This theory assumes that the mind is a multistructural entity that develops across three fronts: the processing system that constrains processing potentials, a set of specialized structural systems (SSSs) that guide processing within different reality and knowledge domains, and a hypercognitive system that monitors and controls the functioning of all other systems. The second part of the paper focuses on the SSSs, which are the target of our logical analysis, and summarizes a series of empirical studies demonstrating their autonomous operation. The third part develops the logical proof showing that each SSS involves a kernel element that cannot be reduced to standard logic or to any other SSS. The implications of this analysis for the general theory of knowledge and cognitive development are discussed in the concluding part of the paper.