
    Lifted rule injection for relation embeddings

    Methods based on representation learning currently hold the state of the art in many natural language processing and knowledge base inference tasks. Yet a major challenge is how to efficiently incorporate commonsense knowledge into such models. A recent approach regularizes relation and entity representations by propositionalization of first-order logic rules. However, propositionalization does not scale beyond domains with only a few entities and rules. In this paper we present a highly efficient method for incorporating implication rules into distributed representations for automated knowledge base construction. We map entity-tuple embeddings into an approximately Boolean space and encourage a partial ordering over relation embeddings based on implication rules mined from WordNet. Surprisingly, we find that the strong restriction of the entity-tuple embedding space does not hurt the expressiveness of the model and even acts as a regularizer that improves generalization. By incorporating a few commonsense rules, we achieve an increase of 2 percentage points in mean average precision over a matrix factorization baseline, while observing a negligible increase in runtime.
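
    The core idea lends itself to a compact illustration. Below is a minimal NumPy sketch, not the authors' implementation: it assumes tuple embeddings are squashed into [0, 1] with a sigmoid (the "approximately Boolean" space) and that an implication rule is injected as a hinge penalty encouraging a component-wise ordering of the two relation vectors; the relation names, dimensionality, and penalty form are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 20

# Illustrative embeddings (in the real model these are learned by matrix factorization).
relation = {"hypernym_of": rng.normal(size=dim), "related_to": rng.normal(size=dim)}
tuple_raw = rng.normal(size=dim)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Approximately Boolean tuple embedding: every component lies in [0, 1].
tuple_emb = sigmoid(tuple_raw)

def score(rel_vec, tup_vec):
    """Matrix-factorization style score of a relation holding for an entity tuple."""
    return rel_vec @ tup_vec

def implication_penalty(r_premise, r_conclusion):
    """Hinge penalty encouraging a component-wise ordering r_premise <= r_conclusion.

    With tuple embeddings restricted to [0, 1]^d, this ordering implies
    score(premise, t) <= score(conclusion, t) for every tuple t, i.e. the
    implication is enforced for all tuples at once without grounding.
    """
    return np.maximum(0.0, r_premise - r_conclusion).sum()

# Rule mined from WordNet (illustrative): hypernym_of(x, y) => related_to(x, y)
loss_rule = implication_penalty(relation["hypernym_of"], relation["related_to"])
print("rule violation penalty:", loss_rule)
print("premise score:   ", score(relation["hypernym_of"], tuple_emb))
print("conclusion score:", score(relation["related_to"], tuple_emb))
```

    Because the penalty involves only the two relation vectors and no grounding over entity tuples, its cost is independent of the number of entities, which is what makes this kind of rule injection "lifted".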

    Derivative-variable correlation reveals the structure of dynamical networks

    We propose a conceptually novel method of reconstructing the topology of dynamical networks. By examining the correlation between the variable of one node and the derivative of another node, we derive a simple matrix equation yielding the network adjacency matrix. Our method assumes access to time series describing the network dynamics and precise knowledge of the interaction functions. It involves a tunable parameter, allowing the reconstruction precision to be optimized within the constraints of the given dynamical data. The method is illustrated on a simple example, and the dependence of the reconstruction precision on the dynamical properties of the time series is discussed. Our theory is in principle applicable to any weighted or directed network whose internal interaction functions are known. Comment: Submitted to EPJ.
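
    As a rough illustration of the derivative-variable correlation idea, here is a hedged Python sketch under assumed dynamics of the form x_i' = f(x_i) + sum_j A_ij g(x_j): it simulates a small network, estimates derivatives by finite differences, forms time-averaged correlation matrices, and solves the resulting matrix equation for the adjacency matrix. The specific f, g, probe function h, noise level, and solver are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Illustrative setup: a small random network with known dynamics. ---
n = 5
A_true = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(A_true, 0.0)

f = lambda x: -x            # local dynamics (assumed known)
g = lambda x: np.tanh(x)    # interaction function (assumed known)
h = lambda x: x             # correlation "probe"; an assumed, adjustable choice

# Simulate x_i' = f(x_i) + sum_j A_ij g(x_j) + noise with simple Euler steps.
dt, steps = 0.01, 20000
x = rng.normal(size=n)
traj = np.empty((steps, n))
for t in range(steps):
    traj[t] = x
    x = x + dt * (f(x) + A_true @ g(x)) + np.sqrt(dt) * 0.1 * rng.normal(size=n)

# Finite-difference estimate of the derivatives from the time series alone.
dxdt = (traj[1:] - traj[:-1]) / dt
X = traj[:-1]

# Time-averaged correlations:
# <x_i' h(x_k)> = <f(x_i) h(x_k)> + sum_j A_ij <g(x_j) h(x_k)>
B = dxdt.T @ h(X) / len(X)   # derivative-variable correlations
C = f(X).T @ h(X) / len(X)   # local-dynamics term
D = g(X).T @ h(X) / len(X)   # interaction-variable correlations

# Solve the matrix equation (B - C) = A D for the adjacency matrix.
A_est = (B - C) @ np.linalg.pinv(D)
print("max reconstruction error:", np.abs(A_est - A_true).max())
```

    In this sketch the choice of the probe h stands in for the tunable ingredient mentioned in the abstract; that mapping is an assumption made for illustration.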

    Binary Independent Component Analysis with OR Mixtures

    Independent component analysis (ICA) is a computational method for separating a multivariate signal into subcomponents, assuming the mutual statistical independence of the non-Gaussian source signals. The classical ICA framework usually assumes linear combinations of independent sources over the field of real-valued numbers R. In this paper, we investigate binary ICA for OR mixtures (bICA), which can find applications in many domains including medical diagnosis, multi-cluster assignment, Internet tomography, and network resource management. We prove that bICA is uniquely identifiable under the disjunctive generation model, and propose a deterministic iterative algorithm to determine the distribution of the latent random variables and the mixing matrix. The inverse problem of inferring the values of the latent variables is also considered, along with noisy measurements. We conduct an extensive simulation study to verify the effectiveness of the proposed algorithm and present examples of real-world applications where bICA can be applied. Comment: Manuscript submitted to IEEE Transactions on Signal Processing.
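
    To make the disjunctive generation model concrete, the following Python sketch (an assumption-laden illustration, not the authors' identification algorithm) draws independent Bernoulli sources, mixes them through a binary matrix with an OR of ANDs, and checks the marginal identity P(x_m = 0) = prod over mixed-in sources of (1 - p_n) that moment-based identification can build on. All sizes and parameters are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# --- Disjunctive (OR) generative model with illustrative sizes. ---
n_sources, n_obs, n_samples = 4, 6, 200_000

p = rng.uniform(0.1, 0.4, size=n_sources)    # latent Bernoulli activation rates
G = rng.random((n_obs, n_sources)) < 0.5     # binary mixing matrix (who mixes whom)

# Independent binary sources, shape (n_samples, n_sources).
Y = rng.random((n_samples, n_sources)) < p

# OR mixture: x_m = OR_n (G[m, n] AND y_n). Noiseless case for clarity.
# Counting the active, mixed-in sources and thresholding at zero realizes the OR of ANDs.
X = (Y.astype(int) @ G.T.astype(int)) > 0

# A basic consequence of the model: P(x_m = 0) = prod over n with g_mn = 1 of (1 - p_n).
empirical_zero = 1.0 - X.mean(axis=0)
predicted_zero = np.array([np.prod(1.0 - p[G[m]]) for m in range(n_obs)])
print("empirical  P(x_m = 0):", np.round(empirical_zero, 3))
print("predicted  P(x_m = 0):", np.round(predicted_zero, 3))
```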

    Data based identification and prediction of nonlinear and complex dynamical systems

    We thank Dr. R. Yang (formerly at ASU), Dr. R.-Q. Su (formerly at ASU), and Mr. Zhesi Shen for their contributions to a number of original papers on which this Review is partly based. This work was supported by ARO under Grant No. W911NF-14-1-0504. W.-X. Wang was also supported by NSFC under Grants No. 61573064 and No. 61074116, as well as by the Fundamental Research Funds for the Central Universities, Beijing Nova Programme.