117,790 research outputs found

    StochasticNet in StochasticNet

    Deep neural networks have been shown to outperform conventional state-of-the-art approaches in several structured prediction applications. While high-performance computing devices such as GPUs have made developing very powerful deep neural networks possible, it is not feasible to run these networks on low-cost, low-power computing devices such as embedded CPUs or even embedded GPUs. As such, there has been a lot of recent interest in producing efficient deep neural network architectures that can be run on small computing devices. Motivated by this, the idea of StochasticNets was introduced, where deep neural networks are formed by leveraging random graph theory. It has been shown that StochasticNets can form new networks with 2X or 3X architectural efficiency while maintaining modeling accuracy. Motivated by these promising results, here we investigate the idea of StochasticNet in StochasticNet (SiS), where highly efficient deep neural networks with Network in Network (NiN) architectures are formed in a stochastic manner. Such networks have an intertwining structure composed of convolutional layers and micro neural networks to boost the modeling accuracy. The experimental results show that SiS can form deep neural networks with NiN architectures that have 4X greater architectural efficiency with only a 2% drop in accuracy on the CIFAR10 dataset. The results are even more promising for the SVHN dataset, where SiS formed deep neural networks with NiN architectures that have 11.5X greater architectural efficiency with only a 1% decrease in modeling accuracy.
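    The formation of a network layer via random graph theory can be sketched as follows. This is a minimal illustration, not the paper's exact formation procedure: it assumes an Erdős–Rényi-style sampling in which each possible connection between an input and output neuron survives independently with probability p, and the function names (`stochastic_layer_mask`, `sparse_dense_forward`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def stochastic_layer_mask(n_in, n_out, p):
    """Sample a sparse connectivity mask: each possible synapse between
    an input and an output neuron is realized independently with
    probability p (an Erdos-Renyi-style random graph)."""
    return (rng.random((n_in, n_out)) < p).astype(np.float64)

def sparse_dense_forward(x, weights, mask):
    """Forward pass of a dense layer whose connectivity was pruned at
    formation time by the random mask."""
    return x @ (weights * mask)

n_in, n_out, p = 64, 32, 0.5
mask = stochastic_layer_mask(n_in, n_out, p)
w = rng.standard_normal((n_in, n_out))
x = rng.standard_normal((1, n_in))
y = sparse_dense_forward(x, w, mask)

density = mask.mean()  # fraction of surviving connections, close to p
print(y.shape, float(density))
```

    With p = 0.5 roughly half the connections are removed at formation time, which is where the claimed architectural efficiency comes from: the sparse layer needs about half the parameters of its dense counterpart.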

    Knowledge Infused Learning (K-IL): Towards Deep Incorporation of Knowledge in Deep Learning

    Learning the underlying patterns in data goes beyond instance-based generalization to external knowledge represented in structured graphs or networks. Deep learning, which primarily constitutes the neural computing stream in AI, has shown significant advances in probabilistically learning latent patterns using a multi-layered network of computational nodes (i.e., neurons/hidden units). Structured knowledge, which underlies symbolic computing approaches and often supports reasoning, has also seen significant growth in recent years, in the form of broad-based (e.g., DBPedia, Yago) and domain-, industry-, or application-specific knowledge graphs. A common substrate with careful integration of the two will raise opportunities to develop neuro-symbolic learning approaches for AI, where conceptual and probabilistic representations are combined. As the incorporation of external knowledge will aid in supervising the learning of features for the model, deep infusion of representational knowledge from knowledge graphs within hidden layers will further enhance the learning process. Although much work remains, we believe that knowledge graphs will play an increasing role in developing hybrid neuro-symbolic intelligent systems (bottom-up deep learning with top-down symbolic computing) as well as in building explainable AI systems, for which knowledge graphs will provide scaffolding for punctuating neural computing. In this position paper, we describe our motivation for such a neuro-symbolic approach and a framework that combines knowledge graphs and neural networks.
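    One way to picture "deep infusion of representational knowledge within hidden layers" is to inject a projected knowledge-graph embedding into a hidden layer's pre-activation. The sketch below is an assumption about how such infusion could look, not the authors' framework: the function name `knowledge_infused_forward` and the additive-projection design are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def knowledge_infused_forward(x, kg_vec, W1, Wk, W2):
    """One hidden layer modulated by an external knowledge-graph
    embedding: the KG vector is projected and added to the data-driven
    pre-activation before the nonlinearity."""
    h = relu(x @ W1 + kg_vec @ Wk)  # infusion happens inside the hidden layer
    return h @ W2

d_in, d_kg, d_h, d_out = 16, 8, 32, 4
x = rng.standard_normal((1, d_in))
kg_vec = rng.standard_normal((1, d_kg))   # e.g., an entity embedding from a KG
W1 = rng.standard_normal((d_in, d_h))
Wk = rng.standard_normal((d_kg, d_h))
W2 = rng.standard_normal((d_h, d_out))

out = knowledge_infused_forward(x, kg_vec, W1, Wk, W2)
print(out.shape)  # (1, 4)
```

    The point of the additive projection is that the knowledge signal participates in the same nonlinearity as the data signal, rather than being concatenated only at the input or output, which is what distinguishes deep infusion from shallow feature augmentation.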

    Learning through structure: towards deep neuromorphic knowledge graph embeddings

    Computing latent representations for graph-structured data is a ubiquitous learning task in many industrial and academic applications, ranging from molecule synthesis to social network analysis and recommender systems. Knowledge graphs are among the most popular and widely used data representations related to the Semantic Web. Next to structuring factual knowledge in a machine-readable format, knowledge graphs serve as the backbone of many artificial intelligence applications and allow the ingestion of context information into various learning algorithms. Graph neural networks attempt to encode graph structures in low-dimensional vector spaces via a message passing heuristic between neighboring nodes. Over recent years, a multitude of different graph neural network architectures have demonstrated ground-breaking performance in many learning tasks. In this work, we propose a strategy to map deep graph learning architectures for knowledge graph reasoning to neuromorphic architectures. Based on the insight that randomly initialized and untrained (i.e., frozen) graph neural networks are able to preserve local graph structures, we compose a frozen neural network with shallow knowledge graph embedding models. We experimentally show that already on conventional computing hardware, this leads to a significant speedup and memory reduction while maintaining a competitive performance level. Moreover, we extend the frozen architecture to spiking neural networks, introducing a novel, event-based and highly sparse knowledge graph embedding algorithm that is suitable for implementation in neuromorphic hardware.
    Comment: Accepted for publication at the International Conference on Neuromorphic Computing (ICNC 2021).
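    The key insight above, that a randomly initialized and untrained (frozen) graph neural network preserves local graph structure, can be sketched as follows. This is a minimal illustration under assumed design choices (mean aggregation with self-loops, tanh nonlinearity, the hypothetical function name `frozen_gnn_embed`), not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

def frozen_gnn_embed(adj, feats, n_layers=2):
    """Message passing with randomly initialized, untrained (frozen)
    weights: each layer averages neighbor features and applies a fixed
    random projection followed by tanh. No gradient-based training is
    performed, yet nodes with similar neighborhoods get similar codes."""
    # Row-normalized adjacency with self-loops, for mean aggregation.
    a = adj + np.eye(adj.shape[0])
    a = a / a.sum(axis=1, keepdims=True)
    h = feats
    for _ in range(n_layers):
        w = rng.standard_normal((h.shape[1], h.shape[1]))  # frozen weights
        h = np.tanh(a @ h @ w)
    return h

# Toy knowledge graph: 4 entities, undirected adjacency matrix.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = rng.standard_normal((4, 8))
emb = frozen_gnn_embed(adj, feats)
print(emb.shape)  # (4, 8)
```

    Because the projections are fixed, the only computation at "training" time is the shallow embedding model on top of these frozen codes, which is where the reported speedup and memory reduction on conventional hardware would come from.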

    Soft computing techniques applied to finance

    Soft computing is progressively gaining presence in the financial world. The number of real and potential applications is very large and, accordingly, so is the presence of applied research papers in the literature. The aim of this paper is both to present relevant application areas and to serve as an introduction to the subject. This paper provides arguments that justify the growing interest in these techniques among the financial community and introduces domains of application such as stock and currency market prediction, trading, portfolio management, credit scoring, and financial distress prediction.