135,630 research outputs found

    Predicate learning in neural systems: Using oscillations to discover latent structure

    Humans learn to represent complex structures (e.g. natural language, music, mathematics) from experience with their environments. Often such structures are latent, hidden, or not encoded in statistics about sensory representations alone. Accounts of human cognition have long emphasized the importance of structured representations, yet the majority of contemporary neural networks do not learn structure from experience. Here, we describe one way that structured, functionally symbolic representations can be instantiated in an artificial neural network. Then, we describe how such latent structures (viz. predicates) can be learned from experience with unstructured data. Our approach exploits two principles from psychology and neuroscience: comparison of representations, and the naturally occurring dynamic properties of distributed computing across neuronal assemblies (viz. neural oscillations). We discuss how the ability to learn predicates from experience, to represent information compositionally, and to extrapolate knowledge to unseen data is core to understanding and modeling the most complex human behaviors (e.g. relational reasoning, analogy, language processing, game play).
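
    A minimal, hypothetical sketch (not the authors' model) of the comparison principle mentioned in the abstract: comparing two distributed representations and keeping only their shared features as a candidate predicate. The feature names, noise level, and threshold are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: extract a candidate "predicate" by comparing two
# distributed representations and keeping only what they share.
# Feature names, noise level, and threshold are illustrative assumptions,
# not the model described in the paper.

rng = np.random.default_rng(0)
features = ["red", "round", "large", "above", "left-of", "furry"]

def encode(active):
    """Noisy distributed encoding of a set of active features."""
    v = np.array([1.0 if f in active else 0.0 for f in features])
    return v + 0.1 * rng.standard_normal(len(features))

# Two experiences that share relational content ("above") but differ otherwise.
scene_a = encode({"red", "round", "above"})
scene_b = encode({"furry", "large", "above"})

# Comparison: element-wise co-activation highlights the shared structure.
shared = np.minimum(scene_a, scene_b)
predicate = [f for f, w in zip(features, shared) if w > 0.5]
print("candidate predicate:", predicate)  # expected: ['above']
```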

    Learning through structure: towards deep neuromorphic knowledge graph embeddings

    Computing latent representations for graph-structured data is a ubiquitous learning task in many industrial and academic applications, ranging from molecule synthesis to social network analysis and recommender systems. Knowledge graphs are among the most popular and widely used data representations related to the Semantic Web. Besides structuring factual knowledge in a machine-readable format, knowledge graphs serve as the backbone of many artificial intelligence applications and allow the ingestion of context information into various learning algorithms. Graph neural networks attempt to encode graph structures in low-dimensional vector spaces via a message-passing heuristic between neighboring nodes. In recent years, a multitude of graph neural network architectures have demonstrated ground-breaking performance in many learning tasks. In this work, we propose a strategy to map deep graph learning architectures for knowledge graph reasoning to neuromorphic architectures. Based on the insight that randomly initialized and untrained (i.e., frozen) graph neural networks are able to preserve local graph structures, we compose a frozen neural network with shallow knowledge graph embedding models. We experimentally show that, already on conventional computing hardware, this leads to a significant speedup and memory reduction while maintaining a competitive performance level. Moreover, we extend the frozen architecture to spiking neural networks, introducing a novel, event-based and highly sparse knowledge graph embedding algorithm that is suitable for implementation in neuromorphic hardware. (Accepted for publication at the International Conference on Neuromorphic Computing, ICNC 2021.)
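
    A hedged sketch of the general "frozen encoder + shallow scorer" idea described above: a randomly initialized, untrained message-passing step produces entity features, and a shallow DistMult-style scorer consumes them. Dimensions, the aggregation rule, and the scoring function are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

# Sketch under assumed dimensions: frozen (untrained) message passing
# followed by a shallow DistMult-style triple scorer.

rng = np.random.default_rng(42)
num_entities, num_relations, dim = 6, 2, 8

# Toy knowledge graph as (head, relation, tail) triples.
triples = [(0, 0, 1), (1, 0, 2), (2, 1, 3), (3, 1, 4), (4, 0, 5)]

# Random node features and a frozen (never trained) projection matrix.
X = rng.standard_normal((num_entities, dim))
W_frozen = rng.standard_normal((dim, dim))

def frozen_message_passing(X, triples, W):
    """One round of mean aggregation over neighbours with a frozen weight."""
    agg = X.copy()
    counts = np.ones(len(X))
    for h, _, t in triples:
        agg[h] += X[t]; counts[h] += 1
        agg[t] += X[h]; counts[t] += 1
    H = agg / counts[:, None]
    return np.tanh(H @ W)          # frozen projection + nonlinearity

H = frozen_message_passing(X, triples, W_frozen)

# Shallow, trainable part: one relation vector per relation (DistMult-style).
R = rng.standard_normal((num_relations, dim))  # in practice learned by SGD

def score(h, r, t):
    """Plausibility score of a candidate triple."""
    return float(np.sum(H[h] * R[r] * H[t]))

print(score(0, 0, 1), score(0, 1, 5))  # observed vs. random candidate
```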

    Using deep learning to understand and mitigate the qubit noise environment

    Understanding the spectrum of noise acting on a qubit can yield valuable information about its environment, and crucially underpins the optimization of dynamical decoupling protocols that can mitigate such noise. However, extracting accurate noise spectra from typical time-dynamics measurements on qubits is intractable using standard methods. Here, we propose to address this challenge using deep learning algorithms, leveraging the remarkable progress made in the fields of image recognition, natural language processing and, more recently, structured data. We demonstrate a neural-network-based methodology that allows for extraction of the noise spectrum associated with any qubit surrounded by an arbitrary bath, with significantly greater accuracy than the current methods of choice. The technique requires only a two-pulse echo decay curve as input data and can further be extended either for constructing customized optimal dynamical decoupling protocols or for obtaining critical qubit attributes such as its proximity to the sample surface. Our results can be applied to a wide range of qubit platforms, and provide a framework for improving qubit performance with applications not only in quantum computing and nanoscale sensing but also in material characterization techniques such as magnetic resonance. (Accepted for publication; 15 pages, 10 figures.)
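
    A hedged sketch of the general approach: simulate two-pulse (Hahn) echo decay curves from known noise spectra with a toy filter-function model, then fit a small neural network to invert the map from decay curve to spectrum. The spectrum parametrisation, filter-function constants, and network size are illustrative assumptions, not the paper's architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Assumed toy model: Lorentzian noise spectra, a sin^4 echo filter function,
# and a small MLP regressor trained on simulated (decay curve, spectrum) pairs.

rng = np.random.default_rng(1)
omega = np.linspace(0.1, 20.0, 64)   # angular frequencies (arbitrary units)
times = np.linspace(0.05, 5.0, 32)   # total echo evolution times

def lorentzian_spectrum(amp, width):
    """Toy noise spectrum S(w): a single Lorentzian."""
    return amp * width / (omega**2 + width**2)

def echo_decay(S):
    """Toy Hahn-echo coherence curve from spectrum S via a sin^4 filter."""
    chi = np.array([np.trapz(S * 8 * np.sin(omega * t / 4) ** 4 / omega**2, omega)
                    for t in times])
    return np.exp(-chi)

# Build a synthetic training set of (decay curve, spectrum) pairs.
X_train, y_train = [], []
for _ in range(2000):
    S = lorentzian_spectrum(amp=rng.uniform(0.5, 5.0), width=rng.uniform(0.5, 5.0))
    X_train.append(echo_decay(S))
    y_train.append(S)

model = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=500, random_state=0)
model.fit(np.array(X_train), np.array(y_train))

# Inference: recover the spectrum from a new echo decay curve alone.
S_true = lorentzian_spectrum(amp=2.0, width=1.5)
S_pred = model.predict(echo_decay(S_true)[None, :])[0]
print("relative error:", np.linalg.norm(S_pred - S_true) / np.linalg.norm(S_true))
```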

    Soft computing techniques applied to finance

    Soft computing is progressively gaining presence in the financial world. The number of real and potential applications is very large and, accordingly, so is the presence of applied research papers in the literature. The aim of this paper is both to present relevant application areas and to serve as an introduction to the subject. It provides arguments that justify the growing interest in these techniques among the financial community and introduces domains of application such as stock and currency market prediction, trading, portfolio management, credit scoring, and financial distress prediction.