The world of the complexity sciences
The situation is truly exciting. While in the so-called real world (referring to domains such as politics, the economy, and military and social conflicts, for example) the natural perception (that of the media and public opinion, say) is that the country and the world are in difficult, at times dramatic, and often tragic conditions, in the field of the progress of knowledge we are witnessing a magnificent vitality. This vitality is expressed in cutting-edge science and, notably, in the sciences of complexity.
While normal science (to return to Kuhn's expression) is literally on the defensive in numerous fields, topics, and problems (on the defensive, that is, with respect to the course of events and the dynamics of the contemporary world), in the study of complex adaptive systems we are witnessing a vitality that is practically unknown to the mainstream of academics (whatever the level at which they work), of scientists, and of administrators of education and of science and technology (for example, rectors, vice-rectors, deans, department chairs, decision-makers, politicians, and government officials). The mainstream of knowledge is unaware of a circumstance, a process, a dynamic that is indeed known to those who actively work and do research in the field of the complexity sciences.
Neuromorphic Engineering Editors' Pick 2021
This collection showcases well-received spontaneous articles from the past couple of years, specially handpicked by our Chief Editors, Profs. André van Schaik and Bernabé Linares-Barranco. The work presented here highlights the broad diversity of research performed across the section and aims to put a spotlight on the main areas of interest. All research presented here displays strong advances in theory, experiment, and methodology with applications to compelling problems. This collection aims to further support Frontiers' strong community by recognizing highly deserving authors.
A Survey on Reservoir Computing and its Interdisciplinary Applications Beyond Traditional Machine Learning
Reservoir computing (RC), first applied to temporal signal processing, is built on
a recurrent neural network in which neurons are randomly connected. Once
initialized, the connection strengths remain unchanged. This simple structure
turns RC into a non-linear dynamical system that maps low-dimensional inputs
into a high-dimensional space. The model's rich dynamics, linear separability,
and memory capacity then enable a simple linear readout to generate adequate
responses for various applications. RC spans areas far beyond machine learning,
since it has been shown that the complex dynamics can be realized in various
physical hardware implementations and biological devices. This yields greater
flexibility and shorter computation time. Moreover, the neuronal responses
triggered by the model's dynamics shed light on understanding brain mechanisms
that also exploit similar dynamical processes. While the literature on RC is
vast and fragmented, here we conduct a unified review of RC's recent
developments from machine learning to physics, biology, and neuroscience. We
first review the early RC models, and then survey the state-of-the-art models
and their applications. We further introduce studies on modeling the brain's
mechanisms by RC. Finally, we offer new perspectives on RC development,
including reservoir design, the unification of coding frameworks, physical RC
implementations, and the interaction between RC, cognitive neuroscience, and
evolution. (51 pages, 19 figures; IEEE Access)
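The fixed random recurrent network plus trained linear readout that the survey describes can be sketched as a minimal echo state network. The reservoir size, spectral radius, and the one-step memory task below are illustrative choices, not taken from the survey:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir: fixed random recurrent weights, never trained
N = 200
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)  # nonlinear high-dimensional mapping
        states.append(x.copy())
    return np.array(states)

# Toy task exercising the reservoir's memory: output u(t-1) given u(t)
u = rng.uniform(-1.0, 1.0, 500)
X = run_reservoir(u)
y = np.roll(u, 1)
y[0] = 0.0

# The linear readout is the only trained part (ridge regression),
# discarding an initial washout period while the state settles
washout = 50
A, b = X[washout:], y[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ b)

nrmse = np.sqrt(np.mean((A @ W_out - b) ** 2)) / np.std(b)
print(f"one-step memory NRMSE (training fit): {nrmse:.3f}")
```

The readout is a single least-squares solve; nothing inside the reservoir is updated, which is what makes fixed physical substrates usable as reservoirs.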
Simulation and Design of Biological and Biologically-Motivated Computing Systems
In the life sciences, there is a great need to understand biological systems for
therapeutics, synthetic biology, and biomedical applications. However, the complex
behaviors and dynamics of biological systems are hard to understand and design.
Meanwhile, the design of traditional computer architectures faces challenges from
power consumption, device reliability, and process variation. In recent years, the
convergence of computer science, computer engineering, and the life sciences has enabled
new applications targeting the challenges from both the engineering and biological fields.
On one hand, computer modeling and simulation provides quantitative analysis and
predictions of functions and behaviors of biological systems, and further facilitates
the design of synthetic biological systems. On the other hand, bio-inspired devices
and systems are designed for real-world applications by mimicking biological functions
and behaviors. This dissertation develops techniques for modeling and analyzing
the dynamic behaviors of biologically realistic genetic circuits and brain models,
and for designing brain-inspired computing systems. The stability of genetic memory
circuits is studied to understand their function and their potential applications in synthetic
biology. Based on electrical-equivalent models of biochemical reactions,
simulation techniques widely used for electronic systems are applied to provide quantitative
analysis capabilities. In particular, system-theoretic techniques are used
to study the dynamic behaviors of genetic memory circuits, where the notion of
a stability boundary is employed to characterize the bistability of such circuits. To
facilitate the simulation-based studies of physiological and pathological behaviors in
brain disorders, we construct large-scale brain models with detailed cellular mechanisms.
By developing dedicated numerical techniques for brain simulation, the simulation speed is greatly improved such that dynamic simulation of large thalamocortical
models with more than one million multi-compartment neurons and
hundreds of synapses on commodity computer servers becomes feasible. Simulation
of such a large model produces biologically meaningful results, demonstrating the emergence
of sigma and delta waves in the early and deep stages of sleep and suggesting
the underlying cellular mechanisms that may be responsible for the generation of absence
seizures. Brain-inspired computing paradigms may offer promising solutions
to many challenges facing the mainstream von Neumann computer architecture.
To this end, we develop a biologically inspired learning system amenable to VLSI
implementation. The proposed solution consists of a digitized liquid state machine
(LSM) and a spike-based learning rule, providing a fully biologically inspired learning
paradigm. The key design parameters of this liquid state machine are optimized
to maximize the learning performance while considering hardware implementation
cost. When applied to speech recognition of isolated words using the TI46 speech corpus,
the performance of the proposed LSM rivals that of several existing state-of-the-art
techniques, including the Hidden Markov Model based recognizer Sphinx-4.
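The bistability that the dissertation characterizes via stability boundaries can be illustrated with a minimal two-gene toggle switch (a Gardner-style mutual-repression model). The parameters and the forward-Euler integration below are illustrative assumptions, not the dissertation's actual circuit or simulator:

```python
# Minimal mutual-repression toggle switch:
#   du/dt = a / (1 + v**n) - u
#   dv/dt = a / (1 + u**n) - v
# Illustrative parameters chosen inside the bistable regime.
a, n = 10.0, 2.0
dt, steps = 0.01, 5000

def settle(u, v):
    """Integrate to (near) steady state with forward Euler."""
    for _ in range(steps):
        du = a / (1.0 + v**n) - u
        dv = a / (1.0 + u**n) - v
        u, v = u + dt * du, v + dt * dv
    return u, v

# Initial conditions on opposite sides of the stability boundary
# (the diagonal u = v, by symmetry) settle into different stable states.
state_u_high = settle(5.0, 0.1)   # gene u starts dominant
state_v_high = settle(0.1, 5.0)   # gene v starts dominant
print(state_u_high, state_v_high)
```

The two runs converge to distinct fixed points, one per side of the boundary, which is exactly the memory property such circuits provide.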
Artificial Neurogenesis: An Introduction and Selective Review
In this introduction and review—like in the book which follows—we explore the hypothesis that adaptive growth is a means of producing brain-like machines. The emulation of neural development can incorporate desirable characteristics of natural neural systems into engineered designs. The introduction begins with a review of neural development and neural models. Next, artificial development—the use of a developmentally inspired stage in engineering design—is introduced. Several strategies for performing this "meta-design" for artificial neural systems are reviewed. This work is divided into three main categories: bio-inspired representations; developmental systems; and epigenetic simulations. Several specific network biases and their benefits to neural network design are identified in these contexts. In particular, several recent studies show a strong synergy, sometimes interchangeability, between developmental and epigenetic processes—a topic that has remained largely under-explored in the literature.
2022 roadmap on neuromorphic computing and engineering
Modern computation based on von Neumann architecture is now a mature cutting-edge science. In the von Neumann architecture, processing and memory units are implemented as separate blocks interchanging data intensively and continuously. This data transfer is responsible for a large part of the power consumption. The next generation of computer technology is expected to solve problems at the exascale, with 10^18 calculations each second. Even though these future computers will be incredibly powerful, if they are based on von Neumann type architectures, they will consume between 20 and 30 megawatts of power and will not have intrinsic physically built-in capabilities to learn or deal with complex data as our brain does. These needs can be addressed by neuromorphic computing systems which are inspired by the biological concepts of the human brain. This new generation of computers has the potential to be used for the storage and processing of large amounts of digital information with much lower power consumption than conventional processors. Among their potential future applications, an important niche is moving the control from data centers to edge devices. The aim of this roadmap is to present a snapshot of the present state of neuromorphic technology and provide an opinion on the challenges and opportunities that the future holds in the major areas of neuromorphic technology, namely materials, devices, neuromorphic circuits, neuromorphic algorithms, applications, and ethics. The roadmap is a collection of perspectives where leading researchers in the neuromorphic community provide their own view about the current state and the future challenges of each research area. We hope that this roadmap will be a useful resource, providing a concise yet comprehensive introduction for readers outside this field and for those who are just entering it, as well as future perspectives for those who are well established in the neuromorphic computing community.
Reservoir Computing with Neuro-memristive Nanowire Networks
We present simulation results based on a model of self-assembled nanowire networks with memristive junctions and neural network-like topology. We analyse the dynamical voltage distribution in response to an applied bias and explain the network conductance fluctuations observed in previous experimental studies. We show I-V curves under AC stimulation and compare these to other bulk memristors. We then study the capacity of these nanowire networks for neuro-inspired reservoir computing by demonstrating higher harmonic generation and short/long-term memory. Benchmark tasks in a reservoir computing framework are implemented. The tasks include nonlinear wave transformation, wave auto-generation, and handwritten digit classification.
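The higher harmonic generation reported for the nanowire networks can be illustrated with a single idealized memristor: a linear dopant-drift model under sinusoidal bias distorts the current waveform and puts power into harmonics of the drive. All device constants below are hypothetical, and this single-element model stands in for, but is far simpler than, the paper's network model:

```python
import numpy as np

# Hypothetical device constants for a linear dopant-drift memristor
R_on, R_off = 100.0, 16e3   # on/off resistances (ohms)
mu_v = 1e-14                # dopant mobility (m^2 s^-1 V^-1), illustrative
D = 1e-8                    # device length (m)

f, T, n_steps = 1.0, 4.0, 8000
t = np.linspace(0.0, T, n_steps, endpoint=False)
dt = t[1] - t[0]
V = np.sin(2 * np.pi * f * t)   # sinusoidal AC bias

w = 0.5 * D                     # state variable: width of the doped region
I = np.empty(n_steps)
for k, v in enumerate(V):
    R = R_on * (w / D) + R_off * (1.0 - w / D)   # state-dependent resistance
    I[k] = v / R
    # forward-Euler update of the state, clipped to the physical range
    w = min(max(w + mu_v * R_on / D * I[k] * dt, 0.0), D)

# The state-dependent nonlinearity transfers power from the drive
# frequency into its harmonics.
spec = np.abs(np.fft.rfft(I))
freqs = np.fft.rfftfreq(n_steps, dt)
fund = spec[np.argmin(np.abs(freqs - f))]
third = spec[np.argmin(np.abs(freqs - 3 * f))]
print(f"3rd harmonic / fundamental: {third / fund:.3f}")
```

A purely linear resistor driven the same way would show only the fundamental; the nonzero harmonic content is the signature exploited in memristive reservoir computing.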