Fractals in the Nervous System: Conceptual Implications for Theoretical Neuroscience
This essay is presented with two principal objectives in mind: first, to
document the prevalence of fractals at all levels of the nervous system, giving
credence to the notion of their functional relevance; and second, to draw
attention to the as yet still unresolved issues of the detailed relationships
among power law scaling, self-similarity, and self-organized criticality. As
regards criticality, I will document that it has become a pivotal reference
point in neurodynamics. Furthermore, I will emphasize the not yet fully
appreciated significance of allometric control processes. For dynamic fractals,
I will assemble reasons for attributing to them the capacity to adapt task
execution to contextual changes across a range of scales. The final section
consists of general reflections on the implications of the reviewed data and
identifies what appear to be issues of fundamental importance for future
research in the rapidly evolving topic of this review.
What is life? A perspective of the mathematical kinetic theory of active particles
The modeling of living systems composed of many interacting entities is treated in this paper with the aim of describing their collective behaviors. The mathematical approach is developed within the general framework of the kinetic theory of active particles. The presentation is in three parts. First, we derive the mathematical tools; subsequently, we show how the method can be applied to a number of case studies related to well-defined living systems; and finally, we look ahead to research perspectives.
Attention in Natural Language Processing
Attention is an increasingly popular mechanism used in a wide range of neural architectures. The mechanism itself has been realized in a variety of formats. However, because of the fast-paced advances in this domain, a systematic overview of attention is still missing. In this article, we define a unified model for attention architectures in natural language processing, with a focus on those designed to work with vector representations of the textual data. We propose a taxonomy of attention models according to four dimensions: the representation of the input, the compatibility function, the distribution function, and the multiplicity of the input and/or output. We present examples of how prior information can be exploited in attention models and discuss ongoing research efforts and open challenges in the area, providing the first extensive categorization of the vast body of literature in this exciting domain.
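The two central dimensions named in this taxonomy, the compatibility function and the distribution function, can be illustrated with a minimal sketch. The dot-product compatibility and softmax distribution chosen here are just one common instantiation, not the paper's specific model; all names and parameter shapes are illustrative.

```python
import numpy as np

def softmax(x):
    # Distribution function: turns raw compatibility scores into
    # non-negative weights that sum to 1.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention(query, keys, values):
    """Minimal attention over vector representations:
    dot-product compatibility + softmax distribution."""
    scores = keys @ query       # compatibility function: one score per key
    weights = softmax(scores)   # distribution function over the inputs
    return weights @ values    # weighted combination of the value vectors

# Toy example: one 4-dim query attending over five key/value pairs.
rng = np.random.default_rng(0)
q = rng.normal(size=4)
K = rng.normal(size=(5, 4))
V = rng.normal(size=(5, 3))
out = attention(q, K, V)   # a single 3-dim context vector
```

Swapping in a different compatibility function (additive, multiplicative, scaled) or distribution function (sparsemax, hard attention) moves the model along the corresponding axis of the taxonomy without changing this overall structure.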
Perspectives on adaptive dynamical systems
Adaptivity is a dynamical feature that is omnipresent in nature,
socio-economics, and technology. For example, adaptive couplings appear in
various real-world systems like the power grid, social, and neural networks,
and they form the backbone of closed-loop control strategies and machine
learning algorithms. In this article, we provide an interdisciplinary
perspective on adaptive systems. We reflect on the notion and terminology of
adaptivity in different disciplines and discuss which role adaptivity plays for
various fields. We highlight common open challenges, and give perspectives on
future research directions, looking to inspire interdisciplinary approaches.
Comment: 46 pages, 9 figures
Memristors for the Curious Outsiders
We present both an overview and a perspective of recent experimental advances
and proposed new approaches to performing computation using memristors. A
memristor is a 2-terminal passive component with a dynamic resistance depending
on an internal parameter. We provide a brief historical introduction, as well
as an overview of the physical mechanisms that lead to memristive behavior.
This review is meant to guide nonpractitioners in the field of memristive
circuits and their connection to machine learning and neural computation.
Comment: Perspective paper for MDPI Technologies; 43 pages
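The abstract's definition of a memristor, a two-terminal passive component whose resistance depends on an internal state variable, can be sketched with the widely used linear-drift model. The parameter values below are arbitrary and chosen only for demonstration; this is not the specific model treated in the review.

```python
import numpy as np

# Illustrative linear-drift memristor: resistance interpolates between
# R_ON and R_OFF according to an internal state w in [0, 1], and the
# state drifts in proportion to the current through the device.
R_ON, R_OFF = 100.0, 16e3   # resistance limits (ohms), assumed values
MU, D = 1e-14, 1e-8         # ion mobility (m^2/(V*s)) and thickness (m)

def simulate(v, dt=1e-4, w0=0.1):
    """Euler-integrate the internal state under a voltage drive v(t),
    returning the resulting current waveform."""
    w = w0
    currents = []
    for vk in v:
        r = R_ON * w + R_OFF * (1.0 - w)   # dynamic resistance
        i = vk / r
        w += MU * R_ON / D**2 * i * dt     # state drift driven by current
        w = min(max(w, 0.0), 1.0)          # keep the state in [0, 1]
        currents.append(i)
    return np.array(currents)

t = np.arange(0, 0.1, 1e-4)
current = simulate(np.sin(2 * np.pi * 50 * t))  # sinusoidal drive
```

Because the resistance at each instant depends on the accumulated history of the current, plotting `current` against the drive voltage yields the pinched hysteresis loop that is the standard fingerprint of memristive behavior.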