Neuroethology, Computational
Over the past decade, a number of neural network researchers have used the term computational neuroethology to describe a specific approach to neuroethology. Neuroethology is the study of the neural mechanisms underlying the generation of behavior in animals, and hence it lies at the intersection of neuroscience (the study of nervous systems) and ethology (the study of animal behavior); for an introduction to neuroethology, see Simmons and Young (1999). The definition of computational neuroethology is very similar, but is not quite so dependent on studying animals: animals just happen to be biological autonomous agents. But there are also non-biological autonomous agents, such as some types of robots and some types of simulated embodied agents operating in virtual worlds. In this context, autonomous agents are self-governing entities capable of operating (i.e., coordinating perception and action) for extended periods of time in environments that are complex, uncertain, and dynamic. Thus, computational neuroethology can be characterized as the attempt to analyze the computational principles underlying the generation of behavior in animals and in artificial autonomous agents.
Estimation of brain dynamics under visuomotor task using functional connectivity analysis based on graph theory
Network studies of brain connectivity have demonstrated that the highly connected area, or hub, is a vital feature of human functional and structural brain organization. Hubs identify which regions play an important role in cognitive/sensorimotor tasks. In addition, a complex visuomotor learning skill causes specific changes of neuronal activation across brain regions. Accordingly, this study utilizes the hub as one of the features to map the visuomotor learning tasks and their dynamic functional connectivity (dFC). The electroencephalogram (EEG) data recorded under three different behavior conditions were investigated: motion only (MO), vision only (VO), and tracking (Tra). Here, we used the phase locking value (PLV) with a sliding window (50 ms) to calculate the dFC at four distinct frequency bands: 8-12 Hz (alpha), 18-22 Hz (low beta), 26-30 Hz (high beta), and 38-42 Hz (gamma), and eigenvector centrality for hub identification. A Gaussian mixture model (GMM) was applied to investigate the dFC patterns. The results showed that the dFC patterns with the hub feature represent the characteristics of neuronal activity under visuomotor coordination.
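The dFC pipeline described above (sliding-window PLV followed by eigenvector centrality for hub scoring) can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' code: it assumes the EEG has already been band-pass filtered into one of the listed bands and is stored as a (channels, samples) array, and the window and step sizes are arbitrary choices.

```python
import numpy as np

def analytic_signal(x):
    """Hilbert-transform analytic signal via FFT (NumPy-only stand-in
    for scipy.signal.hilbert), applied along the last axis."""
    n = x.shape[-1]
    X = np.fft.fft(x, axis=-1)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h, axis=-1)

def sliding_plv(eeg, win, step):
    """Dynamic functional connectivity: one PLV matrix per sliding window.
    PLV_ij = |mean_t exp(i * (phi_i(t) - phi_j(t)))| over the window."""
    phase = np.angle(analytic_signal(eeg))          # instantaneous phase
    n_ch, n_s = phase.shape
    mats = []
    for start in range(0, n_s - win + 1, step):
        seg = phase[:, start:start + win]
        diff = seg[:, None, :] - seg[None, :, :]    # pairwise phase differences
        mats.append(np.abs(np.exp(1j * diff).mean(axis=2)))
    return np.array(mats)                           # (windows, n_ch, n_ch)

def eigenvector_centrality(adj, iters=200):
    """Hub score: leading eigenvector of the connectivity matrix,
    computed by power iteration."""
    v = np.ones(adj.shape[0]) / np.sqrt(adj.shape[0])
    for _ in range(iters):
        v = adj @ v
        v /= np.linalg.norm(v)
    return v

# Toy usage: 4 channels, 1 s at 1 kHz, 50 ms non-overlapping windows.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((4, 1000))
dfc = sliding_plv(eeg, win=50, step=50)             # 20 PLV matrices
hubs = eigenvector_centrality(dfc.mean(axis=0))     # time-averaged hub scores
```

PLV values lie in [0, 1] (1 meaning perfectly locked phases), so each windowed matrix can be treated directly as a weighted adjacency matrix for the centrality step; the GMM clustering of the resulting dFC patterns is a separate step not shown here.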
Middleware platform for distributed applications incorporating robots, sensors and the cloud
Cyber-physical systems in the factory of the future will consist of cloud-hosted software governing an agile production process executed by autonomous mobile robots and controlled by analyzing the data from a vast number of sensors. CPSs thus operate on a distributed production-floor infrastructure, and the set-up continuously changes with each new manufacturing task. In this paper, we present our OSGi-based middleware that abstracts the deployment of service-based CPS software components on the underlying distributed platform comprising robots, actuators, sensors and the cloud. Moreover, our middleware provides specific support to develop components based on artificial neural networks, a technique that recently became very popular for sensor data analytics and robot actuation. We demonstrate a system where a robot takes actions based on the input from sensors in its vicinity.
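The core idea of such a middleware, components for sensing, analysis, and actuation decoupled through a message layer so deployment location can change per task, can be illustrated with a minimal publish/subscribe sketch. This is a hypothetical Python illustration of the pattern, not the paper's OSGi API; the class and topic names, and the threshold rule standing in for a neural-network component, are all assumptions.

```python
from collections import defaultdict

class Bus:
    """Topic-based message bus: components communicate by topic name,
    never by direct reference, so they can live on any node."""
    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subs[topic].append(handler)

    def publish(self, topic, msg):
        for handler in self.subs[topic]:
            handler(msg)

class Analyzer:
    """Stand-in for an ANN-based analytics component: maps a proximity
    reading to a robot command (a real component would run inference)."""
    def __init__(self, bus):
        self.bus = bus
        bus.subscribe("sensor/proximity", self.on_reading)

    def on_reading(self, distance_m):
        cmd = "stop" if distance_m < 0.5 else "forward"   # toy decision rule
        self.bus.publish("robot/cmd", cmd)

# Wire up: a sensor publisher, the analyzer, and a robot actuator (recorder).
actions = []
bus = Bus()
Analyzer(bus)
bus.subscribe("robot/cmd", actions.append)   # robot component consumes commands
bus.publish("sensor/proximity", 0.3)         # obstacle close -> stop
bus.publish("sensor/proximity", 2.0)         # path clear -> forward
# actions == ["stop", "forward"]
```

Because the analyzer only knows topic names, the same component could be redeployed from a robot's onboard computer to the cloud without code changes, which is the deployment abstraction the middleware provides.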