Lifelong Learning of Spatiotemporal Representations with Dual-Memory Recurrent Self-Organization
Artificial autonomous agents and robots interacting in complex environments
are required to continually acquire and fine-tune knowledge over sustained
periods of time. The ability to learn from continuous streams of information is
referred to as lifelong learning and represents a long-standing challenge for
neural network models due to catastrophic forgetting. Computational models of
lifelong learning typically alleviate catastrophic forgetting in experimental
scenarios with given datasets of static images and limited complexity, thereby
differing significantly from the conditions artificial agents are exposed to.
In more natural settings, sequential information may become progressively
available over time and access to previous experience may be restricted. In
this paper, we propose a dual-memory self-organizing architecture for lifelong
learning scenarios. The architecture comprises two growing recurrent networks
with the complementary tasks of learning object instances (episodic memory) and
categories (semantic memory). Both growing networks can expand in response to
novel sensory experience: the episodic memory learns fine-grained
spatiotemporal representations of object instances in an unsupervised fashion
while the semantic memory uses task-relevant signals to regulate structural
plasticity levels and develop more compact representations from episodic
experience. For the consolidation of knowledge in the absence of external
sensory input, the episodic memory periodically replays trajectories of neural
reactivations. We evaluate the proposed model on the CORe50 benchmark dataset
for continuous object recognition, showing that we significantly outperform
current methods of lifelong learning in three different incremental learning
scenarios.
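The growth mechanism described above can be illustrated with a minimal sketch of a growing self-organizing network: a new unit is inserted whenever the best-matching unit's activation for an input falls below a threshold, and stored prototypes can later be replayed as pseudo-samples for consolidation. This is an illustrative simplification, not the paper's model, which uses recurrent (Gamma) units and separate episodic and semantic memories; the class name, threshold, and learning rate below are assumptions.

```python
import numpy as np

class GrowingNetwork:
    """Minimal growing self-organizing memory (illustrative sketch).

    Grows a new unit when the winner's activation falls below
    `insert_threshold`; otherwise adapts the winner toward the input.
    """

    def __init__(self, dim, insert_threshold=0.85, lr=0.1):
        self.insert_threshold = insert_threshold
        self.lr = lr
        self.units = np.empty((0, dim))  # one prototype vector per unit

    def train_step(self, x):
        x = np.asarray(x, dtype=float)
        if len(self.units) == 0:
            self.units = x[None, :].copy()
            return 0
        dists = np.linalg.norm(self.units - x, axis=1)
        b = int(np.argmin(dists))          # best-matching unit
        activation = np.exp(-dists[b])
        if activation < self.insert_threshold:
            # Novel input: insert a unit between the input and the winner.
            new = 0.5 * (x + self.units[b])
            self.units = np.vstack([self.units, new])
            return len(self.units) - 1
        # Familiar input: adapt the winner toward the sample.
        self.units[b] += self.lr * (x - self.units[b])
        return b

    def replay(self):
        """Yield stored prototypes as pseudo-samples for consolidation."""
        for u in self.units:
            yield u.copy()

net = GrowingNetwork(dim=2)
net.train_step([0.0, 0.0])     # first input seeds the network
net.train_step([10.0, 10.0])   # dissimilar input triggers growth
net.train_step([0.0, 0.01])    # similar input only adapts the winner
print(len(net.units))          # 2 units after these three inputs
```

In the architecture the abstract describes, such replayed prototypes from the episodic network would drive further training of the semantic network in the absence of external sensory input.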
Damage to Association Fiber Tracts Impairs Recognition of the Facial Expression of Emotion
An array of cortical and subcortical structures has been implicated in the recognition of emotion from facial expressions. It remains unknown how these regions communicate as parts of a system to achieve recognition, but white matter tracts are likely critical to this process. We hypothesized that (1) damage to white matter tracts would be associated with recognition impairment and (2) the degree of disconnection of association fiber tracts [inferior longitudinal fasciculus (ILF) and/or inferior fronto-occipital fasciculus (IFOF)] connecting the visual cortex with emotion-related regions would negatively correlate with recognition performance. One hundred three patients with focal, stable brain lesions mapped onto a reference brain were tested on their recognition of six basic emotional facial expressions. Association fiber tracts from a probabilistic atlas were coregistered to the reference brain. Parameters estimating disconnection were entered in a general linear model to predict emotion recognition impairments, accounting for lesion size and cortical damage. Damage associated with the right IFOF significantly predicted an overall facial emotion recognition impairment and specific impairments for sadness, anger, and fear. One subject had a pure white matter lesion in the location of the right IFOF and ILF. He presented specific, unequivocal emotion recognition impairments. Additional analysis suggested that impairment in fear recognition can result from damage to the IFOF and not the amygdala. Our findings demonstrate the key role of white matter association tracts in the recognition of the facial expression of emotion and identify specific tracts that may be most critical.
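The analysis style described above, a general linear model predicting recognition performance from tract disconnection while controlling for a nuisance covariate such as lesion size, can be sketched as follows. All data here are synthetic and the effect sizes are assumptions for illustration; this is not the study's data or exact model specification.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 103  # same sample size as the study; the data below are simulated

# Synthetic per-patient predictors: tract disconnection estimates and lesion size
ifof = rng.uniform(0, 1, n)         # right IFOF disconnection (0 = intact)
ilf = rng.uniform(0, 1, n)          # right ILF disconnection
lesion_size = rng.uniform(0, 1, n)  # nuisance covariate

# Simulated recognition score: worse with IFOF disconnection (assumed effect)
score = 1.0 - 0.6 * ifof + 0.1 * rng.normal(size=n)

# General linear model: score ~ intercept + IFOF + ILF + lesion size
X = np.column_stack([np.ones(n), ifof, ilf, lesion_size])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(beta)  # beta[1] recovers a strongly negative IFOF coefficient
```

With this design, a reliably negative coefficient on the IFOF regressor, after accounting for lesion size, is the pattern that would support the disconnection hypothesis.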