
    Attractor neural networks storing multiple space representations: a model for hippocampal place fields

    A recurrent neural network model storing multiple spatial maps, or "charts", is analyzed. A network of this type has been suggested as a model for the origin of place cells in the hippocampus of rodents. The extremely diluted and fully connected limits are studied, and the storage capacity and the information capacity are found. The important parameters determining the performance of the network are the sparsity of the spatial representations and the degree of connectivity, as already found for the storage of individual memory patterns in the general theory of auto-associative networks. These results suggest a quantitative parallel between theories of hippocampal function in different animal species, such as primates (episodic memory) and rodents (memory for space). Comment: 19 RevTeX pages, 8 figures
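The auto-associative retrieval this abstract builds on can be sketched as a minimal Hopfield-style network with Hebbian weights. This is an illustrative toy, not the paper's chart model (which adds spatial structure, sparsity, and dilution); all sizes and the retrieval rule here are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200   # neurons
P = 5     # stored random patterns (the paper's "charts" add spatial structure)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weight matrix with zero self-coupling
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def retrieve(state, steps=10):
    """Synchronous recall dynamics: state -> sign(W @ state)."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1   # break ties deterministically
    return state

# Cue the network with a corrupted version of pattern 0 (20% of bits flipped)
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
cue[flip] *= -1

recalled = retrieve(cue.astype(float))
overlap = recalled @ patterns[0] / N   # 1.0 means perfect recall
```

With load P/N well below capacity, the dynamics clean up the noisy cue; the paper's analysis characterizes how that capacity shrinks with representation sparsity and connectivity.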

    Long Short-Term Memory Spatial Transformer Network

    Spatial transformer networks have been used in layered form in conjunction with a convolutional network to enable the model to transform data spatially. In this paper, we propose a combined spatial transformer network (STN) and Long Short-Term Memory network (LSTM) to classify digits in sequences formed from MNIST elements. This LSTM-STN model has a top-down attention mechanism that benefits from the LSTM layer, so the STN layer can transform individual sequence elements independently, avoiding the distortion that may be caused when the entire sequence is spatially transformed. It also avoids the influence of this distortion on the subsequent classification by convolutional neural networks, achieving a single-digit error of 1.6% compared with 2.2% for a convolutional neural network with an STN layer.
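The spatial-transformer step underlying this model is an affine grid generator followed by differentiable bilinear sampling. A minimal NumPy sketch is below; in the paper the LSTM would predict a transform per sequence element, whereas here `theta` is fixed to the identity, and all names and shapes are illustrative assumptions:

```python
import numpy as np

def affine_grid(theta, H, W):
    """Map output pixel coordinates through a 2x3 affine matrix theta."""
    ys, xs = np.meshgrid(np.linspace(-1, 1, H), np.linspace(-1, 1, W),
                         indexing="ij")
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(H * W)])  # 3 x H*W
    src = theta @ coords                                         # 2 x H*W
    return src[0].reshape(H, W), src[1].reshape(H, W)            # x, y in [-1, 1]

def bilinear_sample(img, xs, ys):
    """Sample img at normalized coordinates via bilinear interpolation."""
    H, W = img.shape
    x = (xs + 1) * (W - 1) / 2
    y = (ys + 1) * (H - 1) / 2
    x0 = np.clip(np.floor(x).astype(int), 0, W - 1)
    y0 = np.clip(np.floor(y).astype(int), 0, H - 1)
    x1 = np.clip(x0 + 1, 0, W - 1)
    y1 = np.clip(y0 + 1, 0, H - 1)
    wx = np.clip(x - x0, 0, 1)
    wy = np.clip(y - y0, 0, 1)
    return ((1 - wx) * (1 - wy) * img[y0, x0] + wx * (1 - wy) * img[y0, x1]
            + (1 - wx) * wy * img[y1, x0] + wx * wy * img[y1, x1])

rng = np.random.default_rng(0)
img = rng.normal(size=(8, 8))
theta = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])   # identity transform
xs, ys = affine_grid(theta, 8, 8)
out = bilinear_sample(img, xs, ys)
# identity theta reproduces the input up to floating-point error
```

Because the sampling is differentiable in `theta`, gradients can flow back into whatever module (here, the LSTM) predicts the transform.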

    Dynamical model of sequential spatial memory: winnerless competition of patterns

    We introduce a new biologically motivated model of sequential spatial memory which is based on the principle of winnerless competition (WLC). We implement this mechanism in a two-layer neural network structure and present the learning dynamics which leads to the formation of a WLC network. After learning, the system is capable of associative retrieval of pre-recorded sequences of spatial patterns. Comment: 4 pages, submitted to PR
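Winnerless competition is commonly formalized with generalized Lotka-Volterra dynamics: asymmetric inhibition makes dominance pass sequentially from one unit to the next rather than settling on a single winner. A minimal sketch with an illustrative (not learned, not the paper's) connection matrix:

```python
import numpy as np

N = 3
alpha, beta = 0.5, 1.8   # weak / strong inhibition (alpha < 1 < beta, alpha + beta > 2)
rho = np.eye(N)
for i in range(N):
    rho[i, (i + 1) % N] = alpha   # unit i is weakly inhibited by its successor
    rho[i, (i - 1) % N] = beta    # and strongly inhibited by its predecessor

def step(a, dt=0.01):
    # Euler step of da_i/dt = a_i * (1 - sum_j rho_ij a_j)
    return a + dt * a * (1.0 - rho @ a)

a = np.array([0.6, 0.3, 0.1])
winners = []
for _ in range(60000):
    a = step(a)
    winners.append(int(np.argmax(a)))
# the dominant unit cycles through all three instead of converging to one
```

The asymmetry condition on `alpha` and `beta` produces a heteroclinic cycle between the single-winner saddle points, which is the dynamical skeleton a WLC sequence memory replays.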

    A Deep Spatio-Temporal Fuzzy Neural Network for Passenger Demand Prediction

    In spite of its importance, passenger demand prediction is a highly challenging problem, because demand is simultaneously influenced by complex interactions among many spatial and temporal factors and by external factors such as weather. To address this problem, we propose a Spatio-TEmporal Fuzzy neural Network (STEF-Net) to accurately predict passenger demand by incorporating the complex interactions of all known important factors. We design an end-to-end learning framework in which different neural networks model different factors. Specifically, we capture spatio-temporal feature interactions via a convolutional long short-term memory network and model external factors via a fuzzy neural network, which handles data uncertainty significantly better than deterministic methods. To preserve temporal relations when fusing the two networks and to emphasize discriminative spatio-temporal feature interactions, we employ a novel feature fusion method with a convolution operation and an attention layer. To the best of our knowledge, our work is the first to fuse a deep recurrent neural network and a fuzzy neural network to model complex spatio-temporal feature interactions with additional uncertain input features for predictive learning. Experiments on a large-scale real-world dataset show that our model achieves more than 10% improvement over state-of-the-art approaches. Comment: https://epubs.siam.org/doi/abs/10.1137/1.9781611975673.1
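The fusion idea, combining two feature streams with a convolution and an attention reweighting, can be sketched as follows. The shapes, the 1x1 mixing, the softmax attention form, and the random stand-in weights are illustrative assumptions, not STEF-Net's actual layers:

```python
import numpy as np

rng = np.random.default_rng(1)
T, H, W = 4, 8, 8                             # time steps and spatial grid
spatial_feats = rng.normal(size=(T, H, W))    # stand-in for the ConvLSTM branch
external_feats = rng.normal(size=(T, H, W))   # stand-in for the fuzzy-network branch

# 1x1 "convolution": a per-cell linear mix of the two channels,
# which keeps the temporal axis intact while fusing the branches
w = rng.normal(size=2)                        # stand-in for learned weights
fused = w[0] * spatial_feats + w[1] * external_feats   # (T, H, W)

# attention over time: softmax scores reweight each step's feature map
scores = fused.mean(axis=(1, 2))              # one scalar score per time step
attn = np.exp(scores - scores.max())
attn /= attn.sum()
context = np.tensordot(attn, fused, axes=1)   # (H, W) attended summary
```

The point of fusing before attending, rather than concatenating branch outputs, is that the attention weights are computed from features that already reflect both the spatio-temporal and the uncertain external signals.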