3 research outputs found

    Representing Where along with What Information in a Model of a Cortical Patch

    Behaving in the real world requires flexibly combining and maintaining information about both continuous and discrete variables. In the visual domain, several lines of evidence show that neurons in some cortical networks can simultaneously represent information about the position and identity of objects, and maintain this combined representation when the object is no longer present. The underlying network mechanism for this combined representation is, however, unknown. In this paper, we approach this issue through a theoretical analysis of recurrent networks. We present a model of a cortical network that can retrieve information about the identity of objects from incomplete transient cues, while simultaneously representing their spatial position. Our results show that two factors are important in making this possible: (a) a metric organisation of the recurrent connections, and (b) a spatially localised change in the linear gain of neurons. Metric connectivity enables a localised retrieval of information about object identity, while gain modulation ensures localisation in the correct position. Importantly, we find that the amount of information that the network can retrieve and retain about identity is strongly affected by the amount of information it maintains about position. This balance can be controlled by global signals that change the neuronal gain. These results show that anatomical and physiological properties, which have long been known to characterise cortical networks, naturally endow them with the ability to maintain a conjunctive representation of the identity and location of objects.
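    The two ingredients the abstract names (metric connectivity and a spatially localised gain field) are easy to illustrate. The following is a minimal numpy sketch, not the paper's actual model: a Hebbian attractor network on a ring, with distance-dependent weights and a Gaussian gain bump at an "attended" position, retrieving a stored identity pattern from an incomplete cue. All parameter values, the global-inhibition step, and the names used are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200        # threshold-linear units on a ring (a 1-D "cortical patch")
P = 5          # stored identity patterns
a = 0.2        # pattern sparsity

# Hebbian (covariance-rule) storage of the identity patterns
xi = (rng.random((P, N)) < a).astype(float)
J = (xi - a).T @ (xi - a) / (a * (1 - a) * N)
np.fill_diagonal(J, 0.0)

# (a) metric connectivity: synaptic strength decays with ring distance
idx = np.arange(N)
d = np.abs(np.subtract.outer(idx, idx))
d = np.minimum(d, N - d)
J *= np.exp(-d / 20.0)

# (b) spatially localised gain modulation around an "attended" position
center = 60
dc = np.minimum(np.abs(idx - center), N - np.abs(idx - center))
gain = 0.5 + 1.5 * np.exp(-dc**2 / (2 * 15.0**2))

# incomplete transient cue: 30% of pattern 0's active units
r = xi[0] * (rng.random(N) < 0.3)

for _ in range(60):
    h = J @ r                         # recurrent input
    r = gain * np.maximum(h, 0.0)     # rectified units with local linear gain
    r *= a * N / (r.sum() + 1e-12)    # global inhibition fixes total activity

overlap = xi @ r / (np.linalg.norm(xi, axis=1) * np.linalg.norm(r) + 1e-12)
print("cosine overlap with each stored pattern:", np.round(overlap, 2))
print("mean rate near attended position:", r[center-20:center+20].mean().round(3))
print("mean rate opposite on the ring:  ", r[center+N//2-20:center+N//2+20].mean().round(3))
```

    In this toy, flattening or sharpening the gain field trades spatial localisation against retrieved identity information, in the spirit of the global gain signals the abstract describes.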

    ForeNet: Fourier recurrent neural networks for time series prediction.

    Ying-Qian Zhang. Thesis (M.Phil.), Chinese University of Hong Kong, 2001. Includes bibliographical references (leaves 115-124). Abstracts in English and Chinese.

    Contents:
        Abstract
        Acknowledgement
        Chapter 1  Introduction
            1.1  Background
            1.2  Objective
            1.3  Contributions
            1.4  Thesis Overview
        Chapter 2  Literature Review
            2.1  Takens' Theorem
            2.2  Linear Models for Prediction
                2.2.1  Autoregressive Model
                2.2.2  Moving Average Model
                2.2.3  Autoregressive-moving Average Model
                2.2.4  Fitting a Linear Model to a Given Time Series
                2.2.5  State-space Reconstruction
            2.3  Neural Network Models for Time Series Processing
                2.3.1  Feed-forward Neural Networks
                2.3.2  Recurrent Neural Networks
                2.3.3  Training Algorithms for Recurrent Networks
            2.4  Combining Neural Networks and Other Approximation Techniques
        Chapter 3  ForeNet: Model and Representation
            3.1  Fourier Recursive Prediction Equation
                3.1.1  Fourier Analysis of Time Series
                3.1.2  Recursive Form
            3.2  Fourier Recurrent Neural Network Model (ForeNet)
                3.2.1  Neural Network Representation
                3.2.2  Architecture of ForeNet
        Chapter 4  ForeNet: Implementation
            4.1  Improvements on ForeNet
                4.1.1  Number of Hidden Neurons
                4.1.2  Real-valued Outputs
            4.2  Parameter Initialization
            4.3  Application of ForeNet: the Process of Time Series Prediction
            4.4  Some Implications
        Chapter 5  ForeNet: Initialization
            5.1  Unfolded Form of ForeNet
            5.2  Coefficient Analysis
                5.2.1  Analysis of the Coefficient Set v_n
                5.2.2  Analysis of the Coefficient Set mu_n(d)
            5.3  Experiments on ForeNet Initialization
                5.3.1  Objective and Experiment Setting
                5.3.2  Prediction of Sunspot Series
                5.3.3  Prediction of Mackey-Glass Series
                5.3.4  Prediction of Laser Data
                5.3.5  Three More Series
            5.4  Some Implications of the Proposed Initialization Method
        Chapter 6  ForeNet: Learning Algorithms
            6.1  Complex Real-Time Recurrent Learning (CRTRL)
            6.2  Batch-mode Learning
            6.3  Time Complexity
            6.4  Property Analysis and Experimental Results
                6.4.1  Efficient initialization: compared with random initialization
                6.4.2  Complex-valued network: compared with real-valued network
                6.4.3  Simple architecture: compared with ring-structure RNN
                6.4.4  Linear model: compared with nonlinear ForeNet
                6.4.5  Small number of hidden units
            6.5  Comparison with Some Other Models
                6.5.1  Comparison with AR model
                6.5.2  Comparison with TDNN Networks and FIR Networks
                6.5.3  Comparison with a few more results
            6.6  Summary
        Chapter 7  Learning and Prediction: On-Line Training
            7.1  On-Line Learning Algorithm
                7.1.1  Advantages and Disadvantages
                7.1.2  Training Process
            7.2  Experiments
            7.3  Predicting Stock Time Series
        Chapter 8  Discussions and Conclusions
            8.1  Limitations of ForeNet
            8.2  Advantages of ForeNet
            8.3  Future Work
        Bibliography
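    The listing reproduces only the table of contents, not the thesis's equations, but Chapters 3 and 5 name the core idea: derive a recurrent network from a recursive form of the Fourier analysis of the series, and initialize it from Fourier coefficients rather than random weights. As a hedged illustration of that idea (not ForeNet's actual equations, nor the CRTRL training of Chapter 6), the sketch below runs a complex-valued linear recurrence whose recurrent weights are set to the Fourier frequencies exp(i*2*pi*k/H), then fits only a real-valued readout by least squares. The toy series and every parameter value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy series standing in for the thesis benchmarks (sunspot, Mackey-Glass, laser)
T = 600
t = np.arange(T)
x = np.sin(2*np.pi*t/17) + 0.5*np.sin(2*np.pi*t/43) + 0.05*rng.standard_normal(T)

H = 16                            # complex-valued hidden units
k = np.arange(H)
W = 0.99 * np.exp(2j*np.pi*k/H)   # recurrent weights at Fourier frequencies, slightly damped
V = np.ones(H, dtype=complex)     # input weights (left untrained in this sketch)

# run the diagonal complex recurrence s(t) = W * s(t-1) + V * x(t)
S = np.zeros((T, H), dtype=complex)
s = np.zeros(H, dtype=complex)
for i in range(T):
    s = W * s + V * x[i]
    S[i] = s

# train only the real-valued readout by least squares: x(t+1) ~ [Re s(t), Im s(t)] @ u
burn = 100
A = np.hstack([S[burn:T-1].real, S[burn:T-1].imag])
y = x[burn+1:T]
u, *_ = np.linalg.lstsq(A, y, rcond=None)
rmse = np.sqrt(np.mean((A @ u - y)**2))
print(f"one-step-ahead RMSE on the toy series: {rmse:.3f}")
```

    Because the recurrence is diagonal, each hidden unit is a damped oscillator at a fixed frequency, so the hidden state is a running Fourier-like decomposition of the input; starting from these frequencies instead of random weights is the contrast the Chapter 5 experiments draw (see 6.4.1, "Efficient initialization: compared with random initialization").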