384 research outputs found
A Theory of Cortical Neural Processing.
This dissertation puts forth an original theory of cortical neural processing, unique in its view of the interplay between chaotic and stable oscillatory neurodynamics, and meant to stimulate new ideas in artificial neural network modeling. The theory is the first to suggest two new purposes for chaotic neurodynamics: (i) as a natural means of representing the uncertainty in the outcome of performed tasks, such as memory retrieval or classification, and (ii) as an automatic way of producing an economical representation of distributed information. The theory grew out of new models we developed to better understand how the cerebral cortex processes information. Common to these models is a neuron interaction function that alternates between excitatory and inhibitory neighborhoods. The theory allows characteristics of the input environment to influence the structural development of the cortex. We view low-intensity chaotic activity as the a priori uncertain base condition of the cortex, resulting from the interaction of a multitude of stronger potential responses. Data that distinguish one response from many others drive bifurcations toward less complex (stable) behavior. Stability appears as temporary bubble-like clusters within the boundaries of cortical columns and begins to propagate through frequency-sensitive and non-specific neurons, but this propagation is limited by destabilizing long-path connections. An original model of the post-natal development of ocular dominance columns in the striate cortex is presented and compared to autoradiographic images from the literature, with good matching results. Finally, experiments are shown to favor a computed update order over traditional approaches for better performance of the pattern-completion process.
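The closing experiment concerns the order in which neurons are updated during pattern completion. As an illustrative sketch (not the dissertation's model), a minimal Hopfield-style associative memory shows where an update order enters the process; the "computed" order used here, sorting units by local field strength, is only a hypothetical stand-in for whatever ordering the thesis evaluates:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weights for a Hopfield-style associative memory."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)          # no self-connections
    return W

def complete(W, probe, order=None, sweeps=5):
    """Asynchronous pattern completion; `order` fixes the update sequence."""
    s = probe.copy()
    idx = np.arange(len(s)) if order is None else np.asarray(order)
    for _ in range(sweeps):
        for i in idx:
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

rng = np.random.default_rng(0)
pattern = rng.choice([-1, 1], size=(1, 32))
W = train_hopfield(pattern)
probe = pattern[0].copy()
probe[:8] *= -1                        # corrupt a quarter of the bits
# a hypothetical "computed" order: strongest local fields first
order = np.argsort(-np.abs(W @ probe))
restored = complete(W, probe, order=order)
print(int(np.sum(restored == pattern[0])))  # prints 32 (all bits recovered)
```

With a single stored pattern the memory recovers it under any update order; the point of the sketch is only to show that asynchronous completion has an order parameter that can be chosen rather than left to convention.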
Complex and Adaptive Dynamical Systems: A Primer
A thorough introduction is given to the field of
quantitative complex system science, with special emphasis on emergence in
dynamical systems based on network topologies. Subjects treated include graph
theory and small-world networks, a generic introduction to the concepts of
dynamical system theory, random Boolean networks, cellular automata and
self-organized criticality, the statistical modeling of Darwinian evolution,
synchronization phenomena and an introduction to the theory of cognitive
systems.
It includes chapters on Graph Theory and Small-World Networks; Chaos,
Bifurcations and Diffusion; Complexity and Information Theory; Random Boolean
Networks; Cellular Automata and Self-Organized Criticality; Darwinian
Evolution, Hypercycles and Game Theory; Synchronization Phenomena; and Elements
of Cognitive System Theory.
Comment: unformatted version of the textbook; published in Springer,
Complexity Series (2008, second edition 2010).
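Several of the listed topics lend themselves to very short programs. As an illustrative sketch of the subject matter of the cellular-automata chapter (not an example taken from the book), an elementary cellular automaton such as rule 110 can be simulated in a few lines:

```python
def step(cells, rule=110):
    """One synchronous update of an elementary cellular automaton.

    The 8-bit rule number encodes the new state for each of the eight
    possible (left, center, right) neighborhoods, with wrap-around edges.
    """
    n = len(cells)
    table = [(rule >> i) & 1 for i in range(8)]
    return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]

row = [0] * 31
row[15] = 1                      # single seed cell
for _ in range(5):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Rule 110 is a standard example because, despite this tiny update rule, its long-run behavior is computationally universal, which is the kind of emergence the book's title refers to.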
Complexity, Emergent Systems and Complex Biological Systems: Complex Systems Theory and Biodynamics. [Edited book by I.C. Baianu, with listed contributors (2011)]
An overview is presented of system dynamics (the study of the behaviour of complex systems), dynamical systems in mathematics, dynamic programming in computer science and control theory, complex systems biology, neurodynamics, and psychodynamics.
Executive Attention, Action Selection and Attention-Based Learning in Neurally Controlled Autonomous Agents
I describe the design and implementation of an integrated neural architecture, modelled on human executive attention, which is used to control both automatic (reactive) and willed action selection in a simulated robot. The model, based upon Norman and Shallice's supervisory attention system, incorporates important features of human attentional control: selection of an intended task over a more salient automatic task; priming of anticipated future tasks; and appropriate persistence of the focus of attention. Recognising that attention-based learning, mediated by the limbic system and the hippocampus in particular, plays an important role in adaptive learning, I extend the Norman and Shallice model with an intrinsic, attention-based learning mechanism that enhances the automaticity of willed actions and reduces the future attentional effort required for dealing with distractions. These enhanced features support a new level of attentional autonomy in the operation of the simulated robot. Some properties of the model are explored using lesion studies, leading to the identification of a correspondence between the behavioural pathologies of the simulated robot and those seen in human patients suffering dysfunction of executive attention.
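The core selection principle, an intended task winning over a more salient automatic one through top-down bias, can be caricatured in a few lines. This is a hypothetical sketch, not the thesis's architecture; the action names and numeric saliences are invented for illustration:

```python
def select_action(salience, willed_bias):
    """Contention-scheduling sketch: bottom-up salience plus a top-down
    attentional bias; the highest-scoring action wins."""
    scores = {a: salience[a] + willed_bias.get(a, 0.0) for a in salience}
    return max(scores, key=scores.get)

# The automatic task is more salient, but attention biases the intended one.
salience = {"answer_phone": 0.9, "finish_report": 0.6}
print(select_action(salience, {}))                         # answer_phone
print(select_action(salience, {"finish_report": 0.5}))     # finish_report
```

In the full model the bias would itself decay and adapt with learning; here it is just a constant, which is enough to show why an intended but less salient task can still be selected.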
Optimal use of computing equipment in an automated industrial inspection context
This thesis deals with automatic defect detection. The objective was to develop the techniques required by a small manufacturing business to make cost-efficient use of inspection technology. In our work on inspection techniques we discuss image acquisition and the choice between custom and general-purpose processing hardware. We examine the classes of general-purpose computer available and study popular operating systems in detail. We highlight the advantages of a hybrid system interconnected via a local area network and develop a sophisticated suite of image-processing software based on it. We quantitatively study the performance of elements of the TCP/IP networking protocol suite and comment on appropriate protocol selection for parallel distributed applications. We implement our own distributed application based on these findings. In our work on inspection algorithms we investigate the potential uses of iterated function systems and Fourier transform operators when preprocessing images of defects in aluminium plate acquired using a linescan camera. We employ a multi-layer perceptron neural network trained by backpropagation as a classifier. We examine the effect on the training process of the number of nodes in the hidden layer and the ability of the network to identify faults in images of aluminium plate. We investigate techniques for introducing positional independence into the network's behaviour. We analyse the pattern of weights induced in the network after training in order to gain insight into the logic of its internal representation. We conclude that the backpropagation training process is computationally intensive enough to present a real barrier to further development of practical neural network techniques, and we seek ways to achieve a speed-up. We consider the training process as a search problem and arrive at a process involving multiple, parallel search "vectors" and aspects of genetic algorithms. We implement the system as the aforementioned distributed application and comment on its performance.
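The classifier described above can be sketched in miniature. This toy example is not the thesis's implementation: it trains a one-hidden-layer perceptron on XOR rather than on aluminium-plate images, using plain NumPy, purely to show the backpropagation update that the thesis identifies as the computational bottleneck:

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)       # XOR target

n_hidden = 8                                          # hidden-layer width
W1 = rng.normal(0.0, 1.0, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 1.0, (n_hidden, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10_000):
    h = sigmoid(X @ W1 + b1)                          # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1.0 - out)             # output-layer error
    d_h = (d_out @ W2.T) * h * (1.0 - h)              # backpropagated error
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print((out > 0.5).astype(int).ravel())
```

Even on four training points the inner loop is a sequence of dense matrix products repeated thousands of times, which makes concrete why the thesis treats training as a search problem worth parallelising.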
18th IEEE Workshop on Nonlinear Dynamics of Electronic Systems: Proceedings
Proceedings of the 18th IEEE Workshop on Nonlinear Dynamics of Electronic Systems, which took place in Dresden, Germany, 26–28 May 2010.
Welcome Address ........................ Page I
Table of Contents ........................ Page III
Symposium Committees .............. Page IV
Special Thanks ............................. Page V
Conference program (incl. page numbers of papers) ... Page VI
Conference papers:
  Invited talks ................................ Page 1
  Regular Papers ........................... Page 14
    Wednesday, May 26th, 2010 ......... Page 15
    Thursday, May 27th, 2010 .......... Page 110
    Friday, May 28th, 2010 ............... Page 210
Author index ............................... Page XII
- …