Winner-relaxing and winner-enhancing Kohonen maps: Maximal mutual information from enhancing the winner
The magnification behaviour of a generalized family of self-organizing feature maps, the Winner-Relaxing and Winner-Enhancing Kohonen algorithms, is analyzed via the magnification law, which can be obtained analytically in the one-dimensional case. The Winner-Enhancing case makes it possible to achieve a magnification exponent of one and therefore provides optimal mapping in the
sense of information theory. A numerical verification of the magnification law
is included, and the ordering behaviour is analyzed. Compared to the original
Self-Organizing Map and some other approaches, the generalized Winner-Enhancing
Algorithm requires minimal extra computations per learning step and is
conveniently easy to implement.
Comment: 6 pages, 5 figures. For an extended version refer to cond-mat/0208414 (Neural Computation 17, 996-1009).
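The winner-relaxing/enhancing idea can be illustrated with a minimal sketch: a one-dimensional SOM whose winner receives one extra update term scaled by a parameter lambda. All parameters and the exact form of the extra term here are illustrative assumptions, not the paper's precise learning rule; lambda = 0 recovers the original Kohonen map.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som_1d(n_units=50, n_steps=20000, eps=0.05, sigma=2.0, lam=0.5):
    """1-D SOM with a schematic winner-relaxing/enhancing term.

    lam = 0 is the plain Kohonen map; lam != 0 adds an extra update
    for the winner (sign and magnitude are illustrative only).
    """
    w = np.sort(rng.random(n_units))               # codebook vectors on [0, 1]
    idx = np.arange(n_units)
    for _ in range(n_steps):
        x = rng.random()                           # stimulus, uniform on [0, 1]
        s = np.argmin(np.abs(w - x))               # winner unit
        h = np.exp(-0.5 * ((idx - s) / sigma) ** 2)  # neighborhood kernel
        w += eps * h * (x - w)                     # standard Kohonen step
        # extra winner-only term: relaxing (lam > 0) or enhancing (lam < 0)
        w[s] -= eps * lam * np.sum(h[idx != s] * (x - w[idx != s]))
    return np.sort(w)

w = train_som_1d()
```

For uniform input the trained codebook should tile the unit interval; the magnification exponent (how codebook density scales with input density) is what the extra term shifts.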
Mathematical Foundations of Consciousness
We employ the Zermelo-Fraenkel Axioms that characterize sets as mathematical
primitives. The Anti-foundation Axiom plays a significant role in our
development since, among its other features, its substitution for the Axiom
of Foundation in the Zermelo-Fraenkel Axioms motivates Platonic
interpretations. These interpretations also depend on such allied notions for
sets as pictures, graphs, decorations, labelings and various mappings that we
use. A syntax and semantics of operators acting on sets is developed. Such
features enable construction of a theory of non-well-founded sets that we use
to frame mathematical foundations of consciousness. To do this we introduce a
supplementary axiomatic system that characterizes experience and consciousness
as primitives. The new axioms proceed through characterization of so-called
consciousness operators. The Russell operator plays a central role and is shown
to be one example of a consciousness operator. Neural networks supply striking
examples of non-well-founded graphs the decorations of which generate
associated sets, each with a Platonic aspect. Employing our foundations, we
show how the supervening of consciousness on its neural correlates in the brain
enables the framing of a theory of consciousness by applying appropriate
consciousness operators to the generated sets in question.
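The Russell operator mentioned above has a concrete definition, R(x) = { y in x : y not in y }. A small illustration in Python (my example, not the paper's formalism) shows why the Anti-foundation Axiom matters: Python's frozensets are well-founded, so no set contains itself and R reduces to the identity.

```python
def russell(x):
    """Russell operator: R(x) = { y in x : y not in y }."""
    return frozenset(y for y in x if y not in y)

# hereditarily finite sets built from the empty set
a = frozenset()            # {}
b = frozenset({a})         # { {} }
c = frozenset({a, b})      # { {}, { {} } }
```

On these well-founded sets russell(c) == c; the nontrivial action that qualifies R as a consciousness operator appears only on the non-well-founded sets the paper constructs via the Anti-foundation Axiom.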
Tailored ensembles of neural networks optimize sensitivity to stimulus statistics
The dynamic range of stimulus processing in living organisms is much larger
than a single neural network can explain. For a generic, tunable spiking
network we derive that while the dynamic range is maximal at criticality, the
interval of discriminable intensities is very similar for any network tuning
due to coalescence. Compensating coalescence enables adaptation of
discriminable intervals. Thus, we can tailor an ensemble of networks optimized
to the distribution of stimulus intensities, e.g., extending the dynamic range
arbitrarily. We discuss potential applications in machine learning.
Comment: 6 pages plus supplemental material.
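The notion of a dynamic range that peaks at criticality can be sketched with a Kinouchi-Copelli-style mean-field calculation: solve for the stationary activity F(h) of a branching network as a function of stimulus rate h, then measure the decades of h spanned between 10% and 90% of the response. This is a textbook-style sketch under simplifying assumptions (no refractory period, no coalescence correction from the paper); parameters are illustrative.

```python
import numpy as np

def response(h, sigma, K=10, n_iter=500):
    """Stationary activity of a mean-field branching network:
    each unit has K neighbours, branching parameter sigma."""
    F = 0.5
    for _ in range(n_iter):
        F = 1.0 - (1.0 - h) * (1.0 - sigma * F / K) ** K
    return F

def dynamic_range(sigma):
    """Dynamic range in dB: decades of stimulus between the 10% and
    90% points of the response curve."""
    hs = np.logspace(-6, 0, 400)
    F = np.array([response(h, sigma) for h in hs])
    F0, Fmax = response(1e-12, sigma), F[-1]
    lo = F0 + 0.1 * (Fmax - F0)
    hi = F0 + 0.9 * (Fmax - F0)
    h10 = hs[np.searchsorted(F, lo)]       # stimulus at 10% response
    h90 = hs[np.searchsorted(F, hi)]       # stimulus at 90% response
    return 10.0 * np.log10(h90 / h10)

d_crit = dynamic_range(1.0)                # critical tuning
d_sub = dynamic_range(0.8)                 # subcritical tuning
```

The abstract's ensemble idea corresponds to covering the stimulus axis with several such curves, each tuned so its discriminable interval sits over a different part of the intensity distribution.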
Self-organization without conservation: Are neuronal avalanches generically critical?
Recent experiments on cortical neural networks have revealed the existence of
well-defined avalanches of electrical activity. Such avalanches have been
claimed to be generically scale-invariant -- i.e. power-law distributed -- with
many exciting implications in Neuroscience. Recently, a self-organized model
has been proposed by Levina, Herrmann and Geisel to justify such an empirical
finding. Given that (i) neural dynamics is dissipative and (ii) there is a
loading mechanism "charging" progressively the background synaptic strength,
this model/dynamics is very similar in spirit to forest-fire and earthquake
models, archetypical examples of non-conserving self-organization, which have
been recently shown to lack true criticality. Here we show that cortical neural
networks obeying (i) and (ii) are not generically critical; unless parameters
are fine tuned, their dynamics is either sub- or super-critical, even if the
pseudo-critical region is relatively broad. This conclusion seems to be in
agreement with the most recent experimental observations. The main implication
of our work is that, if future experimental research on cortical networks were
to support that truly critical avalanches are the norm and not the exception,
then one should look for more elaborate (adaptive/evolutionary) explanations,
beyond simple self-organization, to account for this.
Comment: 28 pages, 11 figures, regular paper.
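The class of dynamics described by (i) and (ii) can be sketched with a minimal non-conserving avalanche model: neurons accumulate potential under a slow external drive (the "loading"), spikes redistribute only a fraction alpha < theta of the dissipated potential, and avalanche sizes are recorded. This is a schematic stand-in, not the Levina-Herrmann-Geisel model itself (which uses depressing dynamical synapses); with alpha below 1 the sketch is subcritical, consistent with the paper's conclusion that such dynamics are not generically critical.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_avalanches(N=200, alpha=0.9, theta=1.0, n_avalanches=1000):
    """Non-conserving avalanche model: each spike dissipates theta but
    redistributes only alpha < theta to the rest of the network."""
    u = rng.random(N) * theta                  # membrane potentials
    sizes = []
    for _ in range(n_avalanches):
        # slow drive taken to the separation-of-timescales limit:
        # push the network until the most depolarized neuron fires
        u += theta - u.max() + 1e-12
        size = 0
        while (spiking := u >= theta).any():
            k = int(spiking.sum())
            size += k
            u[spiking] -= theta                # reset: dissipates k * theta
            u[~spiking] += alpha * k / N       # redistribute only alpha * k
        sizes.append(size)
    return np.array(sizes)

sizes = simulate_avalanches()
```

Plotting a histogram of `sizes` for alpha slightly below versus slightly above the critical value would show the sub-/super-critical behaviour the abstract describes, with power-law-like statistics only in a fine-tuned pseudo-critical region.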
Short-term Demand Forecasting for Online Car-hailing Services using Recurrent Neural Networks
Short-term traffic flow prediction is one of the crucial issues in intelligent transportation systems, an important component of smart cities.
Accurate predictions can enable both the drivers and the passengers to make
better decisions about their travel route, departure time and travel origin
selection, which can be helpful in traffic management. Multiple models and
algorithms based on time series prediction and machine learning were applied to
this issue and achieved acceptable results. Recently, the availability of sufficient data and computational power has motivated us to improve the prediction
accuracy via deep-learning approaches. Recurrent neural networks have become
one of the most popular methods for time series forecasting; however, due to the variety of these networks, the question of which type is most appropriate for this task remains open. In this paper, we use three kinds of recurrent neural networks, namely simple RNN units, GRU, and LSTM networks, to predict short-term traffic flow. The dataset from TAP30
Corporation is used for building the models and comparing RNNs with several
well-known models, such as DEMA, LASSO and XGBoost. The results show that all
three types of RNNs outperform the others; however, simpler RNNs such as simple recurrent units and GRU perform better than LSTM in terms of accuracy and training time.
Comment: arXiv admin note: text overlap with arXiv:1706.06279, arXiv:1804.04176 by other authors.
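The GRU's relative lightness over LSTM comes from having two gates instead of three and no separate cell state. A minimal NumPy forward pass makes the gating explicit; weights, sizes, and the toy demand series are illustrative (a real system would use a deep-learning framework and train the parameters).

```python
import numpy as np

rng = np.random.default_rng(0)

def gru_cell(x, h, params):
    """One GRU step: update gate z, reset gate r, candidate state h_tilde."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(x @ Wz + h @ Uz + bz)              # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)              # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh + bh)  # candidate state
    return (1.0 - z) * h + z * h_tilde             # blend old and new state

def init_params(n_in, n_hid):
    shapes = [(n_in, n_hid), (n_hid, n_hid), (n_hid,)] * 3
    return [rng.normal(0.0, 0.1, s) for s in shapes]

# roll the cell over a toy demand series (one feature per time step)
n_in, n_hid = 1, 8
params = init_params(n_in, n_hid)
series = np.sin(np.linspace(0, 6, 30)).reshape(-1, 1)
h = np.zeros(n_hid)
for x_t in series:
    h = gru_cell(x_t, h, params)
```

An LSTM cell would add a third (output) gate and a separate cell state, which is the extra bookkeeping the abstract's training-time comparison reflects.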
Proceedings of Abstracts Engineering and Computer Science Research Conference 2019
© 2019 The Author(s). This is an open-access work distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. For further details please see https://creativecommons.org/licenses/by/4.0/.

Note: the keynote "Fluorescence visualisation to evaluate effectiveness of personal protective equipment for infection control" is © 2019 Crown copyright and so is licensed under the Open Government Licence v3.0. Under this licence users are permitted to copy, publish, distribute and transmit the Information; adapt the Information; and exploit the Information commercially and non-commercially, for example by combining it with other Information or by including it in their own product or application. Where you do any of the above, you must acknowledge the source of the Information in your product or application by including or linking to any attribution statement specified by the Information Provider(s) and, where possible, provide a link to this licence: http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/

This book is the record of abstracts submitted and accepted for presentation at the Inaugural Engineering and Computer Science Research Conference held 17th April 2019 at the University of Hertfordshire, Hatfield, UK. This conference is a local event aiming to bring together research students, staff and eminent external guests to celebrate Engineering and Computer Science research at the University of Hertfordshire. The ECS Research Conference aims to showcase the broad landscape of research taking place in the School of Engineering and Computer Science. The 2019 conference was articulated around three topical cross-disciplinary themes: Make and Preserve the Future; Connect the People and Cities; and Protect and Care.
Fast Inference of Interactions in Assemblies of Stochastic Integrate-and-Fire Neurons from Spike Recordings
We present two Bayesian procedures to infer the interactions and external
currents in an assembly of stochastic integrate-and-fire neurons from the
recording of their spiking activity. The first procedure is based on the exact
calculation of the most likely time courses of the neuron membrane potentials
conditioned by the recorded spikes, and is exact for a vanishing noise variance
and for an instantaneous synaptic integration. The second procedure takes into
account the presence of fluctuations around the most likely time courses of the
potentials, and can deal with moderate noise levels. The running time of both
procedures is proportional to the number S of spikes multiplied by the square of the number N of neurons. The algorithms are validated on synthetic data generated
by networks with known couplings and currents. We also reanalyze previously
published recordings of the activity of the salamander retina (including from
32 to 40 neurons, and from 65,000 to 170,000 spikes). We study the dependence
of the inferred interactions on the membrane leaking time; the differences and
similarities with the classical cross-correlation analysis are discussed.
Comment: Accepted for publication in J. Comput. Neurosci. (Dec 2010).
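The validation step (synthetic data from a network with known couplings and currents) can be sketched with a simple discrete-time stochastic leaky integrate-and-fire simulation. Everything here is an illustrative stand-in for the paper's generative model: coupling scale, noise level, time constants and the discrete-time update are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_sif(J, I, T=1000, dt=1.0, tau=20.0, theta=1.0, noise=0.05):
    """Discrete-time stochastic leaky integrate-and-fire network.

    J[i, j]: coupling from neuron j onto neuron i (known ground truth).
    I[i]: constant external current. Returns a (T, N) binary spike array.
    """
    N = len(I)
    V = np.zeros(N)
    spikes = np.zeros((T, N), dtype=int)
    for t in range(T):
        # synaptic input from the previous time step's spikes
        syn = J @ spikes[t - 1] if t > 0 else np.zeros(N)
        # leaky integration of current, synaptic input and noise
        V += dt * (I - V / tau) + syn + noise * rng.normal(size=N)
        fired = V >= theta
        spikes[t] = fired
        V[fired] = 0.0                     # reset to rest after a spike
    return spikes

N = 5
J = 0.2 * rng.normal(size=(N, N))          # known ground-truth couplings
I = rng.uniform(0.02, 0.06, N)             # known external currents
spikes = simulate_sif(J, I)
```

An inference procedure of the kind the abstract describes would then be run on `spikes` alone, and the recovered couplings compared against the ground-truth `J`.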
The Decision Value Computations in the vmPFC and Striatum Use a Relative Value Code That is Guided by Visual Attention
There is a growing consensus in behavioral neuroscience that the brain makes simple choices by first assigning a value to the options under consideration and then comparing them. Two important open questions are whether the brain encodes absolute or relative value signals, and what role attention might play in these computations. We investigated these questions using a human fMRI experiment with
a binary choice task in which the fixations to both stimuli were exogenously manipulated to control for the role of visual attention in the valuation computation. We found that the ventromedial prefrontal cortex and the ventral striatum encoded fixation-dependent relative value signals: activity in these areas correlated with the difference in value between the attended and the unattended items. These attention-modulated relative value signals might serve as the input of a comparator system that is used to make a choice.
A Functional Architecture Approach to Neural Systems
The technology for the design of systems to perform extremely complex combinations of real-time functionality has developed over a long period. This technology is based on the use of a hardware architecture with a physical separation into memory and processing, and a software architecture which divides functionality into a disciplined hierarchy of software components which exchange unambiguous information. This technology experiences difficulty in the design of systems to perform parallel processing, and extreme difficulty in the design of systems which can heuristically change their own functionality. These limitations derive from the approach to information exchange between functional components. A design approach in which functional components can exchange ambiguous information leads to systems with the recommendation architecture, which are less subject to these limitations. Biological brains have been constrained by natural pressures to adopt functional architectures with this different information exchange approach. Neural networks have not made a complete shift to the use of ambiguous information, and do not address adequate management of context for ambiguous information exchange between modules. As a result such networks cannot be scaled to complex functionality. Simulations of systems with the recommendation architecture demonstrate the capability to heuristically organize to perform complex functionality.