A New Oscillating-Error Technique for Classifiers
This paper describes a new method for reducing the error in a classifier. It
uses an error correction update that includes the very simple rule of either
adding or subtracting the error adjustment, based on whether the variable value
is currently larger or smaller than the desired value. While a traditional
neuron would sum the inputs together and then apply a function to the total,
this new method can change the function decision for each input value. This
gives added flexibility to the convergence procedure, where through a series of
transpositions, variables that are far away can continue towards the desired
value, whereas variables that are originally much closer can oscillate from one
side to the other. Tests show that the method can successfully classify some
benchmark datasets. It can also work in a batch mode, with reduced training
times, and can be used as part of a neural network architecture. Some
comparisons with an earlier wave shape paper are also made.
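The add-or-subtract update rule described above can be sketched in a few lines. The function name, the fixed adjustment size, and the toy convergence loop below are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch of an oscillating-error style update: each variable moves
# toward its target by a fixed adjustment, subtracting when it is too large
# and adding when it is too small.

def oscillating_update(values, targets, adjustment):
    """One update pass: step each variable toward its target."""
    updated = []
    for v, t in zip(values, targets):
        if v > t:
            updated.append(v - adjustment)   # too large: step down
        else:
            updated.append(v + adjustment)   # too small: step up
    return updated

# A far-away variable keeps moving toward the target, while a variable
# already within one adjustment of the target oscillates around it.
values = [0.0, 0.95]
targets = [1.0, 1.0]
for _ in range(5):
    values = oscillating_update(values, targets, 0.1)
# values[0] has climbed to about 0.5; values[1] oscillates around 1.0
print(values)
```

This mirrors the flexibility claimed in the abstract: the sign of the adjustment is decided per input value rather than once for the summed total.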
Global and regional brain metabolic scaling and its functional consequences
Background: Information processing in the brain requires large amounts of
metabolic energy, the spatial distribution of which is highly heterogeneous,
reflecting complex activity patterns in the mammalian brain.
Results: Here, it is found based on empirical data that, despite this
heterogeneity, the volume-specific cerebral glucose metabolic rate of many
different brain structures scales with brain volume with almost the same
exponent around -0.15. The exception is white matter, the metabolism of which
seems to scale with a standard specific exponent -1/4. The scaling exponents
for the total oxygen and glucose consumption in the brain in relation to its
volume are identical and significantly larger than the exponents 3/4 and 2/3
suggested for whole-body basal metabolism as a function of body mass.
Conclusions: These findings show explicitly that in mammals (i)
volume-specific scaling exponents of the cerebral energy expenditure in
different brain parts are approximately constant (except brain stem
structures), and (ii) the total cerebral metabolic exponent against brain
volume is greater than the much-cited Kleiber's 3/4 exponent. The
neurophysiological factors that might account for the regional uniformity of
the exponents and for the excessive scaling of the total brain metabolism are
discussed, along with the relationship between brain metabolic scaling and
computation.
Comment: Brain metabolism scales with its mass well above the 3/4 exponent.
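Scaling exponents such as those reported above are typically estimated as the slope of a log-log regression of metabolic rate against volume. The sketch below illustrates this with synthetic data generated under an assumed exponent of -0.15; the volumes and the constant are invented for demonstration and are not the paper's data.

```python
# Illustrative log-log fit for a power law rate = c * V**alpha.
# The exponent -0.15 and the sample volumes are assumptions for the demo.
import math

def fit_exponent(volumes, rates):
    """Least-squares slope of log(rate) versus log(volume)."""
    xs = [math.log(v) for v in volumes]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic volume-specific rates obeying rate = 2.0 * V**(-0.15)
volumes = [0.5, 5.0, 50.0, 500.0]
rates = [2.0 * v ** -0.15 for v in volumes]
slope = fit_exponent(volumes, rates)
print(round(slope, 3))   # recovers the assumed exponent -0.15
```

Note that a volume-specific exponent of -0.15 implies the total metabolic rate scales as V^(1 - 0.15) = V^0.85, which is how a total exponent can exceed Kleiber's 3/4.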
Optimal modularity and memory capacity of neural reservoirs
The neural network is a powerful computing framework that has been exploited
by biological evolution and by humans for solving diverse problems. Although
the computational capabilities of neural networks are determined by their
structure, the current understanding of the relationships between a neural
network's architecture and function is still primitive. Here we reveal that a
neural network's modular architecture plays a vital role in determining the
neural dynamics and memory performance of networks of threshold neurons. In
particular, we demonstrate that there exists an optimal modularity for memory
performance, where a balance between local cohesion and global connectivity is
established, allowing optimally modular networks to remember longer. Our
results suggest that insights from dynamical analysis of neural networks and
information spreading processes can be leveraged to better design neural
networks and may shed light on the brain's modular organization.
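A minimal sketch, under my own assumptions, of the kind of modular threshold-neuron network the abstract studies: neurons are grouped into modules with denser within-module than between-module connectivity, and each neuron switches on when its summed input reaches a threshold. The sizes, connection probabilities, and threshold below are arbitrary demonstration choices, not the paper's parameters.

```python
# Modular random network of binary threshold neurons (hypothetical demo).
import random

random.seed(0)

def modular_adjacency(n, modules, p_in, p_out):
    """Directed adjacency matrix: connection probability p_in inside a
    module, p_out between modules."""
    size = n // modules
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            p = p_in if i // size == j // size else p_out
            if random.random() < p:
                adj[i][j] = 1
    return adj

def step(adj, state, theta):
    """One synchronous update: neuron i fires if its input reaches theta."""
    n = len(state)
    return [1 if sum(adj[j][i] * state[j] for j in range(n)) >= theta else 0
            for i in range(n)]

adj = modular_adjacency(n=16, modules=4, p_in=0.6, p_out=0.05)
state = [1, 1, 1, 1] + [0] * 12          # activate one module
for _ in range(3):
    state = step(adj, state, theta=2)
print(sum(state), "neurons active after 3 steps")
```

Varying `p_in` relative to `p_out` adjusts modularity, which is the knob the paper's optimal-modularity result is about: enough local cohesion to sustain activity, enough global connectivity to spread it.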
The Technicity Paradigm and Scientism in Qualitative Research
This philosophical paper suggests that almost all academic research, including qualitative research, is conducted under the influence of a technicity paradigm which values objectivity, generalisability and rationality. This paper explores, from a Heideggerian perspective, the fundamental characteristics of research under the influence of technicity and discusses how these characteristics manifest in qualitative research. It includes a reflection on what qualitative research might be like if it could escape the influence of technicity and realise its potential for inclusive and relevant knowledge making.
Spiking neural networks for computer vision
State-of-the-art computer vision systems use frame-based cameras that sample the visual scene as a series of high-resolution images. These are then processed by convolutional neural networks whose neurons have continuous outputs. Biological vision systems use a quite different approach, where the eyes (cameras) sample the visual scene continuously, often with a non-uniform resolution, and generate neural spike events in response to changes in the scene. The resulting spatio-temporal patterns of events are then processed through networks of spiking neurons. Such event-based processing offers advantages in terms of focusing constrained resources on the most salient features of the perceived scene, and those advantages should also accrue to engineered vision systems based upon similar principles. Event-based vision sensors, and event-based processing exemplified by the SpiNNaker (Spiking Neural Network Architecture) machine, can be used to model the biological vision pathway at various levels of detail. Here we use this approach to explore structural synaptic plasticity as a possible mechanism whereby biological vision systems may learn the statistics of their inputs without supervision, pointing the way to engineered vision systems with similar online learning capabilities.
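The event-based sensing described above can be illustrated with a toy change detector: a pixel emits a spike event only when its brightness drifts more than a contrast threshold away from the level recorded at its last event, in the style of a dynamic vision sensor. The signal and threshold below are invented for illustration and do not come from the paper.

```python
# Toy event-based pixel (hypothetical sketch): spikes on changes only,
# rather than being sampled every frame.

def events_from_signal(samples, threshold):
    """Emit (index, polarity) events whenever the signal moves more than
    `threshold` away from the level at the last event."""
    events = []
    last = samples[0]
    for i, s in enumerate(samples[1:], start=1):
        while s - last >= threshold:
            last += threshold
            events.append((i, +1))   # brightness increased
        while last - s >= threshold:
            last -= threshold
            events.append((i, -1))   # brightness decreased
    return events

signal = [0.0, 0.0, 0.3, 0.6, 0.6, 0.2]
print(events_from_signal(signal, threshold=0.25))
# → [(2, 1), (3, 1), (5, -1)]
```

Note how the static stretches of the signal produce no events at all, which is the resource-focusing advantage the abstract points to.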
Outlook Magazine, Autumn 2017