Half a billion simulations: evolutionary algorithms and distributed computing for calibrating the SimpopLocal geographical model
Multi-agent geographical models integrate very large numbers of spatial interactions, so validating them requires a large amount of computing for simulation and calibration. Here, a new data-processing chain that includes an automated calibration procedure based on evolutionary algorithms is tested on a computational grid, and is applied for the first time to a geographical model designed to simulate the evolution of an early urban settlement system. The method reduces computing time and provides robust results. Using it, we identify several parameter settings that minimise three objective functions quantifying how closely the model results match a reference pattern. Because the values of each parameter are very close across these settings, the estimation considerably narrows the initial domain of variation of the parameters. The model is thus a useful tool for multiple further applications to empirical historical situations.
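To make the calibration idea concrete, here is a minimal Python sketch of a multi-objective evolutionary loop of the kind the abstract describes. The model, its three objective functions, the parameter bounds, and all numeric values are hypothetical stand-ins; the actual experiment distributed roughly half a billion SimpopLocal runs over a computational grid.

```python
# Minimal sketch of multi-objective evolutionary calibration.
# The objectives and bounds below are illustrative placeholders,
# not the SimpopLocal model or its real objective functions.
import numpy as np

rng = np.random.default_rng(0)
BOUNDS = np.array([[0.0, 1.0], [0.0, 10.0], [1.0, 100.0]])  # hypothetical

def objectives(params):
    """Stand-in for the three distances between simulated and
    reference settlement patterns (smaller is better)."""
    x, y, z = params
    return np.array([(x - 0.3) ** 2, (y - 4.0) ** 2, (z - 42.0) ** 2])

def dominates(a, b):
    """Pareto dominance: a is no worse everywhere and better somewhere."""
    return np.all(a <= b) and np.any(a < b)

pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(64, 3))
for gen in range(200):
    # Mutate each individual; keep the child only when it
    # Pareto-dominates its parent (a deliberately simple rule).
    noise = rng.normal(0, 0.1, pop.shape) * (BOUNDS[:, 1] - BOUNDS[:, 0])
    children = np.clip(pop + noise, BOUNDS[:, 0], BOUNDS[:, 1])
    for i in range(len(pop)):
        if dominates(objectives(children[i]), objectives(pop[i])):
            pop[i] = children[i]

print("example calibrated parameter set:", pop[0])
```

In the grid setting described above, the expensive step is the many independent model evaluations per generation, which is why evolutionary calibration parallelises so naturally across distributed workers.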
Visualization in spatial modeling
This chapter deals with issues arising from a central theme in contemporary computer modeling - visualization. We first tie visualization to varieties of modeling along the continuum from iconic to symbolic, and then focus on the notion that our models are so intrinsically complex that many different types of visualization might be developed to support their understanding and implementation. This focuses the debate on the very way of 'doing science', in that patterns and processes of any complexity can be better understood through visualizing the data, the simulations, and the outcomes that such models generate. As we have grown more sensitive to the problem of complexity in all systems, we are more aware that the twin goals of parsimony and verifiability, which have dominated scientific theory since the Enlightenment, are up for grabs: good theories and models must 'look right' despite what our statistics and causal logics tell us. Visualization is the cutting edge of this new way of thinking about science, but its styles vary enormously with context. Here we define three varieties: visualization of complicated systems to make things simple or at least explicable, which is the role of pedagogy; visualization to explore unanticipated outcomes and to refine processes that interact in unanticipated ways; and visualization to enable end users with no prior understanding of the science, but a deep understanding of the problem, to engage in using models for prediction, prescription, and control. We illustrate these themes with a model of an agricultural market that is the basis of modern urban economics - the von Thünen model of land rent and density; a model of urban development based on interacting spatial and temporal processes of land development - the DUEM model; and a fine-scale pedestrian model of human movement, in which controlling such movements to meet public-safety standards is intrinsic to a model that its controllers know intimately.
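Since the von Thünen model anchors the first illustration, the following Python sketch shows its classic bid-rent curve; the crop parameters are invented for illustration and are not taken from the chapter.

```python
# Minimal sketch of the von Thünen bid-rent curve.
# All numbers are illustrative, not from the chapter.
def bid_rent(d, yield_per_area, price, cost, freight):
    """Locational rent per unit of land at distance d from the central
    market: R(d) = Y * (p - c) - Y * f * d."""
    return yield_per_area * ((price - cost) - freight * d)

# Rent declines linearly with distance and hits zero at d = (p - c) / f,
# i.e. at d = 50 for these illustrative numbers.
for d in range(0, 60, 10):
    print(d, bid_rent(d, yield_per_area=1000, price=2.0, cost=1.0, freight=0.02))
```

Where several crops with different yields and freight rates compete, the highest bidder at each distance determines land use, producing the familiar concentric rings that make the model such a natural candidate for visualization.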
Demonstrating Advantages of Neuromorphic Computation: A Pilot Study
Neuromorphic devices represent an attempt to mimic aspects of the brain's
architecture and dynamics with the aim of replicating its hallmark functional
capabilities in terms of computational power, robust learning and energy
efficiency. We employ a single-chip prototype of the BrainScaleS-2 neuromorphic
system to implement a proof-of-concept demonstration of reward-modulated
spike-timing-dependent plasticity in a spiking network that learns to play the
Pong video game by smooth pursuit. This system combines an electronic
mixed-signal substrate for emulating neuron and synapse dynamics with an
embedded digital processor for on-chip learning, which in this work also serves
to simulate the virtual environment and learning agent. The analog emulation of
neuronal membrane dynamics enables a 1000-fold acceleration with respect to
biological real time, with the entire chip operating on a power budget of 57 mW.
Compared to an equivalent simulation using state-of-the-art software, the
on-chip emulation is at least one order of magnitude faster and three orders of
magnitude more energy-efficient. We demonstrate how on-chip learning can
mitigate the effects of fixed-pattern noise, which is unavoidable in analog
substrates, while making use of temporal variability for action exploration.
Learning compensates for imperfections of the physical substrate, as manifested in neuronal parameter variability, by adapting synaptic weights to match the respective excitability of individual neurons.
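As a rough illustration of the learning rule, the following Python sketch reduces reward-modulated STDP to a single synapse: pre-before-post coincidences build an eligibility trace, and a delayed scalar reward converts that trace into a weight change. All time constants, rates, and the reward schedule are illustrative assumptions rather than BrainScaleS-2 parameters, and only the potentiation branch of STDP is kept for brevity.

```python
# Minimal single-synapse sketch of reward-modulated STDP.
# Constants and reward schedule are illustrative assumptions,
# not BrainScaleS-2 parameters; only potentiation is modeled.
import numpy as np

dt, tau_pre, tau_elig, eta = 1.0, 20.0, 200.0, 0.01
w, pre_trace, elig = 0.5, 0.0, 0.0

rng = np.random.default_rng(1)
for t in range(1000):
    pre_spike = rng.random() < 0.05   # Poisson-like pre activity
    post_spike = rng.random() < 0.05  # Poisson-like post activity

    # Exponentially decaying trace of presynaptic spikes.
    pre_trace += -pre_trace * dt / tau_pre + (1.0 if pre_spike else 0.0)

    # Pre-before-post pairings push the eligibility trace up;
    # it then decays until a reward converts it into a weight change.
    if post_spike:
        elig += pre_trace
    elig -= elig * dt / tau_elig

    reward = 1.0 if t % 100 == 99 else 0.0  # sparse, hypothetical reward
    w = np.clip(w + eta * reward * elig, 0.0, 1.0)

print("final weight:", w)
```

The separation into a fast correlation trace and a slow eligibility trace is what lets a reward arriving tens of milliseconds later still credit the spike pairings that caused it.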
The Perceptual Experience of Slope by Foot and by Finger
Historically, the bodily senses have often been regarded as impeccable sources of spatial information and as the teacher of vision. Here, the authors report that the haptic perception of slope by means of the foot is greatly exaggerated. The exaggeration is present in verbal as well as proprioceptive judgments. It is shown that this misperception of pedal slope is not caused by calibration to the well-established visual misperception of slope, because it is present in congenitally blind individuals as well. The pedal misperception of slope is contrasted with the perception of slope by dynamic touch with a finger in a force-feedback device. Although slopes feel slightly exaggerated even when explored by finger, they tend to show much less exaggeration than when equivalent slopes are stood on. The results are discussed in terms of a theory of coding efficiency.
Past, Present, and Future of Simultaneous Localization and Mapping: Towards the Robust-Perception Age
Simultaneous Localization and Mapping (SLAM) consists of the concurrent construction of a model of the environment (the map) and the estimation of the state of the robot moving within it. The SLAM community has made astonishing
progress over the last 30 years, enabling large-scale real-world applications,
and witnessing a steady transition of this technology to industry. We survey
the current state of SLAM. We start by presenting what is now the de facto standard formulation for SLAM. We then review related work, covering a broad
set of topics including robustness and scalability in long-term mapping, metric
and semantic representations for mapping, theoretical performance guarantees,
active SLAM and exploration, and other new frontiers. This paper serves simultaneously as a position paper and as a tutorial for those who use SLAM. By looking at the published research with a critical eye, we delineate open challenges and new research issues that still deserve careful scientific investigation. The paper also contains the authors' take on two questions that often animate discussions during robotics conferences: Do robots need SLAM? and Is SLAM solved?
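For readers unfamiliar with the de facto standard formulation the survey refers to, it is maximum-a-posteriori estimation of the robot and map variables over a factor graph, which under Gaussian noise reduces to sparse nonlinear least squares. The LaTeX sketch below states this in common SLAM notation; the symbols are not quoted verbatim from the paper.

```latex
% Sketch of the factor-graph MAP formulation (standard SLAM notation).
% Each measurement z_k, with model h_k and Gaussian noise covariance
% \Sigma_k, contributes one residual to the sum.
\begin{equation}
  \mathcal{X}^{\star}
    = \operatorname*{arg\,max}_{\mathcal{X}} \, p(\mathcal{X} \mid \mathcal{Z})
    = \operatorname*{arg\,min}_{\mathcal{X}} \sum_{k}
      \bigl\lVert h_k(\mathcal{X}_k) - z_k \bigr\rVert^{2}_{\Sigma_k}
\end{equation}
```

Here \(\mathcal{X}_k\) denotes the subset of variables involved in the k-th factor; in practice the resulting sparse least-squares problem is solved iteratively, for example with Gauss-Newton or Levenberg-Marquardt.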