Key challenges in agent-based modelling for geo-spatial simulation
Agent-based modelling (ABM) is fast becoming the dominant paradigm in social simulation due primarily to a worldview that suggests that complex systems emerge from the bottom-up, are highly decentralised, and are composed of a multitude of heterogeneous objects called agents. These agents act with some purpose and their interaction, usually through time and space, generates emergent order, often at higher levels than those at which such agents operate. ABM however raises as many challenges as it seeks to resolve. It is the purpose of this paper to catalogue these challenges and to illustrate them using three somewhat different agent-based models applied to city systems. The seven challenges we pose involve: the purpose for which the model is built, the extent to which the model is rooted in independent theory, the extent to which the model can be replicated, the ways the model might be verified, calibrated and validated, the way model dynamics are represented in terms of agent interactions, the extent to which the model is operational, and the way the model can be communicated and shared with others. Once catalogued, we then illustrate these challenges with a pedestrian model for emergency evacuation in central London, a hypothetical model of residential segregation tuned to London data which elaborates the standard Schelling (1971) model, and an agent-based residential location model built according to spatial interaction principles, calibrated to trip data for Greater London. The ambiguities posed by this new style of modelling are drawn out as conclusions.
First measurement of low intensity fast neutron background from rock at the Boulby Underground Laboratory
A technique to measure low intensity fast neutron flux has been developed.
The design, calibrations, procedure for data analysis and interpretation of the
results are discussed in detail. The technique has been applied to measure the
neutron background from rock at the Boulby Underground Laboratory, a site used
for dark matter and other experiments requiring shielding from cosmic-ray
muons. The experiment was performed using a liquid scintillation detector. A
6.1 litre stainless steel cell was filled with an in-house-made liquid
scintillator loaded with Gd to enhance neutron capture. A two-pulse signature
(proton recoils followed by gammas from neutron capture) was used to identify
the neutron events against the much larger gamma background from the PMTs. Suppression of
gammas from the rock was achieved by surrounding the detector with high-purity
lead and copper. Calibrations of the detector were performed with various gamma
and neutron sources. Special care was taken to eliminate PMT afterpulses and
correlated background events from the delayed coincidences of two pulses in the
Bi-Po decay chain. A four-month run revealed a neutron-induced event rate of
1.84 +- 0.65 (stat.) events/day. Monte Carlo simulations based on the GEANT4
toolkit were carried out to estimate the efficiency of the detector and the
energy spectra of the expected proton recoils. From comparison of the measured
rate with Monte Carlo simulations the flux of fast neutrons from rock was
estimated as (1.72 +- 0.61 (stat.) +- 0.38 (syst.))*10^(-6) cm^(-2) s^(-1)
above 0.5 MeV.
Comment: 37 pages, 24 figures, to be published in Astroparticle Physics
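Converting the measured event rate into a flux requires the Monte Carlo-derived detection efficiency described in the abstract. The sketch below illustrates only the unit conversion involved; the effective area used here is a hypothetical value standing in for the GEANT4-derived efficiency, not a number from the paper.

```python
# Sketch: converting a measured neutron event rate into a flux estimate.
# The effective area below is a hypothetical illustrative value standing in
# for the simulation-derived efficiency x cross-sectional area of the cell;
# the paper obtains this factor from full GEANT4 Monte Carlo simulations.

SECONDS_PER_DAY = 86400.0

def rate_to_flux(rate_per_day, effective_area_cm2):
    """Convert an event rate (events/day) to a flux (neutrons cm^-2 s^-1)."""
    return rate_per_day / SECONDS_PER_DAY / effective_area_cm2

rate = 1.84   # events/day, the measured rate quoted in the abstract
area = 12.4   # cm^2 -- hypothetical, chosen only for illustration
print(f"flux ~ {rate_to_flux(rate, area):.2e} cm^-2 s^-1")
```

With this illustrative effective area the conversion lands near the quoted 1.72e-6 cm^-2 s^-1; in the actual analysis the efficiency also depends on the neutron energy spectrum assumed in the simulation.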
Guidelines for assessing pedestrian evacuation software applications
This paper serves to clearly identify and explain criteria to consider when evaluating the
suitability of a pedestrian evacuation software application to assess the evacuation
process of a building. Guidelines in the form of nine topic areas identify different
modelling approaches adopted, as well as features / functionality provided by
applications designed specifically for simulating the egress of pedestrians from inside a
building. The paper concludes with a synopsis of these guidelines, identifying key
questions (by topic area) on which to found an evaluation.
Quenching Factor for Low Energy Nuclear Recoils in a Plastic Scintillator
Plastic scintillators are widely used in industry, medicine and scientific
research, including nuclear and particle physics. Although one of their most
common applications is in neutron detection, experimental data on their
response to low-energy nuclear recoils are scarce. Here, the relative
scintillation efficiency for neutron-induced nuclear recoils in a
polystyrene-based plastic scintillator (UPS-923A) is presented, exploring
recoil energies between 125 keV and 850 keV. Monte Carlo simulations,
incorporating light collection efficiency and energy resolution effects, are
used to generate neutron scattering spectra which are matched to observed
distributions of scintillation signals to parameterise the energy-dependent
quenching factor. At energies above 300 keV the dependence is reasonably
described using the semi-empirical formulation of Birks and a kB factor of
(0.014 +/- 0.002) g/(MeV cm^2) has been determined. Below that energy the measured
quenching factor falls more steeply than predicted by the Birks formalism.
Comment: 8 pages, 9 figures
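The semi-empirical Birks treatment mentioned above can be integrated numerically to see how the quenching factor behaves. The sketch below uses a hypothetical power-law stopping power chosen only to illustrate the shape of the curve; the actual analysis folds real stopping powers into a Monte Carlo.

```python
# Sketch of Birks' semi-empirical quenching: the light yield is
# L(E) = integral_0^E dE' / (1 + kB * (dE/dx)(E')), and the quenching
# factor is Q(E) = L(E) / E.  The stopping-power model below is a
# hypothetical power law for illustration only.

KB = 0.014  # g/(MeV cm^2), the best-fit Birks factor quoted in the text

def stopping_power(e_mev):
    """Hypothetical recoil stopping power in MeV cm^2/g (illustrative only)."""
    return 120.0 / e_mev ** 0.5

def quenching_factor(e_mev, kb=KB, steps=10000):
    """Numerically integrate Birks' law to get Q(E) = L(E)/E."""
    de = e_mev / steps
    light = 0.0
    for i in range(steps):
        e_mid = (i + 0.5) * de  # midpoint rule avoids the E' = 0 singularity
        light += de / (1.0 + kb * stopping_power(e_mid))
    return light / e_mev

for e in (0.125, 0.3, 0.85):
    print(f"E = {e:5.3f} MeV  ->  Q ~ {quenching_factor(e):.3f}")
```

Because the stopping power grows as the recoil slows, Q(E) rises with energy under Birks' law; the abstract's point is that below about 300 keV the measured quenching falls faster than this formalism predicts.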
Radiogenic backgrounds in the NEXT double beta decay experiment
Natural radioactivity represents one of the main backgrounds in the search for neutrinoless double beta decay. Within the NEXT physics program, the radioactivity-induced backgrounds are measured with the NEXT-White detector. Data from 37.9 days of low-background operations at the Laboratorio Subterráneo de Canfranc with xenon depleted in 136Xe are analyzed to derive a total background rate of (0.84±0.02) mHz above 1000 keV. The comparison of data samples with and without the use of the radon abatement system demonstrates that the contribution of airborne Rn is negligible. A radiogenic background model is built upon the extensive radiopurity screening campaign conducted by the NEXT collaboration. A spectral fit to this model yields the specific contributions of 60Co, 40K, 214Bi and 208Tl to the total background rate, as well as their location in the detector volumes. The results are used to evaluate the impact of the radiogenic backgrounds in the double beta decay analyses, after the application of topological cuts that reduce the total rate to (0.25±0.01) mHz. Based on the best-fit background model, the NEXT-White median sensitivity to the two-neutrino double beta decay is found to be 3.5σ after 1 year of data taking. The background measurement in a Qββ±100 keV energy window validates the best-fit background model also for the neutrinoless double beta decay search with NEXT-100. Only one event is found, while the model expectation is (0.75±0.12) events.
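The agreement between the single observed event and the model expectation of 0.75 events can be checked with simple Poisson statistics. This is an illustrative consistency check, not the collaboration's statistical treatment.

```python
# Sketch: Poisson probability of observing at most one event when the
# background model predicts a mean of 0.75 events (illustrative check only).
import math

def poisson_pmf(k, mu):
    """Poisson probability mass function P(N = k | mean mu)."""
    return math.exp(-mu) * mu ** k / math.factorial(k)

mu = 0.75  # best-fit model expectation quoted in the abstract
p_le_1 = poisson_pmf(0, mu) + poisson_pmf(1, mu)
print(f"P(N <= 1 | mu = {mu}) = {p_le_1:.3f}")  # -> about 0.83
```

A probability this large means the single observed event is entirely consistent with the best-fit background model.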
Latest Results from the Heidelberg-Moscow Double Beta Decay Experiment
New results for the double beta decay of 76Ge are presented. They are
extracted from data obtained with the HEIDELBERG-MOSCOW experiment, which
operates five enriched 76Ge detectors in an extreme low-level environment at Gran Sasso.
The two neutrino accompanied double beta decay is evaluated for the first time
for all five detectors with a statistical significance of 47.7 kg y, resulting
in a half-life of (T_(1/2))^(2nu) = [1.55 +- 0.01 (stat) (+0.19) (-0.15)
(syst)] x 10^(21) years. The lower limit on the half-life of the 0nu beta-beta
decay obtained with pulse shape analysis is (T_(1/2))^(0nu) > 1.9 x 10^(25)
[3.1 x 10^(25)] years with 90% C.L. (68% C.L.) (with 35.5 kg y). This results
in an upper limit of the effective Majorana neutrino mass of 0.35 eV (0.27 eV).
No evidence for a Majoron emitting decay mode or for the neutrinoless mode is
observed.
Comment: 14 pages, RevTeX, 6 figures. Talk presented at the Third
International Conference 'Dark Matter in Astro and Particle Physics'
(DARK2000); to be published in the Proceedings of DARK2000, Springer (2000).
HEIDELBERG Non-Accelerator Particle Physics group home page:
http://www.mpi-hd.mpg.de/non_acc
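For a fixed nuclear matrix element and phase-space factor, the effective Majorana mass limit scales as the inverse square root of the half-life limit, which is why the two quoted limits (0.35 eV at 1.9 x 10^25 y, 0.27 eV at 3.1 x 10^25 y) are mutually consistent. A quick check of that scaling:

```python
# Sketch: <m_bb> scales as 1/sqrt(T_1/2) when the nuclear matrix element and
# phase-space factor are held fixed.  This checks the internal consistency of
# the two quoted limits; it is not a full calculation of the effective mass.
import math

def rescale_mass_limit(m1_ev, t1_years, t2_years):
    """Translate a mass limit at half-life limit t1 to half-life limit t2."""
    return m1_ev * math.sqrt(t1_years / t2_years)

m_rescaled = rescale_mass_limit(0.35, 1.9e25, 3.1e25)
print(f"rescaled limit: {m_rescaled:.2f} eV")  # -> 0.27 eV, matching the 68% C.L. value
```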
Overview on agent-based social modelling and the use of formal languages
Transdisciplinary Models and Applications investigates a variety of programming languages used in validating and verifying models in order to assist in their eventual implementation. This book will explore different methods of evaluating and formalizing simulation models, enabling computer and industrial engineers, mathematicians, and students working with computer simulations to thoroughly understand the progression from simulation to product, improving the overall effectiveness of modeling systems.
Agent Street: An Environment for Exploring Agent-Based Models in Second Life
Urban models can be seen on a continuum between iconic and symbolic. Generally speaking, iconic models are physical versions of the real world at some scaled-down representation, while symbolic models represent the system in terms of the way they function, replacing the physical or material system by some logical and/or mathematical formulae. Traditionally iconic and symbolic models were distinct classes of model, but due to the rise of digital computing the distinction between the two is becoming blurred, with symbolic models being embedded into iconic models. However, such models tend to be single-user. This paper demonstrates how 3D symbolic models in the form of agent-based simulations can be embedded into iconic models using the multi-user virtual world of Second Life. Furthermore, the paper demonstrates Second Life's potential for social science simulation. To demonstrate this, we first introduce Second Life and provide two exemplar models: Conway's Game of Life, and Schelling's Segregation Model, which highlight how symbolic models can be viewed in an iconic environment. We then present a simple pedestrian evacuation model which merges the iconic and symbolic together and extends the model to directly incorporate avatars and agents in the same environment, illustrating how 'real' participants can influence simulation outcomes. Such examples demonstrate the potential for creating highly visual, immersive, interactive agent-based models for social scientists in multi-user real-time virtual worlds. The paper concludes with some final comments on problems with representing models in current virtual worlds and future avenues of research.
Keywords: Agent-Based Modelling, Pedestrian Evacuation, Segregation, Virtual Worlds, Second Life
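The Schelling segregation dynamic that recurs across these abstracts can be captured in a few dozen lines. The sketch below is a minimal generic version (grid size, vacancy rate and tolerance are illustrative choices), not the London-tuned model or the Second Life implementation described in the papers.

```python
# Minimal Schelling (1971) segregation sketch: agents of two types move to a
# random empty cell whenever fewer than TOLERANCE of their neighbours share
# their type.  All parameters are illustrative.
import random

N, VACANCY, TOLERANCE, STEPS = 20, 0.1, 0.5, 30
rng = random.Random(42)

def make_grid():
    cells = [None if rng.random() < VACANCY else rng.choice("AB")
             for _ in range(N * N)]
    return [cells[i * N:(i + 1) * N] for i in range(N)]

def neighbours(grid, r, c):
    """Occupants of the (up to eight) Moore neighbours of cell (r, c)."""
    out = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr, dc) != (0, 0) and 0 <= rr < N and 0 <= cc < N:
                if grid[rr][cc] is not None:
                    out.append(grid[rr][cc])
    return out

def similarity(grid):
    """Mean fraction of like-typed neighbours over all occupied cells."""
    fracs = []
    for r in range(N):
        for c in range(N):
            nb = neighbours(grid, r, c)
            if grid[r][c] is not None and nb:
                fracs.append(sum(x == grid[r][c] for x in nb) / len(nb))
    return sum(fracs) / len(fracs)

def step(grid):
    empties = [(r, c) for r in range(N) for c in range(N) if grid[r][c] is None]
    for r in range(N):
        for c in range(N):
            agent, nb = grid[r][c], neighbours(grid, r, c)
            if agent and nb and sum(x == agent for x in nb) / len(nb) < TOLERANCE:
                dest = rng.choice(empties)  # relocate the unhappy agent
                empties.remove(dest)
                empties.append((r, c))
                grid[dest[0]][dest[1]], grid[r][c] = agent, None

grid = make_grid()
before = similarity(grid)
for _ in range(STEPS):
    step(grid)
print(f"mean like-neighbour fraction: {before:.2f} -> {similarity(grid):.2f}")
```

Even with this mild tolerance the like-neighbour fraction climbs well above its initial near-random level, which is Schelling's central point: strong segregation emerges from weak individual preferences.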