A survey of simulation techniques in commerce and defence
Despite developments in Modelling and Simulation (M&S) tools and techniques in recent years, there has been a gap in M&S research and practice in healthcare: no toolkit exists to assist modellers and simulation practitioners in selecting an appropriate set of techniques. This study is a preliminary step towards that goal. This paper presents results from a systematic literature survey of M&S applications in the commerce and defence domains that could inspire improvements in healthcare. Interim results show that in the commercial sector Discrete-Event Simulation (DES) has been the most widely used technique, with System Dynamics (SD) in second place. In the defence sector, however, SD has attracted relatively more attention, and has been found quite useful for qualitative and soft-factors analysis. Both surveys make clear that there is a growing trend towards hybrid M&S approaches.
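As an aside on what the DES counts above refer to: the core of any discrete-event simulation is a future-event list ordered by timestamp. Below is a minimal, self-contained sketch of that loop with toy arrival/departure logic; the event labels and timings are illustrative inventions, not drawn from the survey.

```python
import heapq

# Minimal discrete-event simulation (DES) skeleton: a future-event list
# (priority queue ordered by time) drives the simulation clock forward.
def run(events, horizon):
    """events: list of (time, label) seeds; handling an event may schedule more."""
    fel = list(events)
    heapq.heapify(fel)
    log = []
    while fel:
        t, label = heapq.heappop(fel)
        if t > horizon:
            break
        log.append((t, label))
        if label == "arrival":
            # Toy logic: each arrival departs 2.0 time units later
            heapq.heappush(fel, (t + 2.0, "departure"))
            # ...and the next arrival occurs 3.0 time units later
            heapq.heappush(fel, (t + 3.0, "arrival"))
    return log

log = run([(0.0, "arrival")], horizon=10.0)
print(log)
```

The event log comes out in non-decreasing time order regardless of the order in which events were scheduled, which is the defining property of the DES approach.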
Laser Wire Scanner Compton Scattering Techniques for the Measurement of the Transverse Beam Size of Particle Beams at Future Linear Colliders
This archive summarizes a working paper and conference proceedings related to
laser wire scanner development for the Future Linear Collider (FLC) in the
years 2001 to 2006. In particular the design, setup and data taking for the
laser wire experiments at PETRA II and CT2 are described. The material is
focused on the activities undertaken by Royal Holloway University of London
(RHUL).

Comment: 61 pages.
Using Quantum Computers for Quantum Simulation
Numerical simulation of quantum systems is crucial to further our
understanding of natural phenomena. Many systems of key interest and
importance, in areas such as superconducting materials and quantum chemistry,
are thought to be described by models which we cannot solve with sufficient
accuracy, neither analytically nor numerically with classical computers. Using
a quantum computer to simulate such quantum systems has been viewed as a key
application of quantum computation from the very beginning of the field in the
1980s. Moreover, useful results beyond the reach of classical computation are
expected to be accessible with fewer than a hundred qubits, making quantum
simulation potentially one of the earliest practical applications of quantum
computers. In this paper we survey the theoretical and experimental development
of quantum simulation using quantum computers, from the first ideas to the
intense research efforts currently underway.

Comment: 43 pages, 136 references, review article. v2: major revisions in response to referee comments; v3: significant revisions, identical to the published version apart from format. The arXiv version has a table of contents and references in alphabetical order.
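To give a flavour of what such simulations compute, the sketch below classically emulates first-order Trotterized time evolution, a standard digital quantum simulation primitive, for a tiny two-qubit transverse-field Ising Hamiltonian. The Hamiltonian choice, step count, and all numbers are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def expm_herm(H, t):
    """exp(-i t H) for a Hermitian matrix H, via eigendecomposition."""
    w, v = np.linalg.eigh(H)
    return v @ np.diag(np.exp(-1j * t * w)) @ v.conj().T

# Two-site transverse-field Ising Hamiltonian H = A + B, with [A, B] != 0
A = np.kron(Z, Z)                       # interaction term
B = np.kron(X, I2) + np.kron(I2, X)     # transverse-field term
H = A + B

t, n = 1.0, 100                         # total time, number of Trotter steps
# First-order Trotter: exp(-iHt) ~ [exp(-iAt/n) exp(-iBt/n)]^n
step = expm_herm(A, t / n) @ expm_herm(B, t / n)
U_trotter = np.linalg.matrix_power(step, n)
U_exact = expm_herm(H, t)

err = np.linalg.norm(U_trotter - U_exact, 2)
print(err)  # first-order Trotter error, shrinking as O(t^2 / n)
```

On a quantum computer each small-angle factor becomes a short gate sequence; the classical matrices here stand in only to show the approximation converging.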
Improved energy resolution for VHE gamma-ray astronomy with systems of Cherenkov telescopes
We present analysis techniques to improve the energy resolution of
stereoscopic systems of imaging atmospheric Cherenkov telescopes, using the
HEGRA telescope system as an example. The techniques include (i) the
determination of the height of the shower maximum, which is then taken into
account in the energy determination, and (ii) the determination of the location
of the shower core with the additional constraint that the direction of the
gamma rays is known a priori. This constraint can be applied for gamma-ray
point sources, and results in a significant improvement in the localization of
the shower core, which translates into better energy resolution. Combining both
techniques, the HEGRA telescopes reach an energy resolution between 9% and 12%,
over the entire energy range from 1 TeV to almost 100 TeV. Options for further
improvements of the energy resolution are discussed.

Comment: 13 pages, 7 figures, LaTeX. Astroparticle Physics, in press.
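For readers unfamiliar with how a figure like "9% to 12%" is quantified: energy resolution is commonly reported as the spread of the relative error between reconstructed and true energies. The toy sketch below illustrates only that definition with a Gaussian 10% smearing; the paper's actual shower reconstruction is far more involved.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy reconstructed energies: true energies smeared by a 10% Gaussian.
# This Gaussian response is an assumption for illustration; real Cherenkov
# reconstructions have asymmetric tails and energy-dependent widths.
e_true = rng.uniform(1.0, 100.0, size=10_000)                 # TeV
e_rec = e_true * (1.0 + 0.10 * rng.standard_normal(10_000))

# A common definition: resolution = RMS of the relative error
rel_err = (e_rec - e_true) / e_true
resolution = np.std(rel_err)
print(f"energy resolution ~ {resolution:.1%}")
```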
Towards a Mini-App for Smoothed Particle Hydrodynamics at Exascale
The smoothed particle hydrodynamics (SPH) technique is a purely Lagrangian
method, used in numerical simulations of fluids in astrophysics and
computational fluid dynamics, among many other fields. SPH simulations with
detailed physics represent computationally-demanding calculations. The
parallelization of SPH codes is not trivial due to the absence of a structured
grid. Additionally, the performance of the SPH codes can be, in general,
adversely impacted by several factors, such as multiple time-stepping,
long-range interactions, and/or boundary conditions. This work presents
insights into the current performance and functionalities of three SPH codes:
SPHYNX, ChaNGa, and SPH-flow. These codes are the starting point of an
interdisciplinary co-design project, SPH-EXA, for the development of an
Exascale-ready SPH mini-app. To gain such insights, a rotating square patch
test was implemented as a common test simulation for the three SPH codes and
analyzed on two modern HPC systems. Furthermore, to stress the differences with
the codes stemming from the astrophysics community (SPHYNX and ChaNGa), an
additional test case, the Evrard collapse, has also been carried out. This work
extrapolates the common basic SPH features in the three codes for the purpose
of consolidating them into a pure-SPH, Exascale-ready, optimized, mini-app.
Moreover, the outcome of this serves as direct feedback to the parent codes, to
improve their performance and overall scalability.

Comment: 18 pages, 4 figures, 5 tables. 2018 IEEE International Conference on Cluster Computing proceedings for WRAp1
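The common SPH feature shared by all three codes is the kernel-weighted density sum. The sketch below shows it with the standard 3D cubic-spline (M4) kernel; the naive O(N^2) neighbour loop is only for illustration, since (as the abstract's parallelization discussion implies) production codes replace it with tree or cell-list neighbour search. The lattice test and smoothing length are my own toy choices.

```python
import numpy as np

def w_cubic(r, h):
    """Standard 3D cubic-spline (M4) SPH kernel with support radius 2h."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)        # 3D normalisation constant
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_density(pos, mass, h):
    """Density sum rho_i = sum_j m_j W(|r_i - r_j|, h), naive O(N^2)."""
    rho = np.zeros(len(pos))
    for i in range(len(pos)):
        r = np.linalg.norm(pos - pos[i], axis=1)
        rho[i] = np.sum(mass * w_cubic(r, h))
    return rho

# Toy test: a uniform lattice of total mass 1 in a unit box should
# recover rho = 1 for interior particles (edges are underestimated
# because part of the kernel support is empty).
side = 8
grid = np.linspace(0.0, 1.0, side, endpoint=False)
pos = np.array([(x, y, z) for x in grid for y in grid for z in grid])
mass = np.full(len(pos), 1.0 / len(pos))
rho = sph_density(pos, mass, h=1.5 / side)

center = np.argmin(np.linalg.norm(pos - 0.5, axis=1))
print(rho[center])  # close to 1.0 for this interior particle
```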
Engineering simulations for cancer systems biology
Computer simulation can be used to inform in vivo and in vitro experimentation, enabling rapid, low-cost hypothesis generation and directing experimental design in order to test those hypotheses. In this way, in silico models become a scientific instrument for investigation, and so should be developed to high standards, be carefully calibrated, and have their findings presented in such a way that they may be reproduced. Here, we outline a framework that supports developing simulations as scientific instruments, and we select cancer systems biology as an exemplar domain, with a particular focus on cellular signalling models. We consider the challenges of lack of data, incomplete knowledge, and modelling in the context of a rapidly changing knowledge base. Our framework comprises a process to clearly separate scientific and engineering concerns in model and simulation development, and an argumentation approach to documenting models as a rigorous way of recording assumptions and knowledge gaps. We propose interactive, dynamic visualisation tools to enable the biological community to interact with cellular signalling models directly for experimental design. There is a mismatch in scale between these cellular models and the tissue structures that are affected by tumours, and bridging this gap requires substantial computational resource. We present concurrent programming as a technology to link scales without losing important details through model simplification. We discuss the value of combining this technology, interactive visualisation, argumentation, and model separation to support the development of multi-scale models that represent biologically plausible cells arranged in biologically plausible structures, modelling cell behaviour, interactions, and response to therapeutic interventions.
Concept of a novel fast neutron imaging detector based on THGEM for fan-beam tomography applications
The conceptual design and operational principle of a novel high-efficiency,
fast neutron imaging detector based on THGEM, intended for future fan-beam
transmission tomography applications, is described. We report on a feasibility
study based on theoretical modeling and computer simulations of a possible
detector configuration prototype. In particular we discuss results regarding
the optimization of detector geometry, estimation of its general performance,
and expected imaging quality: it has been estimated that detection efficiency
of around 5-8% can be achieved for 2.5 MeV neutrons; spatial resolution is
around one millimeter with no substantial degradation due to scattering
effects. The foreseen applications of the imaging system are neutron tomography
in non-destructive testing for the nuclear energy industry, including
examination of spent nuclear fuel bundles, detection of explosives or drugs, as
well as investigation of thermal hydraulics phenomena (e.g., two-phase flow,
heat transfer, phase change, coolant dynamics, and liquid metal flow).

Comment: 11 pages; 6 figures. Proceedings of the International Workshop on Fast Neutron Detectors and Applications (FNDA2011), Ein Gedi, Israel, November 2011. Published in the Journal of Instrumentation: 2012 JINST 7 C0205
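To put an efficiency figure like 5-8% in context, a first-order estimate of fast-neutron detection starts from the interaction probability P = 1 - exp(-Sigma * d) in a hydrogenous converter. The sketch below uses rough cross-sections and a converter geometry that are entirely my own illustrative assumptions, not values from the paper's THGEM design.

```python
import numpy as np

# First-order estimate: probability that a fast neutron interacts in a
# hydrogenous converter of thickness d, via P = 1 - exp(-Sigma * d),
# where Sigma is the macroscopic cross-section. All numbers below are
# illustrative assumptions, not taken from the paper.
N_A = 6.022e23          # Avogadro's number, 1/mol
rho = 0.95              # g/cm^3, polyethylene-like converter density
M_CH2 = 14.03           # g/mol per CH2 unit
sigma_H = 2.5e-24       # cm^2, rough n-p elastic cross-section near 2.5 MeV
sigma_C = 1.6e-24       # cm^2, rough n-carbon cross-section near 2.5 MeV

n_CH2 = rho * N_A / M_CH2                   # CH2 units per cm^3
Sigma = n_CH2 * (2 * sigma_H + sigma_C)     # macroscopic cross-section, 1/cm
d = 0.1                                     # cm, converter thickness
p_interact = 1.0 - np.exp(-Sigma * d)
print(f"interaction probability ~ {p_interact:.1%}")
```

An actual detector's efficiency is lower than this raw interaction probability once recoil-proton escape and readout thresholds are folded in, which is why detailed simulation studies like the one above are needed.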
Modelling the Galaxy in the era of Gaia
The body of photometric and astrometric data on stars in the Galaxy has been
growing very fast in recent years (Hipparcos/Tycho, OGLE-III, 2MASS, DENIS,
UCAC2, SDSS, RAVE, Pan-STARRS, HERMES, ...) and in two years ESA will launch
the Gaia satellite, which will measure astrometric data of unprecedented
precision for a billion stars. On account of our position within the Galaxy and
the complex observational biases that are built into most catalogues, dynamical
models of the Galaxy are a prerequisite for full exploitation of these catalogues.
On account of the enormous detail in which we can observe the Galaxy, models of
great sophistication are required. Moreover, in addition to models we require
algorithms for observing them with the same errors and biases as occur in real
observational programs, and statistical algorithms for determining the extent
to which a model is compatible with a given body of data.
JD5 reviewed the status of our knowledge of the Galaxy, the different ways in
which we could model the Galaxy, and what will be required to extract our
science goals from the data that will be on hand when the Gaia Catalogue
becomes available.

Comment: Proceedings of Joint Discussion 5 at IAU XXVII, Rio de Janeiro, August 2009; 31 pages.
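The point about "observing" a model with the same errors and biases as real programs can be made concrete with a toy example: applying a survey magnitude limit to a model stellar population silently removes its distant stars, a selection bias any model-data comparison must reproduce before fitting. The stellar population, absolute magnitude, and survey depth below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "model galaxy": Sun-like stars (absolute magnitude 4.8, an assumption)
# at true distances drawn uniformly between 0.1 and 5 kpc
n = 100_000
d_true = rng.uniform(0.1, 5.0, n)              # kpc
m_app = 4.8 + 5 * np.log10(d_true * 100.0)     # apparent magnitude (d in kpc)

# "Observe" the model through a survey magnitude limit, as a real
# catalogue's selection function would (depth of 12 mag is a toy number)
selected = m_app < 12.0

# The cut silently removes distant stars: the observed sample is biased
# towards the solar neighbourhood
print(f"{selected.mean():.1%} of model stars observed, "
      f"all closer than {d_true[selected].max():.2f} kpc")
```

A full mock-observation pipeline would add astrometric and photometric errors on top of this selection, but even the bare magnitude cut shows why raw catalogue statistics cannot be compared to a model directly.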
Increasing resilience of ATM networks using traffic monitoring and automated anomaly analysis
Systematic network monitoring can be the cornerstone for
the dependable operation of safety-critical distributed
systems. In this paper, we present our vision for informed
anomaly detection through network monitoring and
resilience measurements to increase the operators'
visibility of ATM communication networks. We raise the
question of how to determine the optimal level of
automation in this safety-critical context, and we present a
novel passive network monitoring system that can reveal
network utilisation trends and traffic patterns in diverse
timescales. Using network measurements, we derive
resilience metrics and visualisations to enhance the
operators' knowledge of the network and traffic behaviour,
and allow for network planning and provisioning based on
informed what-if analysis.
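One generic way to flag anomalies in a monitored utilisation series is to compare each sample against an exponentially weighted moving average of the recent past. The sketch below is a minimal, self-contained illustration of that idea with synthetic traffic; it is not the detector described in the paper, and all thresholds and traffic numbers are assumptions.

```python
import numpy as np

def ewma_anomalies(series, alpha=0.1, k=4.0):
    """Flag samples further than k estimated standard deviations from an
    exponentially weighted moving average of the series. Generic sketch."""
    mean = series[0]
    var = 0.0
    flags = []
    for x in series:
        resid = x - mean
        std = np.sqrt(var)
        flags.append(std > 0 and abs(resid) > k * std)
        # update the EWMA estimates of mean and variance
        mean += alpha * resid
        var = (1 - alpha) * (var + alpha * resid**2)
    return np.array(flags)

rng = np.random.default_rng(0)
traffic = 100 + 5 * rng.standard_normal(500)   # baseline utilisation, Mbit/s
traffic[300] = 200                             # injected anomaly: traffic spike
flags = ewma_anomalies(traffic)
print(int(flags[300]), int(flags.sum()))
```

The single-pass, constant-memory update is what makes this family of detectors attractive for passive monitoring at line rate; richer systems layer seasonal baselines and multiple timescales on top of the same residual idea.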