Physics-related epistemic uncertainties in proton depth dose simulation
A set of physics models and parameters pertaining to the simulation of proton
energy deposition in matter are evaluated in the energy range up to
approximately 65 MeV, based on their implementations in the Geant4 toolkit. The
analysis assesses several features of the models and the impact of their
associated epistemic uncertainties, i.e. uncertainties due to lack of
knowledge, on the simulation results. Possible systematic effects deriving from
uncertainties of this kind are highlighted; their relevance in relation to the
application environment and different experimental requirements is discussed,
with emphasis on the simulation of radiotherapy set-ups. By documenting
quantitatively the features of a wide set of simulation models and the related
intrinsic uncertainties affecting the simulation results, this analysis
provides guidance regarding the use of the concerned simulation tools in
experimental applications; it also provides indications for further
experimental measurements addressing the sources of such uncertainties.
Comment: To be published in IEEE Trans. Nucl. Sci.
Optimal quantization for the pricing of swing options
In this paper, we investigate a numerical algorithm for the pricing of swing
options, relying on the so-called optimal quantization method. The numerical
procedure is described in detail, and numerous simulations are provided to
assess its efficiency. In particular, we carry out a comparison with the
Longstaff-Schwartz algorithm.
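Optimal quantization replaces a continuous distribution by a finite grid of points that minimizes the mean squared quantization error. As a minimal illustration (not the paper's pricing algorithm), the classical fixed-point Lloyd iteration for an n-point quantizer of the standard normal distribution can be sketched as follows; the function names are chosen for this example:

```python
import math

def phi(x):
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def lloyd_quantizer(n, iters=200):
    """Fixed-point (Lloyd) iteration for an n-point quantizer of N(0, 1),
    n >= 2: each point moves to the conditional mean of its Voronoi cell."""
    pts = [-2.0 + 4.0 * i / (n - 1) for i in range(n)]  # even initial grid
    for _ in range(iters):
        # Voronoi cell boundaries are midpoints between neighbouring points
        bounds = ([-math.inf]
                  + [0.5 * (pts[i] + pts[i + 1]) for i in range(n - 1)]
                  + [math.inf])
        pts = []
        for i in range(n):
            a, b = bounds[i], bounds[i + 1]
            # E[X | a < X < b] = (phi(a) - phi(b)) / (Phi(b) - Phi(a))
            pts.append((phi(a) - phi(b)) / (Phi(b) - Phi(a)))
    return pts
```

For n = 2 the iteration recovers the known optimal two-point quantizer of N(0, 1), the points ±sqrt(2/π) ≈ ±0.798; pricing applications then evaluate expectations as weighted sums over such grids.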
Validation Test of Geant4 Simulation of Electron Backscattering
Backscattering is a sensitive probe of the accuracy of electron scattering
algorithms implemented in Monte Carlo codes. The capability of the Geant4
toolkit to describe realistically the fraction of electrons backscattered from
a target volume is extensively and quantitatively evaluated in comparison with
experimental data retrieved from the literature. The validation test covers the
energy range between approximately 100 eV and 20 MeV, and concerns a wide set
of target elements. Multiple and single electron scattering models implemented
in Geant4, as well as preassembled selections of physics models distributed
within Geant4, are analyzed with statistical methods. The evaluations concern
Geant4 versions from 9.1 to 10.1. Significant evolutions are observed over the
range of Geant4 versions, not always in the direction of better compatibility
with experiment. Goodness-of-fit tests complemented by categorical analysis
tests identify a configuration based on Geant4 Urban multiple scattering model
in Geant4 version 9.1 and a configuration based on single Coulomb scattering in
Geant4 10.0 as the physics options best reproducing experimental data above a
few tens of keV. At lower energies only single scattering demonstrates some
capability to reproduce data down to a few keV. Recommended preassembled
physics configurations appear incapable of describing electron backscattering
in a manner compatible with experiment. With the support of statistical
methods, a correlation is established between the validation of Geant4-based
simulation of backscattering and of energy deposition.
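The goodness-of-fit stage of a validation of this kind can be illustrated with a chi-square test against measurements. The numbers below are invented for illustration and are not data from the paper; a real analysis would use the published backscatter fractions and their uncertainties:

```python
def chi_square(measured, simulated, sigma):
    """Chi-square statistic comparing simulated with measured values,
    given the measurement uncertainties sigma."""
    return sum(((m - s) / e) ** 2 for m, s, e in zip(measured, simulated, sigma))

# Hypothetical backscatter fractions at five beam energies
# (illustrative numbers only, NOT data from the paper)
measured  = [0.42, 0.38, 0.31, 0.22, 0.15]
simulated = [0.41, 0.39, 0.30, 0.23, 0.14]
sigma     = [0.02, 0.02, 0.015, 0.01, 0.01]

stat = chi_square(measured, simulated, sigma)
CRITICAL_95 = 11.070  # tabulated 95% chi-square quantile, 5 degrees of freedom
compatible = stat < CRITICAL_95  # null hypothesis of compatibility not rejected
```

Categorical analysis, as used in the paper, would then classify each model configuration as "compatible" or "incompatible" across many such tests before comparing configurations.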
A probabilistic numerical method for optimal multiple switching problem and application to investments in electricity generation
In this paper, we present a probabilistic numerical algorithm combining
dynamic programming, Monte Carlo simulations and local basis regressions to
solve non-stationary optimal multiple switching problems in infinite horizon.
We provide the rate of convergence of the method in terms of the time step used
to discretize the problem, of the size of the local hypercubes involved in the
regressions, and of the truncating time horizon. To make the method viable for
problems in high dimension and long time horizon, we extend a memory reduction
method to the general Euler scheme, so that, when performing the numerical
resolution, the storage of the Monte Carlo simulation paths is not needed.
Then, we apply this algorithm to a model of optimal investment in power plants.
This model takes into account electricity demand, cointegrated fuel prices,
carbon price and random outages of power plants. It computes the optimal level
of investment in each generation technology, considered as a whole, with respect to the
electricity spot price. This electricity price is itself built according to a
new extended structural model. In particular, it is a function of several
factors, among which the installed capacities. The evolution of the optimal
generation mix is illustrated on a realistic numerical problem in dimension
eight, i.e. with two different technologies and six random factors.
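The backward-induction core of such a method can be sketched in a drastically simplified form: one price factor instead of six, two operating modes, and the "local basis" regression reduced to piecewise-constant averages over one-dimensional bins rather than hypercubes. All parameter values below are invented for illustration:

```python
import random

def simulate_paths(n_paths, n_steps, x0=50.0, kappa=0.5, mu=50.0, sigma=5.0, seed=0):
    """Euler scheme (unit time step) for a mean-reverting price factor."""
    rng = random.Random(seed)
    paths = [[x0] * n_paths]
    for _ in range(n_steps):
        paths.append([x + kappa * (mu - x) + sigma * rng.gauss(0.0, 1.0)
                      for x in paths[-1]])
    return paths  # paths[t][i] = price on path i at time t

def bin_regress(x, y, n_bins=10):
    """Local-basis regression reduced to 1-D: piecewise-constant fit,
    i.e. the average of y within each bin of x."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / n_bins or 1.0  # guard against a degenerate grid
    sums, counts = [0.0] * n_bins, [0] * n_bins
    for xi, yi in zip(x, y):
        b = min(int((xi - lo) / width), n_bins - 1)
        sums[b] += yi
        counts[b] += 1
    means = [s / c if c else 0.0 for s, c in zip(sums, counts)]
    return [means[min(int((xi - lo) / width), n_bins - 1)] for xi in x]

def switching_value(n_paths=4000, n_steps=10, cost=40.0, switch_cost=2.0):
    """Dynamic programming over Monte Carlo paths for a two-mode
    (producing / idle) switching problem with a switching cost."""
    paths = simulate_paths(n_paths, n_steps)
    v_on = [0.0] * n_paths   # value-to-go if currently producing
    v_off = [0.0] * n_paths  # value-to-go if currently idle
    for t in range(n_steps - 1, -1, -1):
        x = paths[t]
        cont_on = bin_regress(x, v_on)    # regressed continuation values
        cont_off = bin_regress(x, v_off)
        v_on = [(xi - cost) + max(co, cf - switch_cost)   # earn spread, maybe switch off
                for xi, co, cf in zip(x, cont_on, cont_off)]
        v_off = [max(cf, co - switch_cost)                # stay idle or switch on
                 for co, cf in zip(cont_on, cont_off)]
    return sum(v_on) / n_paths, sum(v_off) / n_paths
```

The paper's method differs in essential ways (infinite horizon with truncation, cointegrated multi-factor dynamics, memory reduction so paths need not be stored), but the alternation of regression and backward induction shown here is the common skeleton.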
ROOT - A C++ Framework for Petabyte Data Storage, Statistical Analysis and Visualization
ROOT is an object-oriented C++ framework conceived in the high-energy physics
(HEP) community, designed for storing and analyzing petabytes of data in an
efficient way. Any instance of a C++ class can be stored into a ROOT file in a
machine-independent compressed binary format. In ROOT the TTree object
container is optimized for statistical data analysis over very large data sets
by using vertical data storage techniques. These containers can span a large
number of files on local disks, the web, or a number of different shared file
systems. In order to analyze these data, the user can choose from a wide set of
mathematical and statistical functions, including linear algebra classes,
numerical algorithms such as integration and minimization, and various methods
for performing regression analysis (fitting). In particular, ROOT offers
packages for complex data modeling and fitting, as well as multivariate
classification based on machine learning techniques. Central to these analysis
tools are the histogram classes, which provide binning of one- and
multi-dimensional data. Results can be saved in high-quality graphical formats
like PostScript and PDF, or in bitmap formats like JPG or GIF. Results can
also be stored as ROOT macros that allow a full recreation and reworking of
the graphics. Users typically create their analysis macros step by step, making use
of the interactive C++ interpreter CINT, while running over small data samples.
Once the development is finished, they can run these macros at full compiled
speed over large data sets, using on-the-fly compilation, or by creating a
stand-alone batch program. Finally, if processing farms are available, the user
can reduce the execution time of intrinsically parallel tasks - e.g. data
mining in HEP - by using PROOF, which will take care of optimally distributing
the work over the available resources in a transparent way.
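ROOT's histogram classes belong to its C++ API. As a language-neutral sketch of the fixed-binning idea they implement, the following is a minimal analogue; it is not the ROOT API, and the class name is invented for this example:

```python
class Hist1D:
    """Minimal fixed-bin 1-D histogram, illustrating the binning concept
    behind histogram classes such as ROOT's TH1 (NOT the ROOT API)."""
    def __init__(self, n_bins, lo, hi):
        self.lo, self.hi = lo, hi
        self.width = (hi - lo) / n_bins
        self.counts = [0] * n_bins
        self.underflow = self.overflow = 0

    def fill(self, x):
        """Increment the bin containing x; track out-of-range entries."""
        if x < self.lo:
            self.underflow += 1
        elif x >= self.hi:
            self.overflow += 1
        else:
            self.counts[int((x - self.lo) / self.width)] += 1

h = Hist1D(4, 0.0, 4.0)  # four unit-width bins on [0, 4)
for v in [0.5, 1.5, 1.7, 3.9, -1.0, 4.0]:
    h.fill(v)
# counts: [1, 2, 0, 1], with one underflow and one overflow entry
```

ROOT's actual classes add weighted fills, error propagation, multi-dimensional binning, fitting, and graphics on top of this basic structure.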