Validation of Geant4 nuclear reaction models for hadrontherapy and preliminary results with SMF and BLOB
Reliable nuclear fragmentation models are of utmost importance in hadrontherapy, where Monte Carlo (MC) simulations are used to compute the input parameters of the treatment planning software, to validate the deposited dose calculation, to evaluate the biological effectiveness of the radiation, to correlate the β+ emitter production in the patient body with the delivered dose, and to allow non-invasive treatment verification.
Despite its wide use, the models implemented in Geant4 have shown severe limitations in reproducing the measured secondary yields in ion interactions below 100 MeV/A, in terms of production rates and angular and energy distributions [1–3]. We will present a benchmark of the Geant4 models against double-differential cross sections and angular distributions of the secondary fragments produced in 12C fragmentation at 62 MeV/A on a thin carbon target; the benchmark includes the recently implemented model INCL++ [4,5]. Moreover, we will present the preliminary results obtained by simulating the same interaction with SMF [6] and BLOB [7]. Both SMF and BLOB are semiclassical one-body approaches to solving the Boltzmann-Langevin equation. They share an identical treatment of the mean-field propagation, based on the same effective interaction, but they differ in the way fluctuations are included.
In particular, while SMF employs a Uehling-Uhlenbeck collision term and introduces fluctuations projected onto density space, BLOB introduces fluctuations in full phase space through a modified collision term in which nucleon-nucleon correlations are explicitly involved. Both SMF and BLOB have been developed to simulate heavy-ion interactions in the Fermi-energy regime. We will show their capabilities in describing 12C fragmentation, in view of their foreseen implementation in Geant4.
Circular local likelihood
We introduce a class of local likelihood circular density estimators, which includes the kernel density estimator as a special case. The idea lies in optimizing a spatially weighted version of the log-likelihood function, where the logarithm of the density is locally approximated by a periodic polynomial. The use of von Mises density functions as weights reduces the computational burden. Also, we propose closed-form estimators which could form the basis of counterparts in the multidimensional Euclidean setting. Simulation results and a real data case study are used to evaluate the performance and illustrate the results.
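As a rough illustration of the special case mentioned in the abstract, a degree-zero local fit with von Mises weights reduces to a circular kernel density estimator. The sketch below is ours, not the paper's code, and the concentration parameter kappa simply plays the role of an inverse bandwidth:

```python
import numpy as np

def vm_kde(theta_grid, data, kappa):
    """Circular kernel density estimate with a von Mises kernel.

    A minimal sketch of the degree-zero special case, assuming angles
    in radians; not the paper's general local-likelihood estimator.
    """
    # difference between every grid point and every observation
    diffs = theta_grid[:, None] - data[None, :]
    # von Mises kernel centred at each observation; np.i0 is the
    # modified Bessel function I_0 that normalises the kernel
    k = np.exp(kappa * np.cos(diffs)) / (2 * np.pi * np.i0(kappa))
    # average the kernels over the sample
    return k.mean(axis=1)
```

Larger kappa concentrates the kernel and acts like a smaller bandwidth.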
A note on nonparametric estimation of circular conditional densities
The conditional density offers the most informative summary of the relationship between explanatory and response variables. We need to estimate it in place of the simple conditional mean when its shape is not well behaved. A motivation for estimating conditional densities, specific to the circular setting, lies in the fact that a natural alternative, quantile regression, could be considered problematic because circular quantiles are not rotationally equivariant. We treat conditional density estimation as a local polynomial fitting problem, as proposed by Fan et al. (1996) in the Euclidean setting, and discuss a class of estimators in the cases when the conditioning variable is either circular or linear. Asymptotic properties for some members of the proposed class are derived. The effectiveness of the methods for finite sample sizes is illustrated by simulation experiments and an example using real data.
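A minimal sketch of the local-constant (degree-zero) member of such a class, for a circular response and a linear conditioning variable, might look as follows; the function name and the kernel choices (Gaussian in the covariate, von Mises in the response) are our illustrative assumptions, not the paper's specification:

```python
import numpy as np

def circ_cond_density(theta_grid, x0, theta, x, kappa, h):
    """Local-constant estimate of the density of a circular response
    Theta given a linear covariate X = x0.

    A hedged sketch only: kappa is the von Mises concentration in the
    response direction, h the Gaussian bandwidth in the covariate.
    """
    # Gaussian kernel weights in the linear conditioning variable
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    # von Mises kernel in the circular response direction
    k = np.exp(kappa * np.cos(theta_grid[:, None] - theta[None, :]))
    k /= 2 * np.pi * np.i0(kappa)
    # weighted average of normalised response kernels
    return (k * w).sum(axis=1) / w.sum()
```

Because each response kernel integrates to one and the covariate weights are normalised, the estimate is itself a proper density on the circle.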
Nonparametric estimating equations for circular probability density functions and their derivatives
We propose estimating equations whose unknown parameters are the values taken by a circular density and its derivatives at a point. Specifically, we solve equations which relate local versions of population trigonometric moments to their sample counterparts. Major advantages of our approach are: higher-order bias without asymptotic variance inflation, closed form for the estimators, and absence of numerical tasks. We also investigate situations where the observed data are dependent. Theoretical results along with simulation experiments are provided.
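To make the idea of matching population trigonometric moments with sample counterparts concrete, here is a generic moment-matching illustration (a truncated Fourier-series density built from the sample moments); it is not the paper's local estimating-equation estimator, only the simplest global analogue:

```python
import numpy as np

def trig_moments(data, max_order):
    """Sample trigonometric moments a_r = mean cos(r*theta),
    b_r = mean sin(r*theta), for r = 1..max_order."""
    r = np.arange(1, max_order + 1)
    return (np.cos(np.outer(r, data)).mean(axis=1),
            np.sin(np.outer(r, data)).mean(axis=1))

def fourier_density(theta_grid, data, max_order):
    """Truncated Fourier series whose coefficients are the sample
    trigonometric moments: a global moment-matching sketch only."""
    a, b = trig_moments(data, max_order)
    r = np.arange(1, max_order + 1)
    return (1.0 / (2 * np.pi)
            + (np.cos(np.outer(theta_grid, r)) @ a
               + np.sin(np.outer(theta_grid, r)) @ b) / np.pi)
```

By construction the series integrates to one, though, unlike a kernel estimate, it is not guaranteed to be nonnegative.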
Power and accounting: Guglielmo Gonzaga and Angelo Pietra (1586-87)
The judicial proceedings brought in Mantua against the prelate Camillo Luzzara and the Roman Bernardino Pia in 1586 would reveal the unrest at the ducal court against the government of the third duke, Guglielmo Gonzaga. One of the contributing factors was the revision of accounting procedures that the duke entrusted to the Genoese monk Angelo Pietra, a reform that formed part of a plan intended to restore the ducal finances and to guarantee control over the court's finances. This article analyses the characteristics of the new system, based on double-entry bookkeeping, and considers it a notable manifestation of the concentration of power driven by Guglielmo Gonzaga.
A schlieren method for ultra-low angle light scattering measurements
We describe a self-calibrating optical technique that makes it possible to perform absolute measurements of scattering cross sections for light scattered at extremely small angles. Very good performance is obtained with a very simple optical layout similar to that used for the schlieren method, a technique traditionally used for mapping local refractive index changes. The scattered intensity distribution is recovered by a statistical analysis of the random interference between the light scattered in a half-plane of the scattering wave vectors and the main transmitted beam. High-quality data can be obtained by proper statistical accumulation of scattered intensity frames, and the static stray light contributions can be eliminated rigorously. The capabilities of the method are tested in a scattering experiment on non-equilibrium fluctuations during a free diffusion experiment. Contributions of light scattered from length scales as large as Lambda = 1 mm can be accurately determined.
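The statistical accumulation and stray-light rejection described above can be sketched generically: differencing consecutive frames cancels any static contribution exactly, and averaging the power spectra of the differences accumulates the scattered-intensity distribution. This is our simplified reading, with the instrument calibration omitted:

```python
import numpy as np

def scattering_spectrum(frames):
    """Estimate a scattered-intensity power spectrum from a stack of
    near-field images (shape: n_frames x ny x nx).

    A hedged sketch of frame differencing plus spectral accumulation;
    the absolute calibration of the real instrument is not reproduced.
    """
    frames = np.asarray(frames, dtype=float)
    # differences of consecutive frames cancel static stray light exactly
    diffs = np.diff(frames, axis=0)
    # accumulate the 2-D power spectrum of the difference images
    spec = np.mean(np.abs(np.fft.fft2(diffs, axes=(-2, -1))) ** 2, axis=0)
    # shift zero wave vector to the centre for display
    return np.fft.fftshift(spec)
```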
Neutron-induced Fission Cross Section of 240,242Pu
A sensitivity analysis for the new generation of fast reactors [Salvatores (2008)] has shown the importance of improved cross section data for several actinides. Among them, the 240,242Pu(n,f) cross sections require an accuracy improvement to 1-3% and 3-5%, respectively, from the current levels of 6% and 20%. At the Van de Graaff facility of the Institute for Reference Materials and Measurements (JRC-IRMM) the fission cross section of the two isotopes was measured relative to two secondary standard reactions, 237Np(n,f) and 238U(n,f), using a twin Frisch-grid ionization chamber. The secondary standard reactions were benchmarked through measurements against the primary standard reaction 235U(n,f) in the same geometry. Sample masses were determined by means of low-geometry alpha counting and/or a 2π Frisch-grid ionization chamber, with an uncertainty lower than 2%. The neutron flux and the impact of scattering from material between source and target were examined, the largest effect having been found in cross-section ratio measurements between a fissile and a fertile isotope. Our 240,242Pu(n,f) cross sections are in agreement with previous experimental results and slightly lower than present evaluations. In the case of the 242Pu(n,f) reaction no evidence for a resonance at En = 1.1 MeV was found.
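The ratio method underlying such a relative measurement reduces, in its idealised form, to scaling the standard's evaluated cross section by the ratio of fission counts per target atom. The sketch below is only this idealised arithmetic; detection efficiencies, dead time, and the scattering corrections discussed above are deliberately omitted:

```python
def relative_cross_section(counts, counts_ref, atoms, atoms_ref, sigma_ref):
    """Idealised ratio method for a cross section measured relative to
    a standard reaction in the same neutron flux.

    counts, counts_ref: background-corrected fission counts
    atoms, atoms_ref:   number of target atoms in each deposit
    sigma_ref:          evaluated cross section of the standard (barn)
    """
    # sigma = sigma_ref * (counts / atoms) / (counts_ref / atoms_ref)
    return sigma_ref * (counts / counts_ref) * (atoms_ref / atoms)
```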