924 research outputs found
Recent developments in GEANT4
Affiliation: Depaola, Gerardo Osvaldo. Universidad Nacional de Córdoba. Facultad de Matemática, Astronomía, Física y Computación; Argentina.
GEANT4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of GEANT4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.
The recent upgrades in the "standard" electromagnetic physics package
The current status and recent developments of the Geant4 "standard" electromagnetic package are presented. The design iteration of the package carried out over the last two years is complete; it provides a model-versus-process structure of the code. An internal database of elements and materials based on the NIST databases has also been introduced into the Geant4 toolkit. The focus of recent activities is on the upgrade of physics models and on the validation of simulation results. Significant revisions were made to the ionisation models, the transition-radiation models, and the multiple-scattering models, which are presented in this work. The evolution of the acceptance suite is also discussed.
Universality Class of Models
We point out that existing numerical data on the correlation length and
magnetic susceptibility suggest that the two dimensional model with
standard action has critical exponent , which is inconsistent with
asymptotic freedom. This value of is also different from the one of the
Wess-Zumino-Novikov-Witten model that is supposed to correspond to the
model at .
Comment: 8 pages, with 3 figures included, postscript. An error concerning the errors has been corrected.
Postoperative peri-axillary seroma following axillary artery cannulation for surgical treatment of acute type A aortic dissection
The arterial cannulation site for optimal tissue perfusion and cerebral protection during cardiopulmonary bypass (CPB) for surgical treatment of acute type A aortic dissection remains controversial. Right axillary artery cannulation confers significant advantages, because it provides antegrade arterial perfusion during cardiopulmonary bypass, and allows continuous antegrade cerebral perfusion during hypothermic circulatory arrest, thereby minimizing global cerebral ischemia. However, right axillary artery cannulation has been associated with serious complications, including problems with systemic perfusion during cardiopulmonary bypass, problems with postoperative patency of the artery due to stenosis, thrombosis or dissection, and brachial plexus injury. We herein present the case of a 36-year-old Caucasian man with known Marfan syndrome and acute type A aortic dissection, who had direct right axillary artery cannulation for surgery of the ascending aorta. Postoperatively, the patient developed an axillary perigraft seroma. As this complication has not, to our knowledge, been reported before in cardiothoracic surgery, we describe this unusual complication and discuss conservative and surgical treatment options.
Percolation properties of the 2D Heisenberg model
We analyze the percolation properties of certain clusters defined on
configurations of the 2-dimensional Heisenberg model. We find that, given any
direction \vec{n} in O(3) space, the spins almost perpendicular to \vec{n} form
a percolating cluster. This result gives indications of how the model can avoid
a previously conjectured Kosterlitz-Thouless phase transition at finite
temperature T.
Comment: 4 pages, 3 eps figures. Revised version (clearer abstract, some new references).
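The cluster definition above can be made concrete with a toy numerical sketch. The snippet below uses independent random unit spins rather than genuine Heisenberg configurations (which would require a Monte Carlo update such as the Wolff cluster algorithm), so it only illustrates how "spins almost perpendicular to \vec{n}" are selected and how their nearest-neighbour clusters are measured; the lattice size, the tolerance `eps`, and the direction `n` are illustrative choices, not values from the paper.

```python
import numpy as np

def random_unit_spins(L, rng):
    """Independent random unit vectors on the sphere: a toy stand-in for
    thermalized 2D Heisenberg configurations."""
    v = rng.normal(size=(L, L, 3))
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def largest_cluster_fraction(mask):
    """Fraction of all lattice sites in the largest nearest-neighbour
    cluster of True sites (open boundaries), found by flood fill."""
    L = mask.shape[0]
    seen = np.zeros_like(mask, dtype=bool)
    best = 0
    for i in range(L):
        for j in range(L):
            if mask[i, j] and not seen[i, j]:
                stack, size = [(i, j)], 0
                seen[i, j] = True
                while stack:
                    x, y = stack.pop()
                    size += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nx, ny = x + dx, y + dy
                        if 0 <= nx < L and 0 <= ny < L \
                                and mask[nx, ny] and not seen[nx, ny]:
                            seen[nx, ny] = True
                            stack.append((nx, ny))
                best = max(best, size)
    return best / mask.size

rng = np.random.default_rng(0)
L, eps = 64, 0.3
spins = random_unit_spins(L, rng)
n = np.array([0.0, 0.0, 1.0])       # the direction \vec{n}
mask = np.abs(spins @ n) < eps      # spins almost perpendicular to \vec{n}
# For independent spins the occupied fraction is ~eps (~0.3 here), below
# the square-lattice site-percolation threshold (~0.593), so no large
# cluster is expected; correlated Heisenberg configurations behave
# differently, which is the point of the paper.
print(largest_cluster_fraction(mask))
```

The correlations generated by the Heisenberg action are exactly what the toy model omits, so this sketch is a baseline against which the paper's percolation claim can be appreciated, not a reproduction of it.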
Finite-size scaling of the helicity modulus of the two-dimensional O(3) model
Using Monte Carlo methods, we compute the finite-size scaling function of the
helicity modulus of the two-dimensional O(3) model and compare it to
the low temperature expansion prediction. From this, we estimate the range of
validity for the leading terms of the low temperature expansion of the
finite-size scaling function and for the low temperature expansion of the
correlation length. Our results strongly suggest that a Kosterlitz-Thouless
transition at a temperature is extremely unlikely in this model.
Comment: 4 pages, 3 Postscript figures, to appear in Phys. Rev. B Jan. 1997 as a Brief Report.
A test of Local Realism with entangled kaon pairs and without inequalities
We propose the use of entangled pairs of neutral kaons, considered as a
promising tool to close the well known loopholes affecting generic Bell's
inequality tests, in a specific Hardy-type experiment. Hardy's contradiction
without inequalities between Local Realism and Quantum Mechanics can be
translated into a feasible experiment by requiring ideal detection efficiencies
for only one of the observables to be alternatively measured. Neutral kaons are
near to fulfil this requirement and therefore to close the efficiency loophole.
Comment: 4 RevTeX pages.
Studying Indirect Violation of CP, T and CPT in a B-factory
In this work we analyze the observable asymmetries one can build from
entangled B-meson states, in order to extract information on the parameters
epsilon and delta which govern indirect violation of discrete symmetries. The
traditionally proposed observables, based on flavour tags, are not helpful for
the study of the Bd-system, where the tiny value of the width difference
between physical states clears up such asymmetry effects. Our study makes
instead use of CP tags in order to build new asymmetries where the different
parameters can be separated out. For this separation, it is decisive to achieve
a good time resolution in the measurement of entangled state decays.
Nevertheless, even with no temporal information, as would be the case in a
symmetric factory, it is still possible to extract some information on the
symmetries of the system. We discuss both genuine and non-genuine observables,
depending on whether or not absorptive parts can mimic asymmetry effects.
Comment: 18 pages, to appear in Nucl. Phys. B; some minor corrections included, additional discussion added to some sections, references completed.
Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts
Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state of the art and improve the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project was carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the Lawrence Livermore National Laboratory and EPRI landmark PSHA studies of the 1980s, and examined ways to improve on the present state of the art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation on state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.
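The "likelihood of exceedance in a future time period" that PSHA estimates is conventionally obtained by combining an annual exceedance rate from the hazard curve with a Poisson occurrence model. A minimal sketch, where the 1/475-per-year rate is an illustrative hazard-curve value (it is the rate behind the common "10% in 50 years" design level), not a number from this report:

```python
import math

def prob_exceedance(annual_rate, years):
    """Poisson probability that ground motion exceeding a given level
    occurs at least once in a window of `years` years:
    P = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-annual_rate * years)

# Illustrative hazard-curve point: annual exceedance rate of 1/475
# (i.e., a 475-year return period).
rate = 1.0 / 475.0
p50 = prob_exceedance(rate, 50.0)
print(f"P(exceedance in 50 yr) = {p50:.3f}")  # -> about 0.100
```

The Poisson assumption of time-independent occurrence is standard in PSHA; the epistemic uncertainty the report is concerned with enters upstream, in the expert-derived models that produce the annual rate itself.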
Intrinsic CPT violation and decoherence for entangled neutral mesons
We present a combined treatment of quantum-gravity-induced effects and
intrinsic CPT violation in entangled neutral-Kaon states. Our analysis takes
into consideration two types of effects: first, those associated with the loss
of particle-antiparticle identity, as a result of the ill-defined nature of the
CPT operator, and second, effects due to the non-unitary evolution of the Kaons
in the space-time foam. By studying a variety of phi-factory observables,
involving identical as well as general final states, we derive analytical
expressions, to leading order in the associated CPT violating parameters, for
double-decay rates and their time-integrated counterparts. Our analysis shows
that the various types of the aforementioned effects may be disentangled
through judicious combinations of appropriate observables in a phi factory.
Comment: 31 pages, RevTeX; nine eps figures incorporated. Journal version (typos and latex errors corrected, discussion augmented, no changes in conclusions).
- …