Analytical QCD and multiparticle production
We review the perturbative approach to multiparticle production in hard
collision processes. We investigate to what extent parton-level analytical
calculations with a low momentum cut-off can reproduce experimental data on the
hadronic final state. Systematic results are available for various observables
with next-to-leading-logarithmic accuracy (the so-called modified
leading-logarithmic approximation, MLLA). We introduce the analytical formalism and
then discuss recent applications concerning multiplicities, inclusive spectra,
correlations and angular flows in multi-jet events. In various cases the
perturbative picture is surprisingly successful, even for very soft particle
production. Comment: 97 pages, LaTeX, 22 figures, uses sprocl.sty (included)
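The characteristic MLLA-type growth of multiplicities with the hard scale can be illustrated numerically. The sketch below uses only the double-logarithmic asymptotic form $N \propto \exp\!\big(c\sqrt{\ln(Q/\Lambda)}\big)$, with illustrative values for $c$, the normalization, and $\Lambda$ (these are placeholders, not fitted MLLA parameters):

```python
import math

def mean_multiplicity(Q_gev, lam=0.25, c=2.0, norm=0.05):
    """Toy double-log estimate of the mean multiplicity at scale Q:
    N ~ norm * exp(c * sqrt(ln(Q/lam))).
    c, norm, and lam are illustrative placeholders."""
    y = math.log(Q_gev / lam)
    return norm * math.exp(c * math.sqrt(y))

# The multiplicity rises faster than any power of ln(Q),
# but more slowly than any power of Q itself.
for q in (10.0, 91.2, 1000.0):
    print(q, round(mean_multiplicity(q), 2))
```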
QCD Constituent Counting Rules for Neutral Vector Mesons
QCD constituent counting rules define the scaling behavior of exclusive
hadronic scattering and electromagnetic scattering amplitudes at high momentum
transfer in terms of the total number of fundamental constituents in the
initial and final states participating in the hard subprocess. The scaling laws
reflect the twist of the leading Fock state for each hadron and hence the
leading operator that creates the composite state from the vacuum. Thus, the
constituent counting scaling laws can be used to identify the twist of exotic
hadronic candidates such as tetraquarks and pentaquarks. Effective field
theories must consistently implement the scaling rules in order to be
consistent with the fundamental theory. Here we examine how one can apply the
constituent counting rules to the exclusive production of one or two neutral
vector mesons $V$ in $e^+e^-$ annihilation, processes in which the $V$ can
couple via intermediate photons. In the case of a (narrow) real $V$, the photon
virtuality is fixed to a precise value $q^2 = m_V^2$, in effect treating
the $V$ as a single fundamental particle. Each real $V$ thus contributes to
the constituent counting rules with $n_V = 1$. In effect, the leading
operator underlying the $V$ has twist 1. Thus, in the specific physical case
of single or double on-shell $V$ production via intermediate photons, the
predicted scaling from counting rules coincides with Vector Meson Dominance
(VMD), an effective theory that treats the $V$ as an elementary field. However,
the VMD prediction fails in the general case where the $V$ is not coupled
through an elementary photon field, and then the leading-twist interpolating
operator has twist 2. Analogous effects appear in scattering
processes. Comment: 15 pages
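The scaling content of the counting rules is simple enough to tabulate. A minimal sketch, assuming the standard fixed-angle form $d\sigma/dt \sim s^{2-n}$ with $n$ the total number of elementary fields in the initial and final states:

```python
def fixed_angle_exponent(n_total):
    """Constituent counting rule: at fixed angle, dsigma/dt ~ s**(2 - n),
    where n counts all elementary fields in initial + final states."""
    return 2 - n_total

# e+ e- -> V V with each V treated as elementary (twist 1, as in VMD):
print(fixed_angle_exponent(2 + 2))   # -2, i.e. dsigma/dt ~ s^-2
# e+ e- -> V V with each V resolved as a q-qbar pair (twist 2):
print(fixed_angle_exponent(2 + 4))   # -4, i.e. dsigma/dt ~ s^-4
```

The two extra powers of $1/s$ between the two cases are exactly the "additional falloff due to QCD compositeness" that distinguishes a composite vector meson from the VMD elementary-field picture.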
QCD Compositeness as Revealed in Exclusive Vector Boson Reactions through Double-Photon Annihilation: $e^+e^- \to \gamma\gamma^* \to \gamma V^0$ and $e^+e^- \to \gamma^*\gamma^* \to V^0 V^0$
We study the exclusive double-photon annihilation processes
$e^+e^- \to \gamma\gamma^* \to \gamma V^0$ and
$e^+e^- \to \gamma^*\gamma^* \to V^0 V^0$, where the $V^0$ is a neutral vector
meson produced in the forward kinematical region. We show how the differential
cross sections $d\sigma/dt$, as predicted by QCD, have additional falloff in
the momentum transfer squared $t$ due to the QCD compositeness of the hadrons,
consistent with the leading-twist fixed-angle scaling laws. However, even
though they are exclusive channels and not associated with the conventional
electron-positron annihilation process, these total cross sections, integrated
over the dominant forward- and backward-angular domains, scale as $1/s$, and
thus contribute to the leading-twist scaling behavior of the ratio
$R_{e^+e^-}$. We generalize these results to exclusive double-electroweak
vector-boson annihilation processes accompanied by the forward production of
hadrons. These results can also be applied to the exclusive production of
exotic hadrons such as tetraquarks, where the cross-section scaling behavior
can reveal their multiquark nature. Comment: 10 pages
Models of neutron star atmospheres enriched with nuclear burning ashes
Low-mass X-ray binaries hosting neutron stars (NS) exhibit thermonuclear
(type-I) X-ray bursts, which are powered by unstable nuclear burning of helium
and/or hydrogen into heavier elements deep in the NS "ocean". In some cases the
burning ashes may rise from the burning depths up to the NS photosphere by
convection, leading to the appearance of the metal absorption edges in the
spectra, which then force the emergent X-ray burst spectra to shift toward
lower energies. These effects may have a substantial impact on the color
correction factor $f_{\rm c}$ and the dilution factor $w$, the parameters of
the diluted blackbody model $F_E \approx w\,B_E(f_{\rm c} T_{\rm eff})$ that is
commonly used to describe the emergent spectra from NSs. The aim of this paper is to quantify
how much the metal enrichment can change these factors. We have developed a new
NS atmosphere modeling code, with a few important improvements over our
previous code that are required by the inclusion of the metals. The opacities and the
internal partition functions (used in the ionization fraction calculations) are
now taken into account for all atomic species. In addition, the code is now
parallelized to counter the increased computational load. We compute a detailed
grid of atmosphere models with different exotic chemical compositions that
mimic the presence of the burning ashes. From the emerging model spectra we
compute the color correction factors $f_{\rm c}$ and the dilution factors $w$ that
can then be compared to the observations. We find that the metals may change
$f_{\rm c}$ by up to about 40%, which is enough to explain the scatter seen in the
blackbody radius measurements. The presented models open up the possibility for
determining NS mass and radii more accurately, and may also act as a tool to
probe the nuclear burning mechanisms of X-ray bursts. Comment: 14 pages, 7 figures, to be published in A&
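The diluted-blackbody parameterization, commonly written as $F_E \approx w\,B_E(f_{\rm c} T_{\rm eff})$, can be sketched directly. In this sketch the Planck function carries an arbitrary normalization and the numbers are illustrative, not output of the atmosphere models:

```python
import math

def planck(E_keV, kT_keV):
    """Planck function B_E up to an arbitrary normalization."""
    return E_keV ** 3 / math.expm1(E_keV / kT_keV)

def diluted_blackbody(E_keV, kT_eff_keV, f_c, w):
    """Diluted blackbody F_E ~ w * B_E(f_c * T_eff): the color-correction
    factor f_c hardens the spectrum, the dilution factor w rescales it."""
    return w * planck(E_keV, f_c * kT_eff_keV)

# Hardening: above the spectral peak, f_c = 1.5 boosts the flux relative
# to f_c = 1.0 at the same effective temperature and dilution.
print(diluted_blackbody(10.0, 2.0, 1.5, 0.5) > diluted_blackbody(10.0, 2.0, 1.0, 0.5))
```

Since the fitted blackbody temperature is $T_{\rm bb} \approx f_{\rm c} T_{\rm eff}$, blackbody-radius fits scale as $R_{\rm bb} \propto f_{\rm c}^{-2}$, so a ~40% shift in $f_{\rm c}$ feeds directly into the scatter of the radius measurements mentioned above.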
Lepton flavor violating decays of vector mesons
We estimate the rates of lepton flavor violating decays of vector mesons. The
theoretical tools are based on an effective Lagrangian approach without
referring to any specific realization of the physics beyond the standard model
responsible for lepton flavor violation (LFV). The effective lepton-vector-meson
couplings are extracted from the existing experimental bounds on nuclear
$\mu^- \to e^-$ conversion. In particular, we derive an upper limit for the LFV
branching ratio which is much more stringent than the recent experimental
result presented by the SND Collaboration. The very stringent limits on LFV
decays of vector mesons derived in this letter make direct experimental
observation of these processes unrealistic. Comment: 3 pages, 1 figure, accepted for publication in Phys. Rev.
BiofilmQuant: A Computer-Assisted Tool for Dental Biofilm Quantification
Dental biofilm is the deposition of microbial material over a tooth
substratum. Several methods for biofilm quantification have recently been
reported in the literature; however, at best they provide a partially automated
solution requiring significant input from a human expert. In contrast,
state-of-the-art automatic biofilm quantification methods fail to make their
way into clinical practice because they lack an effective mechanism for
incorporating human input to handle misclassified regions. Manual
delineation, the current gold standard, is time consuming and subject to expert
bias. In this paper, we introduce a new semi-automated software tool,
BiofilmQuant, for dental biofilm quantification in quantitative light-induced
fluorescence (QLF) images. The software uses a robust statistical modeling
approach to automatically segment the QLF image into three classes (background,
biofilm, and tooth substratum) based on the training data. This initial
segmentation has shown a high degree of consistency and precision on more than
200 test QLF dental scans. Further, the proposed software gives clinicians
full control to fix any misclassified areas with a single click. In
addition, BiofilmQuant also provides a complete solution for the longitudinal
quantitative analysis of biofilm across the full set of teeth, providing
greater ease of use. Comment: 4 pages, 4 figures, 36th Annual International
Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2014)
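The longitudinal quantification described above ultimately reduces to an area ratio per tooth. A minimal sketch using the paper's three segmentation classes; the helper function itself is hypothetical, not BiofilmQuant code:

```python
def biofilm_coverage(labels):
    """Percent of the visible tooth surface covered by biofilm, given
    per-pixel (or per-superpixel) class labels; background is ignored."""
    biofilm = sum(1 for c in labels if c == "biofilm")
    tooth = sum(1 for c in labels if c == "tooth")
    surface = biofilm + tooth
    return 100.0 * biofilm / surface if surface else 0.0

# One biofilm region out of three on-tooth regions -> ~33% coverage.
print(biofilm_coverage(["biofilm", "tooth", "tooth", "background"]))
```

Tracking this percentage per tooth across visits gives the longitudinal analysis on a continuous scale rather than the five-level commercial indices.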
A Statistical Modeling Approach to Computer-Aided Quantification of Dental Biofilm
Biofilm is a formation of microbial material on tooth substrata. Several
methods to quantify dental biofilm coverage have recently been reported in the
literature, but at best they provide a semi-automated approach to
quantification, with significant input from a human grader that carries the
grader's bias about what constitutes foreground, background, biofilm, and tooth.
Additionally, human assessment indices limit the resolution of the
quantification scale; most commercial scales use five levels of quantification
for biofilm coverage (0%, 25%, 50%, 75%, and 100%). On the other hand, current
state-of-the-art techniques in automatic plaque quantification fail to make
their way into practical applications owing to their inability to incorporate
human input to handle misclassifications. This paper proposes a new interactive
method for biofilm quantification in quantitative light-induced fluorescence
(QLF) images of canine teeth that is independent of the perceptual bias of the
grader. The method partitions a QLF image into segments of uniform texture and
intensity called superpixels; every superpixel is statistically modeled as a
realization of a single 2D Gaussian Markov random field (GMRF) whose parameters
are estimated; the superpixel is then assigned to one of three classes
(background, biofilm, tooth substratum) based on the training set of data. The
quantification results show a high degree of consistency and precision. At the
same time, the proposed method gives pathologists full control to post-process
the automatic quantification by flipping misclassified superpixels to a
different state (background, tooth, biofilm) with a single click, providing
greater usability than simply marking the boundaries of biofilm and tooth as
done by current state-of-the-art methods. Comment: 10 pages, 7 figures, Journal of Biomedical and Health Informatics
2014
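The per-superpixel decision described above can be sketched with a simplified stand-in: a 1-D Gaussian per class on a scalar feature, instead of the paper's 2-D GMRF texture model, with made-up class statistics standing in for parameters estimated from training data:

```python
import math

def gaussian_loglik(x, mean, var):
    """Log-likelihood of a scalar feature under a 1-D Gaussian."""
    return -0.5 * (math.log(2.0 * math.pi * var) + (x - mean) ** 2 / var)

def classify_superpixel(feature, class_params):
    """Assign a superpixel to the class whose Gaussian gives the highest
    log-likelihood; class_params maps class name -> (mean, variance)."""
    return max(class_params, key=lambda c: gaussian_loglik(feature, *class_params[c]))

# Illustrative (made-up) class statistics for a mean-QLF-intensity feature:
params = {"background": (10.0, 25.0), "tooth": (200.0, 100.0), "biofilm": (120.0, 400.0)}
print(classify_superpixel(130.0, params))  # -> biofilm
```

A misclassified superpixel is then corrected by simply overwriting its label in the output map, which is the single-click "flip" interaction the paper describes.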