Why the xE distribution triggered by a leading particle does not measure the fragmentation function but does measure the ratio of the transverse momenta of the away-side jet to the trigger-side jet
Hard-scattering of point-like constituents (or partons) in p-p collisions was
discovered at the CERN-ISR in 1972 by measurements utilizing inclusive single
hadrons or hadron pairs with large transverse momentum (pT). It was generally
assumed, following Feynman, Field and Fox, as shown by data from the CERN-ISR
experiments, that the xE distribution of away-side hadrons from a single-particle
trigger [with transverse momentum pTt], corrected for the mean fragmentation
fraction <z> of the trigger, would be the same as that from a jet trigger and
follow the same fragmentation function as observed in e+e- or DIS. PHENIX
attempted to measure the fragmentation function from the away-side xE
distribution of charged particles triggered by a pi0 in p-p collisions at
RHIC and showed by explicit calculation that the xE distribution is actually
quite insensitive to the fragmentation function. Illustrations of the original
arguments and ISR results will be presented. Then the lack of sensitivity to
the fragmentation function will be explained, and an analytic formula for the
xE distribution given, in terms of incomplete Gamma functions, for the case
where the fragmentation function is exponential. The away-side distribution in
this formulation has the nice property that it both exhibits xE scaling and
is directly sensitive to the ratio of the transverse momentum of the away-side
jet to that of the trigger jet, and thus can be used, for example, to measure
the relative energy loss of the two jets from a hard-scattering which escape
from the medium in A+A collisions. Comparisons of the analytical formula to
RHIC measurements will be presented, including data from STAR and PHENIX,
leading to some interesting conclusions.
Comment: 6 pages, 5 figures, Proceedings of Poster Session, 19th International
Conference on Ultra-Relativistic Nucleus-Nucleus Collisions (Quark Matter
2006), November 14-20, 2006, Shanghai, P. R. China
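The xE variable in the title above has a standard definition: the projection of the associated particle's transverse momentum onto the trigger axis, normalized by the trigger pT. A minimal sketch (function and variable names are illustrative, not from the paper):

```python
import math

def x_e(pt_trig, pt_assoc, dphi):
    """xE = -(p_Ta . p_Tt) / |p_Tt|^2 = -(pt_assoc / pt_trig) * cos(dphi):
    the away-side momentum fraction projected onto the trigger axis."""
    return -(pt_assoc / pt_trig) * math.cos(dphi)

# An away-side particle (dphi ~ pi) carrying half the trigger pT:
print(round(x_e(6.0, 3.0, math.pi), 3))  # -> 0.5
```

The minus sign makes xE positive for away-side (back-to-back) pairs, which is why the distribution probes the away-side jet.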
Model of Centauro and strangelet production in heavy ion collisions
We discuss the phenomenological model of Centauro event production in
relativistic nucleus-nucleus collisions. This model makes quantitative
predictions for kinematic observables, baryon number and mass of the Centauro
fireball and its decay products. Centauros decay mainly to nucleons, strange
hyperons and possibly strangelets. Simulations of Centauro events for the
CASTOR detector in Pb-Pb collisions at LHC energies are performed. The
signatures of these events are discussed in detail.
Comment: 19 pages, LaTeX+revtex4, 14 eps-figures and 3 tables
A Next-to-Leading-Order Study of Dihadron Production
The production of pairs of hadrons in hadronic collisions is studied using a
next-to-leading-order Monte Carlo program based on the phase space slicing
technique. Up-to-date fragmentation functions based on fits to LEP data are
employed, together with several versions of current parton distribution
functions. Good agreement is found with data for the dihadron mass
distribution. A comparison is also made with data for the dihadron angular
distribution. The scale dependence of the predictions and the dependence on the
choices made for the fragmentation and parton distribution functions are also
presented. The good agreement between theory and experiment is contrasted to
the case for single pi0 production, where significant deviations between
theory and experiment have been observed.
Comment: 22 pages, 15 figures; 3 references added, one figure modified for
clarity
Results from RHIC with Implications for LHC
Results from the PHENIX experiment at RHIC in p-p and Au+Au collisions are
reviewed from the perspective of measurements in p-p collisions at the CERN-ISR
which serve as a basis for many of the techniques used. Issues such as J/Psi
suppression and hydrodynamical flow in A+A collisions require data from
LHC-Ions for an improved understanding. Suppression of high pT particles in
Au+Au collisions, first observed at RHIC, also has unresolved mysteries such as
the equality of the suppression of inclusive pi0 (from light quarks and gluons)
and direct-single electrons (from the decay of heavy quarks) in the transverse
momentum range 4< pT < 9 GeV/c. This disfavors a radiative explanation of
suppression and leads to a fundamental question of whether the Higgs boson
gives mass to fermions. Observation of an exponential distribution of direct
photons in central Au+Au collisions for 1< pT <2 GeV/c where hard-processes are
negligible and with no similar exponential distribution in p-p collisions
indicates thermal photon emission from the medium at RHIC, making PHENIX at the
moment ``the hottest experiment in Physics''.
Comment: Invited lectures at the International School of Subnuclear Physics,
47th Course, ``The most unexpected at LHC and the status of High Energy
Frontier'', Erice, Sicily, Italy, August 29-September 7, 2009. 32 pages, 22
figures
Probing jet properties via two particle correlation method
The formulae for calculating the jet fragmentation momentum, jT, and the conditional yield are discussed in the
two-particle correlation framework. Additional corrections are derived to
account for the limited detector acceptance and inefficiency, for cases when
the event mixing technique is used. The validity of our approach is confirmed
with Monte Carlo simulation.
Comment: Proceedings for the HotQuarks2004 conference. 11 pages, 8 figures,
corrected for typos
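The event-mixing technique mentioned above corrects pair distributions for limited detector acceptance by dividing the same-event pair distribution by one built from particles taken from different events. A minimal sketch of that standard correction (names and binning are illustrative, not from the paper):

```python
import numpy as np

def corrected_correlation(same_pairs, mixed_pairs, nbins=36):
    """Acceptance-corrected azimuthal correlation C(dphi) = S(dphi) / M(dphi).

    same_pairs, mixed_pairs: arrays of pair dphi values in [-pi, pi).
    M is normalized to unit mean so that C keeps the scale of S;
    mixed-event pairs carry the acceptance shape but no physics correlation."""
    edges = np.linspace(-np.pi, np.pi, nbins + 1)
    S, _ = np.histogram(same_pairs, bins=edges)
    M, _ = np.histogram(mixed_pairs, bins=edges)
    M = M / M.mean()                     # normalize the acceptance shape
    return np.where(M > 0, S / M, 0.0), edges
```

For a uniform acceptance M is flat and C simply reproduces S; any acceptance holes present in both S and M divide out.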
Comment on "Why quantum mechanics cannot be formulated as a Markov process"
In the paper with the above title, D. T. Gillespie [Phys. Rev. A 49, 1607,
(1994)] claims that the theory of Markov stochastic processes cannot provide an
adequate mathematical framework for quantum mechanics. In conjunction with the
specific quantum dynamics considered there, we give a general analysis of the
associated dichotomic jump processes. If we assume that Gillespie's
"measurement probabilities" \it are \rm the transition probabilities of a
stochastic process, then the process must have an invariant (time independent)
probability measure. Alternatively, if we demand the probability measure of the
process to follow the quantally implemented (via the Born statistical
postulate) evolution, then we arrive at the jump process which can be
interpreted as a Markov process if restricted to a suitable duration time.
However, there is no corresponding Markov process consistent with the
event space assumption, if we require its existence for all times t.
Comment: LaTeX file, resubmitted to Phys. Rev.
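The obstruction described above can be checked numerically: a time-homogeneous Markov process must satisfy the Chapman-Kolmogorov composition T(s)T(t) = T(s+t), and two-state "transition matrices" built from quantum measurement probabilities of the cos^2 form fail it. A minimal sketch (omega = 1 is an arbitrary choice; the matrix form is a simplified stand-in for Gillespie's setup, not taken from the paper):

```python
import numpy as np

def T(t, omega=1.0):
    """Two-state matrix of quantum 'measurement probabilities'
    p(stay) = cos^2(omega*t/2), p(flip) = sin^2(omega*t/2)."""
    c = np.cos(omega * t / 2) ** 2
    s = np.sin(omega * t / 2) ** 2
    return np.array([[c, s], [s, c]])

s, t = np.pi / 4, np.pi / 4
# A Markov process would require T(s) @ T(t) == T(s + t):
print(np.allclose(T(s) @ T(t), T(s + t)))  # -> False
```

Each row still sums to one (the matrices are stochastic), so the failure lies purely in the composition law, which is the essence of the non-Markovian behavior discussed above.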
Inflammatory markers as prognostic factors of survival in patients affected by hepatocellular carcinoma undergoing transarterial chemoembolization
Transarterial chemoembolization (TACE) is a good choice for hepatocellular carcinoma (HCC) treatment when surgery and liver transplantation are not feasible. Few studies have reported on the value of prognostic factors influencing survival after chemoembolization. In this study, we evaluated whether preoperative inflammatory markers such as the neutrophil-to-lymphocyte ratio (NLR) and the platelet-to-lymphocyte ratio (PLR) affected survival in our patients with hepatocellular carcinoma. Methods. We retrospectively evaluated a total of 72 patients with hepatocellular carcinoma who underwent TACE. We enrolled patients with different etiopathogeneses of hepatitis and histologically proven HCC not suitable for surgery. The overall study population was dichotomized into two groups according to the median NLR value and was also analyzed according to other prognostic factors. Results. The global median overall survival (OS) was 28 months. The OS in patients with a high NLR was statistically significantly shorter than that in patients with a low NLR. The following pretreatment variables were significantly associated with OS in univariate analyses: age, Child-Pugh score, BCLC stage, INR, and NLR. A high pretreatment NLR was an independently unfavorable factor for OS. Conclusion. The NLR could be considered a good prognostic factor of survival, useful to stratify patients who could benefit from TACE treatment.
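The NLR-based stratification described above is simple to state concretely: compute the ratio from absolute blood counts, then split the cohort at the median. A minimal sketch (function names and values are illustrative, not from the study):

```python
def nlr(neutrophils, lymphocytes):
    """Neutrophil-to-lymphocyte ratio from absolute blood counts."""
    return neutrophils / lymphocytes

def dichotomize_by_median(values):
    """Split patients into 'low'/'high' groups around the median NLR."""
    s = sorted(values)
    n = len(s)
    median = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    return [("high" if v > median else "low") for v in values], median

# Hypothetical pretreatment NLR values for four patients:
groups, med = dichotomize_by_median([2.0, 3.0, 5.0, 8.0])
print(med, groups)  # -> 4.0 ['low', 'low', 'high', 'high']
```

The two resulting groups are then compared by overall survival, e.g. with a log-rank test.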
Metabolic reprogramming promotes myogenesis during aging
Sarcopenia is the age-related progressive loss of skeletal muscle mass and strength, ultimately leading to poor physical performance. Impaired myogenesis contributes to the pathogenesis of sarcopenia, while mitochondrial dysfunction is thought to play a primary role in skeletal muscle loss during aging. Here we studied the link between myogenesis and metabolism. In particular, we analyzed the effect of the metabolic modulator trimetazidine (TMZ) on myogenesis in aging. We show that reprogramming the metabolism by TMZ treatment for 12 consecutive days stimulates myogenic gene expression in the skeletal muscle of 22-month-old mice. Our data also reveal that TMZ increases the levels of mitochondrial proteins and stimulates oxidative metabolism in aged muscles, a finding in line with our previous observations in cachectic mice. Moreover, we show that, besides TMZ, other types of metabolic modulators (i.e., 5-aminoimidazole-4-carboxamide ribofuranoside, AICAR) can also stimulate the differentiation of skeletal muscle progenitors in vitro. Overall, our results reveal that reprogramming the metabolism stimulates myogenesis while triggering mitochondrial protein synthesis in vivo during aging. Together with the previously reported ability of TMZ to increase muscle strength in aged mice, these new data suggest an interesting non-invasive therapeutic strategy that could contribute to improving muscle quality and neuromuscular communication in the elderly, and to counteracting sarcopenia.
Histological and immunohistochemical evaluation of mandibular bone tissue regeneration
The purpose of the study was to perform an immunohistochemical and histological evaluation of samples taken from different bone regeneration procedures in the atrophic human mandible. Thirty patients (15 men and 15 women, age range 35-60 years), non-smokers, with good general and oral health were recruited in this study and divided into three groups. The first group included patients who were treated with blood Concentration Growth Factors (bCGF), the second group included patients who were treated with a mixture of bCGF and autologous bone, while the third group of patients was treated with bCGF and tricalcium phosphate/hydroxyapatite (TCP-HA). Six months after the regenerative procedures, all patients underwent implant surgery, and a bone biopsy was carried out at the site of implant insertion. Each sample was histologically and immunohistochemically examined. Histological evaluation showed complete bone formation for group II, partial ossification for group I, and moderate ossification for group III. Immunohistochemical analysis demonstrated a statistically significant difference between the three groups, and the best clinical result was obtained with the mixture of bCGF and autologous bone.
Ammonia as a Carbon-Free Energy Carrier: NH3 Cracking to H2
In the energy transition from fossil fuels to renewables, hydrogen is a realistic alternative for achieving the decarbonization target. However, its chemical and physical properties make its storage and transport expensive. To ensure cost-effective use of H2 as an energy vector, other chemicals are receiving attention as H2 carriers. Among them, ammonia is the most promising candidate. The value chain of NH3 as a H2 carrier, considering long-distance ship transport, includes NH3 synthesis and storage at the loading terminal, NH3 storage at the unloading terminal, and its cracking to release H2. NH3 synthesis and cracking are the cost drivers of the value chain. Also, NH3 cracking at large scale is not a mature technology, and a significant effort has to be made to intensify the process as much as possible. In this respect, this work reviews the available technologies for NH3 cracking, critically analyzing them in view of scale-up to the industrial level.
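The cracking step discussed above follows the stoichiometry 2 NH3 -> N2 + 3 H2, which fixes the maximum hydrogen mass recoverable per unit of ammonia shipped. A quick stoichiometric sketch of that upper bound (complete conversion assumed, which real crackers only approach):

```python
# Hydrogen yield of complete NH3 cracking: 2 NH3 -> N2 + 3 H2.
M_NH3 = 17.031   # molar mass of ammonia, g/mol
M_H2 = 2.016     # molar mass of hydrogen, g/mol

# 3 mol H2 are released per 2 mol NH3, so the H2 mass fraction is:
h2_weight_fraction = (3 * M_H2) / (2 * M_NH3)
print(round(h2_weight_fraction, 3))  # -> 0.178
```

So roughly 178 g of H2 is recoverable per kilogram of NH3 at full conversion, which is the figure the cracking economics of the value chain has to be measured against.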