Electroweak boson production at small transverse momentum in hadron collisions
The resummation of double-logarithmic perturbative contributions produced by soft-gluon radiation (Sudakov resummation) has proved to be an important tool for enlarging the applications of perturbative QCD to a wider range of kinematical regions. In particular, a complete description of W and Z boson production at high-energy hadron colliders requires the resummation of large double logarithms that dominate the transverse momentum (p_T) distribution at small p_T. This can be performed either directly in transverse momentum space or in impact parameter (Fourier transform) b space. The b space method succeeds in resumming all the leading and sub-leading logarithmic terms, but does not allow a smooth transition to fixed-order dominance at high transverse momenta. In contrast, the p_T space approach experiences difficulties with resumming more sub-leading logarithms. This thesis concentrates on developing the p_T space formalism, which completely resums the first four towers of logarithms. The number of fully resummed towers is the same as for the b space method. The results are compared, both analytically and numerically, with the original b space result as well as with results of other p_T space methods. Parametrization of the non-perturbative effects in p_T space is discussed. Given recent Tevatron data on Z boson production, we find good agreement between the data and the theoretical predictions. Using the same formalism, the transverse momentum distributions are also calculated for W and Z boson production at the LHC. Finally, we discuss like-sign W pair production in the context of double parton scattering at the LHC.
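For orientation only (a standard textbook form, not quoted from the thesis), the b-space resummed cross section referred to above can be sketched schematically as

\[
\frac{d\sigma}{dp_T^2} \;\propto\; \int d^2b \; e^{\,i\,\vec{b}\cdot\vec{p}_T}\,
\exp\!\left[-\int_{b_0^2/b^2}^{Q^2}\frac{d\mu^2}{\mu^2}
\left(A\big(\alpha_s(\mu)\big)\,\ln\frac{Q^2}{\mu^2}+B\big(\alpha_s(\mu)\big)\right)\right],
\qquad b_0 = 2e^{-\gamma_E},
\]

where A and B are the usual perturbative coefficient functions and the parton densities and hard factor are omitted. The "towers of logarithms" counted in the abstract are the families of terms \( \alpha_s^n \ln^m(Q^2/p_T^2) \), \( m \le 2n-1 \), generated when this exponent is expanded order by order.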
Q_T Resummation in Transversely Polarized Drell-Yan Process
We calculate QCD corrections to the transversely polarized Drell-Yan process at a measured transverse momentum Q_T of the produced lepton pair in the dimensional regularization scheme. The Q_T distribution is discussed, resumming soft-gluon effects relevant for small Q_T.
Comment: 4 pages, 1 figure, contribution to proceedings of International Conference on QCD and Hadronic Physics, Beijing, June 16-20, 200
A case of a single intracranial vertebral artery and cerebral infarct
The vertebral arteries are commonly affected by anatomical variation. This variation ranges from slight asymmetry in arterial diameter between the right and left sides to complete absence of a vertebral artery on one side. Asymmetry in diameter is a common observation, although complete absence of the artery is rare. Herein, we report on a 79-year-old male anatomical donor who, upon brain removal, was found to have a single intracranial vertebral artery which was the sole source of the basilar artery. During dissection of the neck, both right and left vertebral arteries were identified arising from the subclavian arteries. The vertebral arteries were dissected from the transverse foramina and followed into the skull. The right vertebral artery terminated by supplying the spinal cord, consistent with the distribution of the posterior spinal artery. Such vascular anomalies are clinically significant, as they may lead to abnormal patterns of sensory-motor deficiencies in stroke and are at risk of iatrogenic injury during surgical procedures
High speed development of new chemical synthesis and materials at molecular-level: Methods and approaches
The recent success of advanced computational chemistry, for example in the prediction of chemical reactivity and materials properties, reflects its reputation as a valuable and widely accepted means to tackle problems in academia. The development of new simulation methods and new computer architectures enables an enormous improvement in the productivity of research and development of new chemical synthesis and materials. These advances can be achieved with less time, material, and staff compared to traditional lab experiments. In particular, approaches like virtual high-throughput screening (vHTS) are highly scalable and allow fast and deep insight into promising new system modifications. Consequently, the time to market and the risk of new product development can be decreased significantly. These characteristics have paved the way for its successful application in industry today.
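As a minimal, hypothetical sketch of why vHTS scales so well (the candidate library and the scoring function score_candidate are placeholders for a real simulation backend, not anything described in the abstract):

# Minimal sketch of a virtual high-throughput screening (vHTS) loop.
# score_candidate stands in for an actual property calculation (e.g. a
# force-field or electronic-structure run); here it is a dummy so the
# control flow is runnable.
from concurrent.futures import ProcessPoolExecutor

def score_candidate(candidate: str) -> float:
    """Placeholder for a computed figure of merit (e.g. predicted reactivity)."""
    return float(len(candidate))  # dummy metric

def screen(candidates: list[str], top_n: int = 10) -> list[tuple[str, float]]:
    # Each candidate is scored independently, so the screen is embarrassingly
    # parallel -- the property that makes vHTS highly scalable.
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(score_candidate, candidates))
    return sorted(zip(candidates, scores), key=lambda x: x[1], reverse=True)[:top_n]

if __name__ == "__main__":
    library = ["mol_A", "mol_B", "mol_C"]  # stand-in for a candidate library
    print(screen(library, top_n=2))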
Principles of Explanatory Debugging to personalize interactive machine learning
How can end users efficiently influence the predictions that machine learning systems make on their behalf? This paper presents Explanatory Debugging, an approach in which the system explains to users how it made each of its predictions, and the user then explains any necessary corrections back to the learning system. We present the principles underlying this approach and a prototype instantiating it. An empirical evaluation shows that Explanatory Debugging increased participants' understanding of the learning system by 52% and allowed participants to correct its mistakes up to twice as efficiently as participants using a traditional learning system
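To make the two-way loop concrete, here is a minimal illustrative sketch (not the paper's prototype; the class and method names are invented): a bag-of-words classifier that explains which words drove a prediction and lets the user "explain back" by re-weighting a word.

# Illustrative sketch of an explanatory-debugging style loop.
from collections import defaultdict

class ExplainableClassifier:
    def __init__(self):
        self.weights = defaultdict(dict)  # word -> {label: weight}

    def fit(self, docs, labels):
        # Simple count-based weights; stands in for a real learner.
        for doc, label in zip(docs, labels):
            for word in doc.split():
                self.weights[word][label] = self.weights[word].get(label, 0.0) + 1.0

    def predict_with_explanation(self, doc, labels=("work", "personal")):
        scores = {l: 0.0 for l in labels}
        contributions = []  # the system's explanation: per-word weights
        for word in doc.split():
            for l in labels:
                scores[l] += self.weights[word].get(l, 0.0)
            contributions.append((word, dict(self.weights[word])))
        return max(scores, key=scores.get), contributions

    def correct(self, word, label, new_weight):
        # The user explains a correction back: this word should count
        # more (or less) toward this label.
        self.weights[word][label] = new_weight

The loop is: the system predicts and shows its per-word reasoning, the user adjusts a weight via correct(), and the next prediction reflects that correction.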
Explanatory debugging: Supporting end-user debugging of machine-learned programs
Many machine-learning algorithms learn rules of behavior from individual end users, such as task-oriented desktop organizers and handwriting recognizers. These rules form a “program” that tells the computer what to do when future inputs arrive. Little research has explored how an end user can debug these programs when they make mistakes. We present our progress toward enabling end users to debug these learned programs via a Natural Programming methodology. We began with a formative study exploring how users reason about and correct a text-classification program. From the results, we derived and prototyped a concept based on “explanatory debugging”, then empirically evaluated it. Our results contribute methods for exposing a learned program's logic to end users and for eliciting user corrections to improve the program's predictions
Elucidation of role of graphene in catalytic designs for electroreduction of oxygen
Graphene is, in principle, a promising material for consideration as a component (support, active site) of electrocatalytic materials, particularly with respect to the reduction of oxygen, an electrode reaction of importance to low-temperature fuel cell technology. Different concepts for its utilization, including nanostructuring, doping, admixing, preconditioning, modification or functionalization of various graphene-based systems for the catalytic electroreduction of oxygen, are elucidated, and important strategies to enhance the systems' overall activity and stability are discussed.
Coherency and time lag analyses between MODIS vegetation indices and climate across forests and grasslands in the European temperate zone
Identifying the climate-induced variability in the condition of vegetation is particularly important in the context of recent climate change and plants' role in its mitigation. In this paper, we present the coherence and time lags in the spectral response of three individual vegetation types in the European temperate zone to the influencing meteorological factors in the period 2002–2022. Vegetation condition in broadleaved forest, coniferous forest and pastures was measured with monthly anomalies of two spectral indices – the normalised difference vegetation index (NDVI) and the enhanced vegetation index (EVI). As meteorological elements we used monthly anomalies of temperature (T), precipitation (P), vapour pressure deficit (VPD) and evapotranspiration (ETo), together with the teleconnection indices North Atlantic Oscillation (NAO) and North Sea Caspian Pattern (NCP). Periodicity in the time series was assessed using the wavelet transform, but no significant intra- or interannual cycles were detected in either the vegetation (NDVI and EVI) or the meteorological variables. In turn, the coherence between NDVI and EVI and the meteorological elements was described using wavelet coherence and Pearson's linear correlation with time lag. In the European temperate zone analysed in this study, NAO produces strong coherence mostly for forests in a circa 1-year band and weaker coherence in a circa 3-year band. For pastures these interannual patterns are hardly recognisable. The strongest relationships occur between vegetation condition and T and ETo – they show high coherence in both forests and pastures, with significant coherence in the 8–16-month (ca. 1-year) and 20–32-month (ca. 2-year) bands. More time-lagged significant correlations between the vegetation indices and T occur for forests than for pastures, suggesting a significant lag in the forests' response to changes in T.
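As a minimal sketch of the lagged-correlation idea (illustrative only, not the authors' code; the input arrays are assumed to be equal-length monthly anomaly series):

# Time-lagged Pearson correlation between a vegetation-index anomaly series
# (e.g. monthly NDVI anomalies) and a meteorological anomaly series (e.g. T).
import numpy as np
from scipy.stats import pearsonr

def lagged_correlation(veg_anom, met_anom, max_lag=6):
    """Correlate vegetation anomalies with the driver shifted back by 0..max_lag
    months; a peak at lag k suggests vegetation responds about k months later."""
    veg = np.asarray(veg_anom, dtype=float)
    met = np.asarray(met_anom, dtype=float)
    results = {}
    for lag in range(max_lag + 1):
        if lag == 0:
            r, p = pearsonr(veg, met)
        else:
            r, p = pearsonr(veg[lag:], met[:-lag])  # vegetation lags the driver
        results[lag] = (r, p)
    return results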
Too much, too little, or just right? Ways explanations impact end users' mental models
Research is emerging on how end users can correct mistakes their intelligent agents make, but before users can correctly "debug" an intelligent agent, they need some degree of understanding of how it works. In this paper we consider ways intelligent agents should explain themselves to end users, especially focusing on how the soundness and completeness of the explanations impacts the fidelity of end users' mental models. Our findings suggest that completeness is more important than soundness: increasing completeness via certain information types helped participants' mental models and, surprisingly, their perception of the cost/benefit tradeoff of attending to the explanations. We also found that oversimplification, as per many commercial agents, can be a problem: when soundness was very low, participants experienced more mental demand and lost trust in the explanations, thereby reducing the likelihood that users will pay attention to such explanations at all