Renormalization Group Improved Exponentiation of Soft Gluons in QCD
We extend the methods of Yennie, Frautschi and Suura to QCD for the summation
of soft gluon effects in which infrared singularities are cancelled to all
orders in $\alpha_s$. An explicit formula for the respective
renormalization-group improved exponentiated cross section is obtained for
$q + \bar{q}' \to q + \bar{q}' + n(G)$ at SSC energies. Possible applications
are discussed.

Comment: 7 pages (1 figure not included, available on request), LaTeX,
UTHEP-93-040
Bounding inconsistency using a novel threshold metric for dead reckoning update packet generation
Human-to-human interaction across distributed applications requires that sufficient consistency be maintained among participants in the face of network characteristics such as latency and limited bandwidth. The level of inconsistency arising from the network is proportional to the network delay, and thus a function of bandwidth consumption. Distributed simulation has often used a bandwidth reduction technique known as dead reckoning that combines approximation and estimation in the communication of entity movement to reduce network traffic, and thus improve consistency. However, unless carefully tuned to application and network characteristics, such an approach can introduce more inconsistency than it avoids. The key tuning metric is the distance threshold. This paper questions the suitability of the standard distance threshold as a metric for use in the dead reckoning scheme. Using a model relating entity path curvature and inconsistency, a major performance-related limitation of the distance threshold technique is highlighted. We then propose an alternative time-space threshold criterion. The time-space threshold is demonstrated, through simulation, to perform better for low-curvature movement. However, it too has a limitation. Based on this, we further propose a novel hybrid scheme. Through simulation and live trials, this scheme is shown to perform well across a range of curvature values, and places bounds on both the spatial and absolute inconsistency arising from dead reckoning.
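The standard scheme the abstract critiques can be sketched in a few lines: the sender runs the same extrapolation model as the receivers and emits an update packet only when the true position drifts from the dead-reckoned one by more than a distance threshold. This is a minimal illustration, not the paper's implementation; the function names, dimensions, and threshold value are assumptions.

```python
import math

def extrapolate(pos, vel, dt):
    """Dead-reckoned position: last reported position plus velocity * elapsed time."""
    return tuple(p + v * dt for p, v in zip(pos, vel))

def should_update_distance(true_pos, model_pos, threshold):
    """Standard distance-threshold criterion: generate an update packet when the
    spatial error between the true and dead-reckoned positions exceeds the threshold."""
    return math.dist(true_pos, model_pos) > threshold

# Example: entity last reported at (0, 0) with velocity (1, 0); 2 s later its
# true position is (2, 0.6), while receivers extrapolate it to (2, 0).
model = extrapolate((0.0, 0.0), (1.0, 0.0), 2.0)       # (2.0, 0.0)
print(should_update_distance((2.0, 0.6), model, 0.5))  # True: error 0.6 > 0.5
```

The limitation the paper models follows directly: the criterion is blind to how long an error persists, so a sub-threshold error on a low-curvature path can last indefinitely without ever triggering an update.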
Exploring the use of local consistency measures as thresholds for dead reckoning update packet generation
Human-to-human interaction across distributed applications requires that sufficient consistency be maintained among participants in the face of network characteristics such as latency and limited bandwidth. Techniques and approaches for reducing bandwidth usage can minimize network delays by reducing the network traffic and therefore better exploiting available bandwidth. However, these approaches induce inconsistencies within the level of human perception. Dead reckoning is a well-known technique for reducing the number of update packets transmitted between participating nodes. It employs a distance threshold for deciding when to generate update packets. This paper questions the use of such a distance threshold in the context of absolute consistency and it highlights a major drawback with such a technique. An alternative threshold criterion based on time and distance is examined and it is compared to the distance-only threshold. A drawback with this proposed technique is also identified and a hybrid threshold criterion is then proposed. However, the trade-off between spatial and temporal inconsistency remains.
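The time-and-distance criterion examined here can be contrasted with the distance-only threshold by accumulating inconsistency over time. The sketch below assumes one plausible form of such a measure: spatial error integrated over elapsed time, with an update triggered when the accumulated value crosses a threshold. The class name, the accumulation rule, and the reset-on-send behaviour are illustrative assumptions, not the paper's definition.

```python
import math

class TimeSpaceMonitor:
    """Sketch of a time-space threshold criterion: rather than reacting to the
    instantaneous spatial error, accumulate (error * elapsed time) and generate
    an update packet once the accumulated inconsistency exceeds the threshold."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.accumulated = 0.0

    def observe(self, true_pos, model_pos, dt):
        """Sample the current error; return True if an update should be sent."""
        self.accumulated += math.dist(true_pos, model_pos) * dt
        if self.accumulated > self.threshold:
            self.accumulated = 0.0  # reset after generating an update packet
            return True
        return False

mon = TimeSpaceMonitor(threshold=1.0)
# A small persistent error (0.3 units) sampled every 0.5 s crosses the
# time-space threshold on the 7th sample, even though it would never
# trigger a 0.5-unit distance-only threshold.
ticks = [mon.observe((0.3, 0.0), (0.0, 0.0), 0.5) for _ in range(8)]
print(ticks)
```

This illustrates why the abstract frames the result as a trade-off: the time-space criterion bounds how long an error may persist, while the distance-only criterion bounds how large it may grow, and neither bounds both on its own.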
Radiative corrections in processes at the SSC
We discuss radiative corrections for interactions in the SSC environment.
Based on the theory of Yennie, Frautschi and Suura, we develop appropriate
Monte Carlo event generators to compute the background electromagnetic
radiation. Our results indicate that multiple-photon effects must be taken into
account in the study of SSC physics such as Higgs decay.

Comment: UTHEP-92-0901, 15 pages (incl. 3 figures), LaTeX (Talk presented at
the XXXII Cracow School of Theoretical Physics, Zakopane, June 1992)