51,503 research outputs found
Bounding inconsistency using a novel threshold metric for dead reckoning update packet generation
Human-to-human interaction across distributed applications requires that sufficient consistency be maintained among participants in the face of network characteristics such as latency and limited bandwidth. The level of inconsistency arising from the network is proportional to the network delay, and thus a function of bandwidth consumption. Distributed simulation has often used a bandwidth reduction technique known as dead reckoning that combines approximation and estimation in the communication of entity movement to reduce network traffic, and thus improve consistency. However, unless carefully tuned to application and network characteristics, such an approach can introduce more inconsistency than it avoids. The key tuning metric is the distance threshold. This paper questions the suitability of the standard distance threshold as a metric for use in the dead reckoning scheme. Using a model relating entity path curvature and inconsistency, a major performance-related limitation of the distance threshold technique is highlighted. We then propose an alternative time-space threshold criterion. The time-space threshold is demonstrated, through simulation, to perform better for low curvature movement. However, it too has a limitation. Based on this, we further propose a novel hybrid scheme. Through simulation and live trials, this scheme is shown to perform well across a range of curvature values, and places bounds on both the spatial and absolute inconsistency arising from dead reckoning.
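The standard distance-threshold scheme the abstract critiques can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the 2-D state layout, and the first-order (constant-velocity) extrapolation model are assumptions for the sketch.

```python
import math

def dead_reckon(samples, dt, threshold):
    """Replay (x, y, vx, vy) samples taken every `dt` seconds and return
    the sample indices at which a distance-threshold dead reckoning
    scheme would emit update packets.

    Remote hosts extrapolate linearly from the last update they received;
    the local host sends a new update once its true position deviates
    from that extrapolation by more than `threshold` (the standard
    distance criterion questioned in the abstract).
    """
    updates = [0]                       # the first sample is always sent
    lx, ly, lvx, lvy = samples[0]       # state carried by the last update
    elapsed = 0.0
    for i, (x, y, vx, vy) in enumerate(samples[1:], start=1):
        elapsed += dt
        px = lx + lvx * elapsed         # remote's linear prediction
        py = ly + lvy * elapsed
        if math.hypot(x - px, y - py) > threshold:
            updates.append(i)
            lx, ly, lvx, lvy = x, y, vx, vy
            elapsed = 0.0
    return updates
```

For straight-line motion at constant velocity the linear prediction is exact and no further updates are generated, while a curving path forces periodic updates; this curvature dependence is what motivates the time-space and hybrid criteria proposed in the paper.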
Leakage of waves from coronal loops by wave tunneling
To better understand the decay of vertically polarised fast kink modes of coronal loops by the mechanism of wave tunneling, simulations are performed of fast kink modes in straight flux slabs with Alfvén speed profiles that include a tunneling region. The decay rates are found to be determined by the mode number of the trapped mode and the thickness of the tunneling region. Two analytical models are suggested to explain the observed decay. The first is an extension of the work of Roberts (1981, Sol. Phys., 69, 39) to include a finite thickness tunneling region, and the second is a simpler model which yields an analytical solution for the relationship between decay rate, period and the thickness of the tunneling region. The decay rates for these straight slabs are found to be slower than in observations and those found in a previous paper on the subject by Brady & Arber (2005, A&A, 438, 733) using curved flux slabs. It is found that the difference between the straight slabs used here and the curved slabs used in Brady & Arber (2005, A&A, 438, 733) can be represented as a geometric correction to the decay rate.
Exploring the use of local consistency measures as thresholds for dead reckoning update packet generation
Human-to-human interaction across distributed applications requires that sufficient consistency be maintained among participants in the face of network characteristics such as latency and limited bandwidth. Techniques and approaches for reducing bandwidth usage can minimize network delays by reducing the network traffic and therefore better exploiting available bandwidth. However, these approaches can induce inconsistencies at the level of human perception. Dead reckoning is a well-known technique for reducing the number of update packets transmitted between participating nodes. It employs a distance threshold for deciding when to generate update packets. This paper questions the use of such a distance threshold in the context of absolute consistency and highlights a major drawback of that technique. An alternative threshold criterion based on time and distance is examined and compared to the distance-only threshold. A drawback with this proposed technique is also identified, and a hybrid threshold criterion is then proposed. However, the trade-off between spatial and temporal inconsistency remains.
A zonal computational procedure adapted to the optimization of two-dimensional thrust augmentor inlets
A viscous-inviscid interaction methodology based on a zonal description of the flowfield is developed as a means of predicting the performance of two-dimensional thrust augmenting ejectors. An inviscid zone comprising the irrotational flow about the device is patched together with a viscous zone containing the turbulent mixing flow. The inviscid region is computed by a higher order panel method, while an integral method is used for the description of the viscous part. A non-linear, constrained optimization study is undertaken for the design of the inlet region. In this study, the viscous-inviscid analysis is complemented with a boundary layer calculation to account for flow separation from the walls of the inlet region. The thrust-based Reynolds number as well as the free-stream velocity are shown to be important parameters in the design of a thrust augmentor inlet.
Research into fundamental phenomena associated with spacecraft electrochemical devices, calorimetry of nickel-cadmium cells Progress report, 1 Oct. - 31 Dec. 1967
Calorimetry of nickel cadmium cells for spacecraft electrochemical system
Mentoring to reduce antisocial behaviour in childhood
The effects of social interventions need to be examined in real life situations as well as studie
Dyson-Schwinger Equations - aspects of the pion
The contemporary use of Dyson-Schwinger equations in hadronic physics is exemplified via applications to the calculation of pseudoscalar meson masses, and inclusive deep inelastic scattering with a determination of the pion's valence-quark distribution function. Comment: 4 pages. Contribution to the Proceedings of "DPF 2000," the Meeting of the Division of Particles and Fields of the American Physical Society, August 9-12, 2000, Department of Physics, the Ohio State University, Columbus, Ohio