
    Bounding inconsistency using a novel threshold metric for dead reckoning update packet generation

    Human-to-human interaction across distributed applications requires that sufficient consistency be maintained among participants in the face of network characteristics such as latency and limited bandwidth. The level of inconsistency arising from the network is proportional to the network delay, and thus a function of bandwidth consumption. Distributed simulation has often used a bandwidth reduction technique known as dead reckoning that combines approximation and estimation in the communication of entity movement to reduce network traffic, and thus improve consistency. However, unless carefully tuned to application and network characteristics, such an approach can introduce more inconsistency than it avoids. The key tuning metric is the distance threshold. This paper questions the suitability of the standard distance threshold as a metric for use in the dead reckoning scheme. Using a model relating entity path curvature and inconsistency, a major performance-related limitation of the distance threshold technique is highlighted. We then propose an alternative time-space threshold criterion. The time-space threshold is demonstrated, through simulation, to perform better for low curvature movement. However, it too has a limitation. Based on this, we further propose a novel hybrid scheme. Through simulation and live trials, this scheme is shown to perform well across a range of curvature values, and places bounds on both the spatial and absolute inconsistency arising from dead reckoning.
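
    The contrast between the distance (spatial) threshold and the time-space threshold can be illustrated with a short sketch. In the Python below, the time-space measure is assumed to be the spatial error integrated over the time since the last update, and the predictor is simple first-order extrapolation; the names, units and formulas are illustrative assumptions rather than the exact definitions used in the paper. Checking both conditions together corresponds in spirit to the hybrid scheme described above; spatial-only or time-space-only variants use just one of the two tests.

```python
import numpy as np

class DeadReckoner:
    """Decides when an entity state update packet should be sent.

    Assumption: the time-space measure is the spatial error integrated over
    the time elapsed since the last update; thresholds and units are
    illustrative only.
    """

    def __init__(self, spatial_threshold, time_space_threshold, dt):
        self.spatial_threshold = spatial_threshold        # metres
        self.time_space_threshold = time_space_threshold  # metre-seconds
        self.dt = dt                                      # simulation step (seconds)
        self.accumulated_error = 0.0                      # error integral since last update

    def predict(self, last_pos, last_vel, elapsed):
        # First-order dead reckoning: extrapolate from the last transmitted state.
        return last_pos + last_vel * elapsed

    def needs_update(self, actual_pos, predicted_pos):
        spatial_error = np.linalg.norm(actual_pos - predicted_pos)
        self.accumulated_error += spatial_error * self.dt
        if spatial_error > self.spatial_threshold:
            return True   # distance threshold exceeded: bounds spatial inconsistency
        if self.accumulated_error > self.time_space_threshold:
            return True   # time-space threshold exceeded: bounds absolute inconsistency
        return False

    def packet_sent(self):
        # Reset the accumulated error whenever an update packet is transmitted.
        self.accumulated_error = 0.0
```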

    A Novel Convergence Algorithm for the Hybrid Strategy Model Packet Reduction Technique

    Several approaches exist for maintaining consistency in Distributed Interactive Applications. Among these are techniques such as dead reckoning which use prediction algorithms to approximate actual user behaviour and thus reduce the number of update packets required to maintain spatial consistency. The Hybrid Strategy Model operates in a similar way, exploiting long-term patterns in user behaviour whenever possible. Otherwise it simply adopts a short-term model. A major problem with these techniques is the reconstruction of the local behaviour at a remote node. Using the modelled dynamics directly can result in unnatural and sudden jumps in position where updates occur. Convergence algorithms are thus required to smoothly reconstruct remote behaviour from discontinuous samples of the actual local behaviour. This paper makes two important contributions. Primarily, it proposes a novel convergence approach for the Hybrid Strategy Model. Secondly, and more fundamentally, it exposes a lack of suitable and quantifiable measures of different convergence techniques. In this paper the standard smoothing algorithm employed by DIS is used as a benchmark for comparison purposes.
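
    As a rough illustration of what a convergence algorithm does, the sketch below blends the currently displayed remote position onto a newly received dead-reckoned path over a fixed number of frames instead of snapping to it. The linear blend and the frame count are assumptions made here for illustration; they stand in for, but are not, the DIS smoothing algorithm used as the benchmark above.

```python
import numpy as np

def converge_linear(displayed_pos, target_path, steps):
    """Blend from the currently rendered position onto the predicted path
    over `steps` frames, avoiding a sudden jump when an update arrives.

    target_path: list of predicted positions for the next `steps` frames.
    """
    blended = []
    for i in range(1, steps + 1):
        alpha = i / steps                                  # blending weight, 0 -> 1
        blended.append((1.0 - alpha) * displayed_pos + alpha * target_path[i - 1])
    return blended

# Usage: smooth correction after an update packet arrives at a remote host.
displayed = np.array([10.0, 5.0])                            # where the entity is currently drawn
path = [np.array([11.0 + 0.5 * k, 5.2]) for k in range(10)]  # new dead-reckoned path
smoothed = converge_linear(displayed, path, steps=10)
```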

    Dynamic Hybrid Strategy Models for Networked Multiplayer Games

    Two of the primary factors in the development of networked multiplayer computer games are network latency and network bandwidth. Reducing the effects of network latency helps maintain game-state fidelity, while reducing network bandwidth usage increases the scalability of the game to support more players. The current technique to address these issues is to have each player locally simulate remote objects (e.g. other players). This is known as dead reckoning. Provided the local simulations are accurate to within a given tolerance, dead reckoning reduces the amount of information required to be transmitted between players. This paper presents an extension to the recently proposed Hybrid Strategy Model (HSM) technique, known as the Dynamic Hybrid Strategy Model (DHSM). By dynamically switching between models of user behaviour, the DHSM attempts to improve the prediction capability of the local simulations, allowing them to stay within a given tolerance for a longer period of time. This can lead to further reductions in the amount of information required to be transmitted. Presented results for the case of a simple first-person shooter (FPS) game demonstrate the validity of the DHSM approach over dead reckoning, leading to a reduction in the number of state update packets sent and indicating significant potential for network traffic reduction in various multiplayer games/simulations.
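
    A minimal sketch of dynamic model switching is given below: each candidate behaviour model is scored over a short window of recent movement, and the model with the lowest mean prediction error is adopted. The two candidate models (constant-velocity dead reckoning and a hypothetical move-to-waypoint strategy) and the error-window scoring rule are illustrative assumptions, not the DHSM switching criterion itself.

```python
import numpy as np

def constant_velocity(pos, vel, dt):
    # Short-term model: first-order dead reckoning.
    return pos + vel * dt

def move_to_waypoint(pos, vel, dt, waypoint=np.array([50.0, 50.0]), speed=5.0):
    # Hypothetical long-term strategy model: head towards a fixed goal position.
    direction = waypoint - pos
    dist = np.linalg.norm(direction)
    return pos if dist == 0 else pos + (direction / dist) * speed * dt

def select_model(history, dt=0.05):
    """Return the name of the model that best matches recent movement.

    history: list of (position, velocity) samples, oldest first.
    """
    models = {"short_term": constant_velocity, "long_term": move_to_waypoint}
    scores = {}
    for name, model in models.items():
        errors = [np.linalg.norm(model(pos, vel, dt) - next_pos)
                  for (pos, vel), (next_pos, _) in zip(history[:-1], history[1:])]
        scores[name] = np.mean(errors)
    # Switch to whichever model tracked the recent behaviour most closely.
    return min(scores, key=scores.get)
```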

    Statistical Determination of Hybrid Threshold Parameters for Entity State Update Mechanisms in Distributed Interactive Applications

    Collaboration within a Distributed Interactive Application (DIA) requires that a high level of consistency be maintained between remote hosts. However, this can require large amounts of network resources, which can negatively affect the scalability of the application, and also increase network latency. Predictive models, such as dead reckoning, provide a sufficient level of consistency, whilst reducing network requirements. Dead reckoning traditionally uses a spatial error threshold metric to operate. In previous work, it was shown how the use of the spatial threshold could result in potentially unbounded local absolute inconsistency. To remedy this, a novel time-space threshold was proposed, that placed bounds on local absolute inconsistency. However, use of the time-space threshold could result in unacceptably large spatial inconsistency. A hybrid approach that combined both error threshold measures was then shown to place bounds on both levels of inconsistency. However, choosing suitable threshold values for use within the hybrid scheme has been problematic, as no direct comparisons can be made between the two threshold metrics. In this paper, a novel comparison scheme is proposed. Under this approach, an error threshold look-up table is generated, based on entity speed and equivalent inconsistency measures. Using this look-up table, it is shown how the performance of comparable thresholds is equal on average, from the point of view of network packet generation. These error thresholds are then employed in a hybrid threshold scheme, which is shown to improve overall consistency in comparison to the previous solution of simply using numerically equal threshold values.
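
    The comparison scheme can be pictured as a table-building loop: for each entity speed, find the time-space threshold whose average packet generation matches that of a reference spatial threshold. In the sketch below, simulate_packets is an assumed stand-in for the simulation runs described in the paper; its name and interface are not taken from the paper itself.

```python
def build_threshold_table(spatial_threshold, speeds, candidate_ts_thresholds, simulate_packets):
    """Map each entity speed to the time-space threshold that generates, on
    average, the same number of update packets as the given spatial threshold.

    simulate_packets(metric, threshold, speed) -> mean packets per run; this
    callable is an assumed stand-in for a full simulation.
    """
    table = {}
    for speed in speeds:
        reference = simulate_packets("spatial", spatial_threshold, speed)
        # Pick the candidate whose packet count is closest to the reference.
        table[speed] = min(
            candidate_ts_thresholds,
            key=lambda ts: abs(simulate_packets("time-space", ts, speed) - reference),
        )
    return table
```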

    Examining the Effects of Time-Space Measures on the Hybrid Strategy Model in Networked Virtual Environments

    Scalability is an important issue in the design of networked virtual environments (NVEs). In order to achieve scalability, it is essential to minimise the network traffic required to maintain overall consistency in the NVE. A popular method of achieving this is via entity behaviour prediction mechanisms, such as dead reckoning and the hybrid strategy model (HSM). Typically, the performance of such mechanisms is rated by the number of network packets they generate. However, it is also important that their impact on overall consistency is investigated. Absolute consistency is the degree to which different views of an NVE on remote hosts correspond. In previous work, it was shown that the use of a spatial threshold with dead reckoning can result in unbounded local absolute inconsistency. A solution that employed a time-space error threshold measure was shown to remedy this issue. In this paper, the scope of the time-space measure is extended to include the HSM. It is shown how the HSM can also result in unbounded local absolute inconsistency. A solution that once again incorporates the time-space threshold is examined. However, this approach results in a significant increase in network traffic. To resolve this, a novel extension to the HSM algorithm is presented, which is demonstrated to reduce network traffic, whilst still maintaining a low level of local absolute inconsistency.

    Exploring the Effect of Curvature on the Consistency of Dead Reckoned Paths for Different Error Threshold Metrics

    Dead reckoning is widely employed as an entity update packet reduction technique in Distributed Interactive Applications (DIAs). Such techniques reduce network bandwidth consumption and thus limit the effects of network latency on the consistency of networked simulations. A key component of the dead reckoning method is the underlying error threshold metric, as this directly determines when an entity update packet is to be sent between local and remote users. The most common metric is the spatial threshold, which is simply based on the distance between a local user’s actual position and their predicted position. Other, recently proposed, metrics include the time-space threshold and the hybrid threshold, both of which are summarised within. This paper investigates the issue of user movement in relation to dead reckoning and each of the threshold metrics. In particular the relationship between the curvature of movement, the various threshold metrics and absolute consistency is studied. Experimental live trials across the Internet allow a comparative analysis of how users behave when different threshold metrics are used with varying degrees of curvature. The presented results provide justification for the use of a hybrid threshold approach when dead reckoning is employed in DIAs.

    Exploring the Spatial Density of Strategy Models in a Realistic Distributed Interactive Application

    As Distributed Interactive Applications (DIAs) become increasingly prominent in the video game industry they must scale to accommodate progressively more users and maintain a globally consistent worldview. However, network constraints, such as bandwidth, limit the amount of communication allowed between users. Several methods of reducing network communication packets, while maintaining consistency, exist. These include dead reckoning and the hybrid strategy-based modelling approach. This latter method combines a short-term model such as dead reckoning with a long-term strategy model of user behaviour. By employing the strategy that most closely represents user behaviour, a reduction in the number of network packets that must be transmitted to maintain consistency has been shown. In this paper a novel method for constructing multiple long-term strategies using dead reckoning and polygons is described. Furthermore, the algorithms are implemented in an industry-proven game engine known as Torque. A series of experiments are executed to investigate the effects of varying the spatial density of strategy models on the number of packets that need to be transmitted to maintain the global consistency of the DIA. The results show that increasing the spatial density of strategy models allows a higher consistency to be achieved with fewer packets using the hybrid strategy-based model than with pure dead reckoning. In some cases, the hybrid strategy-based model completely replaces dead reckoning as a means of communicating updates.
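
    One way to picture a polygon-based arrangement of strategy models is a simple region lookup: the environment is partitioned into polygonal regions, each associated with a long-term strategy, and dead reckoning is used wherever no region applies. The sketch below is an illustrative reading of that idea; the region names and the use of matplotlib's point-in-polygon test are assumptions, and none of the Torque implementation details are reproduced. Increasing the spatial density of strategy models corresponds to covering the map with more, smaller polygons in this lookup.

```python
import numpy as np
from matplotlib.path import Path  # point-in-polygon test

def strategy_for_position(position, regions):
    """Return the strategy associated with the polygonal region containing
    `position`, or fall back to dead reckoning if no region contains it.

    regions: list of (polygon_vertices, strategy_name) pairs.
    """
    for vertices, strategy in regions:
        if Path(vertices).contains_point(position):
            return strategy
    return "dead_reckoning"

# Usage: two hypothetical strategy regions in a game map.
regions = [
    ([(0, 0), (20, 0), (20, 20), (0, 20)], "move_to_flag"),
    ([(20, 0), (40, 0), (40, 20), (20, 20)], "defend_base"),
]
print(strategy_for_position(np.array([5.0, 5.0]), regions))   # -> move_to_flag
print(strategy_for_position(np.array([55.0, 5.0]), regions))  # -> dead_reckoning
```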

    Does Reducing Packet Transmission Rates Help to Improve Consistency within Distributed Interactive Applications?

    Networked games are an important class of distributed systems. In order for such applications to be successful, it is important that a sufficient level of consistency is maintained. To achieve this, a high level of network traffic is often required. However, this can cause an increase in network latency due to overloaded network hardware, which, ironically, can have a negative impact on consistency. Entity state prediction techniques aim to combat this effect by reducing network traffic. Although much work has focused on developing predictive schemes, there has been little work to date on the analysis of their true impact on the consistency of the system overall. In this paper, we identify an important performance-related characteristic of packet reduction schemes. It is demonstrated that there exists an optimal packet transmission region. Increasing or decreasing network traffic above or below this level negatively impacts on consistency. Based on this characteristic, it is proposed that predictive schemes exploit this optimal point in order to maximise consistency by efficiently utilising the available resources.