    DIMACS Series in Discrete Mathematics and Theoretical Computer Science
    Stochastic Modeling of a Single TCP/IP Session over a Random Loss Channel

    Abstract. In this paper, we present an analytical framework for modeling the performance of a single TCP session in the presence of random packet loss. This framework may be applicable to communication channels that cause random packet loss, modeled by appropriate statistics of the inter-loss duration. The analytical model is shown to predict throughput for LANs and WANs (low and high bandwidth-delay products) with reasonable accuracy, as measured against the throughput obtained by simulation. Random loss is found to severely affect network throughput; higher-speed channels are more vulnerable to random loss than slower channels, especially at moderate to high loss rates.
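
    The paper's own analytical framework is not reproduced here. As a rough illustration of the qualitative claim above, the sketch below uses the widely cited square-root loss approximation for steady-state TCP throughput, throughput ~ C * MSS / (RTT * sqrt(p)) (Mathis et al.), which is an assumption standing in for the paper's model; the link speeds, RTTs, and loss rates are likewise illustrative values, not data from the paper.

        import math

        def tcp_throughput_sqrt(mss_bytes: float, rtt_s: float, loss_rate: float,
                                c: float = math.sqrt(3.0 / 2.0)) -> float:
            """Square-root loss approximation: bytes/s ~ C * MSS / (RTT * sqrt(p))."""
            return c * mss_bytes / (rtt_s * math.sqrt(loss_rate))

        if __name__ == "__main__":
            mss = 1460.0  # bytes per segment (illustrative)
            # (link capacity in bytes/s, round-trip time in seconds) -- assumed values
            links = {
                "LAN  (10 Mb/s, RTT 2 ms)":  (10e6 / 8, 0.002),
                "WAN (155 Mb/s, RTT 60 ms)": (155e6 / 8, 0.060),
            }
            for name, (capacity_Bps, rtt) in links.items():
                print(name)
                for p in (1e-4, 1e-3, 1e-2):
                    # Cap the loss-limited estimate at the link capacity.
                    bw = min(tcp_throughput_sqrt(mss, rtt, p), capacity_Bps)
                    print(f"  loss {p:.0e}: {8 * bw / 1e6:6.2f} Mb/s "
                          f"({100 * bw / capacity_Bps:5.1f}% of link capacity)")

    Under these assumed parameters, the low bandwidth-delay-product LAN path stays at link capacity across the sampled loss rates, while the high bandwidth-delay-product WAN path falls to a small fraction of its capacity, consistent with the abstract's observation that higher-speed channels are more vulnerable to random loss.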