
Algorithm for Network Delay Estimation Based on End-to-End Data Moment

Abstract

Most existing delay tomography algorithms assume a discrete delay model, which makes them inefficient. This paper proposes a continuous delay estimation algorithm in which each link delay follows a distribution described by a set of parameters. Exploiting the multicast structure and the moments of end-to-end data, the parameters of the link delay distribution are estimated by nonlinear least squares, and a one-dimensional Newton search determines the optimal step length in each iteration to achieve fast convergence. Experimental results from MATLAB and NS2 simulations show that the algorithm requires little storage, is simple, and is highly efficient.

Supported by the National Natural Science Foundation of China (41074077, 40774065).
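To make the estimation procedure concrete, the following is a minimal sketch in Python of moment-based nonlinear least squares with a one-dimensional Newton line search. It assumes a two-receiver multicast tree with exponentially distributed link delays; that topology, the distribution family, and every function and variable name are illustrative assumptions rather than details taken from the paper.

```python
# A minimal sketch, assuming a two-receiver multicast tree with exponentially
# distributed link delays; the tree topology, the exponential family, and all
# names here are illustrative assumptions, not specifics from the paper.
#
# Link 0 is the shared link; links 1 and 2 feed receivers 1 and 2, and theta[i]
# is the mean delay of link i.  End-to-end delays are Y1 = X0 + X1 and
# Y2 = X0 + X2, so the model moments are
#   E[Y1] = theta0 + theta1,  E[Y2] = theta0 + theta2,  Cov(Y1, Y2) = theta0**2.
# The parameters are fitted to the sample moments by nonlinear least squares,
# with a one-dimensional Newton search choosing the step length per iteration.
import numpy as np

def model_moments(theta):
    t0, t1, t2 = theta
    return np.array([t0 + t1, t0 + t2, t0 ** 2])

def jacobian(theta):
    t0 = theta[0]
    return np.array([[1.0, 1.0, 0.0],
                     [1.0, 0.0, 1.0],
                     [2.0 * t0, 0.0, 0.0]])

def fit(sample_moments, theta_init, iters=50, tol=1e-10):
    theta = np.asarray(theta_init, dtype=float)
    for _ in range(iters):
        r = model_moments(theta) - sample_moments            # moment residuals
        J = jacobian(theta)
        d = -np.linalg.solve(J.T @ J, J.T @ r)               # Gauss-Newton direction

        # One-dimensional Newton search for the step length alpha along d:
        # minimize phi(alpha) = 0.5 * ||model(theta + alpha*d) - m||^2,
        # using finite differences for phi' and phi''.
        phi = lambda a: 0.5 * np.sum((model_moments(theta + a * d) - sample_moments) ** 2)
        alpha, h = 1.0, 1e-4
        for _ in range(10):
            g = (phi(alpha + h) - phi(alpha - h)) / (2 * h)
            H = (phi(alpha + h) - 2 * phi(alpha) + phi(alpha - h)) / h ** 2
            if H <= 0:                                       # not locally convex: keep full step
                alpha = 1.0
                break
            alpha -= g / H
        theta = theta + alpha * d
        if np.linalg.norm(alpha * d) < tol:
            break
    return theta

# Usage: simulate end-to-end delays, take their sample moments, recover link means.
rng = np.random.default_rng(0)
true_theta = np.array([2.0, 1.0, 3.0])
n = 100_000
x0 = rng.exponential(true_theta[0], n)
y1 = x0 + rng.exponential(true_theta[1], n)
y2 = x0 + rng.exponential(true_theta[2], n)
m = np.array([y1.mean(), y2.mean(), np.cov(y1, y2)[0, 1]])
print(fit(m, np.ones(3)))   # expect roughly [2.0, 1.0, 3.0]
```

In this sketch the Jacobian is square and well conditioned for positive mean delays, so the Gauss-Newton step already solves the linearized moment equations; the one-dimensional Newton search then refines the step length where the residuals are strongly nonlinear in the parameters.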
