
    Resource dimensioning through buffer sampling

    Link dimensioning, i.e., selecting a (minimal) link capacity such that the users’ performance requirements are met, is a crucial component of network design. It requires insight into the interrelationship among the traffic offered (in terms of the mean offered load $M$, but also its fluctuation around the mean, i.e., ‘burstiness’), the envisioned performance level, and the capacity needed. We first derive, for different performance criteria, theoretical dimensioning formulas that estimate the required capacity $C$ as a function of the input traffic and the performance target. For the special case of Gaussian input traffic, these formulas reduce to $C = M + \alpha V$, where $\alpha$ directly relates to the performance requirement (as agreed upon in a service level agreement) and $V$ reflects the burstiness (at the timescale of interest). We also observe that Gaussianity applies for virtually all realistic scenarios; notably, already at a relatively low aggregation level, the Gaussianity assumption is justified. As estimating $M$ is relatively straightforward, the remaining open issue concerns the estimation of $V$. We argue that, particularly if $V$ corresponds to small timescales, it may be inaccurate to estimate it directly from the traffic traces. Therefore, we propose an indirect method that samples the buffer content, estimates the buffer content distribution, and ‘inverts’ this to the variance. We validate the inversion through extensive numerical experiments (using a sizeable collection of traffic traces from various representative locations); the resulting estimate of $V$ is then inserted in the dimensioning formula. These experiments show that both the inversion and the dimensioning formula are remarkably accurate.
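
    The dimensioning rule $C = M + \alpha V$ is simple enough to sketch. The snippet below is a minimal, hypothetical illustration of the direct, trace-based estimation of $M$ and $V$ that the abstract argues can be inaccurate at small timescales; the function name, the synthetic trace, and the value of $\alpha$ are assumptions, and the paper's own contribution is the indirect, buffer-sampling estimate of $V$, which is not reproduced here.

```python
import numpy as np

def required_capacity(bits_per_slot, slot_seconds, alpha):
    """Sketch of the Gaussian dimensioning formula C = M + alpha * V.

    bits_per_slot : traffic volume (bits) in consecutive slots of length
                    slot_seconds (the timescale of interest).
    alpha         : factor derived from the SLA performance target (assumed).
    Returns an estimate of the required link capacity in bits/s.
    """
    rates = np.asarray(bits_per_slot, dtype=float) / slot_seconds  # bits/s per slot
    M = rates.mean()        # mean offered load
    V = rates.var(ddof=1)   # burstiness at this timescale (direct trace-based estimate)
    return M + alpha * V

# Example on a synthetic trace: 100 ms slots, illustrative alpha
rng = np.random.default_rng(0)
slots = rng.lognormal(mean=17, sigma=0.3, size=3000)  # synthetic per-slot volumes
print(required_capacity(slots, slot_seconds=0.1, alpha=1e-7))
```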

    Estimation of Scale and Hurst Parameters of Semi-Selfsimilar Processes

    The characteristic feature of a semi-selfsimilar process is the invariance of its finite-dimensional distributions under dilation by a specific scaling factor. Estimating the scale parameter $\lambda$ and the Hurst index of such processes is one of the fundamental problems in the literature. We present an iterative method for estimating the scale and Hurst parameters, addressed to semi-selfsimilar processes with stationary increments. This method is based on a flexible sampling scheme and on evaluating the sample variance of increments in each scale interval $[\lambda^{n-1}, \lambda^n)$, $n \in \mathbb{N}$. For this iterative method we find an initial estimate of the scale parameter by evaluating the cumulative sum of moving sample variances, and also by evaluating the sample variance of the preceding and succeeding moving sample variances of increments. We also present a new, efficient method for estimating the Hurst parameter of selfsimilar processes. As an example we introduce simple fractional Brownian motion (sfBm), which is semi-selfsimilar with stationary increments. We present simulations and numerical evaluations to illustrate the results and to estimate the scale for sfBm as a semi-selfsimilar process. We also present another simulation and show the efficiency of our method in estimating the Hurst parameter by comparing its performance with some previous methods. Comment: 15 pages
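
    Since the method rests on how increment variances scale, a generic sketch may help. The snippet below is not the paper's iterative scale-estimation procedure; it is a standard variance-of-increments Hurst estimator for a selfsimilar process with stationary increments, exploiting Var[X(t+s) − X(t)] ∝ s^{2H}. The chosen scales, the synthetic Brownian path, and the function name are assumptions.

```python
import numpy as np

def hurst_from_increment_variances(x, scales=(1, 2, 4, 8, 16, 32)):
    """Generic variance-of-increments Hurst estimator (not the paper's scheme):
    for a selfsimilar process with stationary increments,
    Var[X(t+s) - X(t)] ~ s^{2H}, so 2H is the slope of a log-log regression."""
    x = np.asarray(x, dtype=float)
    log_s, log_v = [], []
    for s in scales:
        inc = x[s:] - x[:-s]                      # increments at lag s
        log_s.append(np.log(s))
        log_v.append(np.log(inc.var(ddof=1)))
    slope, _ = np.polyfit(log_s, log_v, 1)
    return slope / 2.0                            # Var ~ s^{2H}

# Example on a standard Brownian motion path (true H = 0.5)
rng = np.random.default_rng(0)
bm = np.cumsum(rng.normal(size=100_000))
print(hurst_from_increment_variances(bm))
```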

    Efficient estimation of blocking probabilities in non-stationary loss networks

    This paper considers estimation of blocking probabilities in a non-stationary loss network. Invoking the so-called MOL (Modified Offered Load) approximation, the problem is transformed into one requiring the solution of blocking probabilities in a sequence of stationary loss networks with time-varying loads. Monte Carlo simulation is used to estimate the blocking probabilities, and to increase its efficiency we develop a likelihood ratio method that enables samples drawn at one time point to be reused at later time points. This reduces the need to draw new samples independently each time a new time point is considered, giving substantial savings in the computational effort of evaluating time-dependent blocking probabilities. The accuracy of the method is analyzed using Taylor series approximations of the variance, indicating that the accuracy depends directly on the rate of change of the actual load. Finally, three practical applications of the method are provided, along with numerical examples, to demonstrate the efficiency of the method.
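
    The reuse of samples across time points follows the usual importance-sampling identity: weight each sample by the ratio of the target-load density to the sampling-load density. The sketch below illustrates this on a single Erlang link rather than the paper's full MOL loss-network setting; the truncated-Poisson formulation, parameter values, and function name are assumptions made for illustration only.

```python
import numpy as np

def erlang_blocking_lr(rho_base, rho_targets, capacity, n_samples=200_000, seed=0):
    """Likelihood-ratio sketch for a single Erlang link: draw Poisson samples
    once under a base offered load rho_base, then reweight them to estimate
    blocking under other loads rho_t without resampling.

    Blocking is B(rho) = P(N = C | N <= C) with N ~ Poisson(rho)."""
    rng = np.random.default_rng(seed)
    n = rng.poisson(rho_base, size=n_samples)    # drawn once, reused below
    feasible = n <= capacity
    blocked = n == capacity
    out = {}
    for rho_t in rho_targets:
        # Poisson likelihood ratio p_t(n) / p_base(n)
        w = np.exp(rho_base - rho_t) * (rho_t / rho_base) ** n
        out[rho_t] = (w * blocked).sum() / (w * feasible).sum()
    return out

print(erlang_blocking_lr(rho_base=10.0, rho_targets=[8.0, 10.0, 12.0], capacity=15))
```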

    Measuring Information Leakage in Website Fingerprinting Attacks and Defenses

    Tor provides low-latency anonymous and uncensored network access against a local or network adversary. Due to the design choice to minimize traffic overhead (and increase the pool of potential users), Tor allows some information about the client's connections to leak. Attacks using (features extracted from) this information to infer the website a user visits are called Website Fingerprinting (WF) attacks. We develop a methodology and tools to measure the amount of leaked information about a website. We apply these tools to a comprehensive set of features extracted from a large set of websites and WF defense mechanisms, allowing us to make more fine-grained observations about WF attacks and defenses. Comment: In Proceedings of the 2018 ACM SIGSAC Conference on Computer and Communications Security (CCS '18).
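
    The central quantity, the number of bits a traffic feature leaks about the visited website, can be sketched with a simple histogram-based mutual-information estimate. This is not the paper's estimator, which is more refined; the chosen feature, binning, and synthetic data below are assumptions.

```python
import numpy as np
from collections import Counter

def feature_leakage_bits(labels, feature_values, bins=30):
    """Rough sketch of how much information a single traffic feature leaks about
    the visited website: estimate I(W; F) = H(W) - H(W | F) with a histogram
    estimator over binned feature values."""
    labels = np.asarray(labels)
    f = np.digitize(feature_values, np.histogram_bin_edges(feature_values, bins))

    def entropy(counts):
        p = np.asarray(list(counts.values()), dtype=float)
        p /= p.sum()
        return -(p * np.log2(p)).sum()

    h_w = entropy(Counter(labels))                      # prior uncertainty about the site
    h_w_given_f = 0.0
    for b in np.unique(f):
        mask = f == b
        h_w_given_f += mask.mean() * entropy(Counter(labels[mask]))
    return h_w - h_w_given_f                            # leaked bits about the website

# Example with synthetic traces: total packet count as the feature
rng = np.random.default_rng(1)
sites = rng.integers(0, 50, size=5000)
packet_counts = rng.normal(loc=200 + 10 * sites, scale=40)
print(feature_leakage_bits(sites, packet_counts))
```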

    An evaluation of study design for estimating a time-of-day noise weighting

    The relative importance of daytime and nighttime noise of the same noise level is represented by a time-of-day weight in noise annoyance models. The high correlation between daytime and nighttime noise levels was regarded as a major reason that previous social surveys of noise annoyance could not accurately estimate the value of the time-of-day weight. Study designs that would reduce the correlation between daytime and nighttime noise are described. It is concluded that designs based on short-term variations in nighttime noise levels would not be able to provide valid measures of response to nighttime noise. The accuracy of the estimate of the time-of-day weight is predicted for designs based on long-term variations in nighttime noise levels. For these designs it is predicted that it is not possible to form satisfactorily precise estimates of the time-of-day weighting.
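
    The point that high day/night correlation ruins the precision of the time-of-day weight can be illustrated with a small simulation. The linear annoyance model, level distributions, and true weight below are assumptions made only for illustration, not values from the study.

```python
import numpy as np

def weight_spread(corr, n=500, reps=300, seed=0):
    """Illustrative sketch: annoyance depends linearly on daytime level Ld and
    nighttime level Ln; the relative weight of nighttime noise is recovered by
    least squares. The Monte Carlo spread of the recovered weight shows how
    collinearity between Ld and Ln degrades the estimate."""
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(reps):
        ld = rng.normal(65, 5, n)                                     # daytime levels
        ln = 55 + corr * (ld - 65) + np.sqrt(1 - corr**2) * rng.normal(0, 5, n)
        annoyance = 0.6 * ld + 1.2 * ln + rng.normal(0, 8, n)          # true weight ratio 2.0
        X = np.column_stack([np.ones(n), ld, ln])
        _, b_day, b_night = np.linalg.lstsq(X, annoyance, rcond=None)[0]
        estimates.append(b_night / b_day)                              # recovered weight
    return np.std(estimates)

for r in (0.3, 0.9, 0.99):
    print(f"day/night correlation {r}: spread of weight estimate {weight_spread(r):.2f}")
```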

    Development and evaluation of land use regression models for black carbon based on bicycle and pedestrian measurements in the urban environment

    Land use regression (LUR) modelling is increasingly used in epidemiological studies to predict air pollution exposure. The use of stationary measurements at a limited number of locations to build a LUR model, however, can lead to an overestimation of its predictive abilities. We use opportunistic mobile monitoring to gather data at a high spatial resolution and build LUR models to predict annual average concentrations of black carbon (BC). The models explain a significant part of the variance in BC concentrations. However, the overall predictive performance remains low, due to input uncertainty and a lack of predictive variables that can properly capture the complex characteristics of local concentrations. We stress the importance of using an appropriate cross-validation scheme to estimate the predictive performance of the model. By using independent data for validation, and by excluding those data from variable selection during model building, overly optimistic performance estimates are avoided.
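
    The cross-validation point generalises beyond LUR modelling: if variable selection sees the validation data, the estimated performance is inflated. The scikit-learn sketch below contrasts the two schemes on synthetic data; the predictor counts, sample size, and model choices are assumptions, not the paper's models.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

# Synthetic stand-in for LUR data: many candidate land-use predictors, one signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 60))             # candidate predictor variables
y = 2.0 * X[:, 0] + rng.normal(size=200)   # proxy for BC concentrations

cv = KFold(n_splits=10, shuffle=True, random_state=0)

# Optimistic: variables selected on all data, only the regression is cross-validated.
selected = SelectKBest(f_regression, k=10).fit_transform(X, y)
print("selection outside CV:", cross_val_score(LinearRegression(), selected, y, cv=cv).mean())

# Honest: selection is refit within every training fold of the cross-validation.
pipe = Pipeline([("select", SelectKBest(f_regression, k=10)),
                 ("ols", LinearRegression())])
print("selection inside CV :", cross_val_score(pipe, X, y, cv=cv).mean())
```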