
    Tomography-based overlay network monitoring


    NetCluster: a Clustering-Based Framework for Internet Tomography

    Abstract — In this paper, Internet data collected via passive measurement are analyzed to obtain localization information on nodes by clustering (i.e., grouping together) nodes that exhibit similar network path properties. Since traditional clustering algorithms fail to correctly identify clusters of homogeneous nodes, we propose a novel framework, named "NetCluster", suited to the analysis of Internet measurement datasets. We show that the proposed framework correctly analyzes synthetically generated traces. Finally, we apply it to real traces collected at the access link of our campus LAN and discuss the network characteristics as seen at the vantage point.
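
The core idea above, grouping nodes by the similarity of their measured path properties, can be sketched with a plain k-means pass over per-node feature vectors. This is an illustrative traditional baseline, not the NetCluster algorithm itself (which the abstract says improves on such methods), and all feature values below are made up:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means over per-node path-property vectors."""
    rnd = random.Random(seed)
    centers = rnd.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each node to the nearest center (squared Euclidean).
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        # Recompute each center as the mean of its cluster.
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return clusters

# Hypothetical per-node features: (mean RTT to a landmark in ms, loss rate).
nodes = [(10, 0.01), (12, 0.02), (11, 0.015),   # one homogeneous group
         (80, 0.10), (85, 0.12), (90, 0.11)]    # a clearly different group
clusters = kmeans(nodes, k=2)
```

On well-separated synthetic data like this, k-means recovers the two groups; the paper's point is that real Internet measurements are far less well behaved.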

    Large scale probabilistic available bandwidth estimation

    The common utilization-based definition of available bandwidth and many of the existing tools to estimate it suffer from several important weaknesses: i) most tools report a point estimate of average available bandwidth over a measurement interval and do not provide a confidence interval; ii) the commonly adopted models used to relate the available bandwidth metric to the measured data are invalid in almost all practical scenarios; iii) existing tools do not scale well and are not suited to the task of multi-path estimation in large-scale networks; iv) almost all tools use ad-hoc techniques to address measurement noise; and v) tools do not provide enough flexibility in terms of accuracy, overhead, latency and reliability to adapt to the requirements of various applications. In this paper we propose a new definition for available bandwidth and a novel framework that addresses these issues. We define probabilistic available bandwidth (PAB) as the largest input rate at which we can send a traffic flow along a path while achieving, with specified probability, an output rate that is almost as large as the input rate. PAB is expressed directly in terms of the measurable output rate and includes adjustable parameters that allow the user to adapt to different application requirements. Our probabilistic framework to estimate network-wide probabilistic available bandwidth is based on packet trains, Bayesian inference, factor graphs and active sampling. We deploy our tool on the PlanetLab network and our results show that we can obtain accurate estimates with a much smaller measurement overhead compared to existing approaches.
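
The PAB definition can be made concrete with a toy sketch: a simulated packet-train probe over a path with an assumed 50 Mbit/s bottleneck, and an empirical search for the largest candidate rate meeting the probabilistic criterion. The paper's actual estimator uses Bayesian inference and factor graphs; everything below (the channel model, the parameter values) is hypothetical:

```python
import random

_rnd = random.Random(1)  # fixed seed so the sketch is reproducible

def send_train(rate, capacity=50.0, noise=0.2):
    """Toy measurement: output rate of a packet train sent at `rate`.
    A stand-in for a real probe; the assumed bottleneck capacity
    (50 Mbit/s here) is the quantity we are trying to locate."""
    return min(rate, capacity) + _rnd.gauss(0.0, noise)

def pab(rates, eps=0.05, eta=0.9, trials=200):
    """Empirical probabilistic available bandwidth: the largest rate r
    among the candidates with P(output >= (1 - eps) * r) >= eta."""
    best = 0.0
    for r in sorted(rates):
        ok = sum(send_train(r) >= (1 - eps) * r for _ in range(trials))
        if ok / trials >= eta:
            best = r
    return best

estimate = pab([10, 20, 30, 40, 50, 60])  # candidate rates in Mbit/s
```

The tolerance `eps` and target probability `eta` are the adjustable parameters the abstract mentions: tightening either makes the reported PAB more conservative.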

    A Network Coding Approach to Loss Tomography

    Network tomography aims at inferring internal network characteristics based on measurements at the edge of the network. In loss tomography, in particular, the characteristic of interest is the loss rate of individual links and multicast and/or unicast end-to-end probes are typically used. Independently, recent advances in network coding have shown that there are advantages from allowing intermediate nodes to process and combine, in addition to just forward, packets. In this paper, we study the problem of loss tomography in networks with network coding capabilities. We design a framework for estimating link loss rates, which leverages network coding capabilities, and we show that it improves several aspects of tomography including the identifiability of links, the trade-off between estimation accuracy and bandwidth efficiency, and the complexity of probe path selection. We discuss the cases of inferring link loss rates in a tree topology and in a general topology. In the latter case, the benefits of our approach are even more pronounced compared to standard techniques, but we also face novel challenges, such as dealing with cycles and multiple paths between sources and receivers. Overall, this work makes the connection between active network tomography and network coding.
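
For intuition, the classical multicast loss-tomography estimator that this line of work builds on can be written down directly for a two-receiver tree: the shared link's pass rate is identifiable from the correlation between the two receivers. This is the standard MLE baseline, not the network-coding estimator proposed in the paper, and the link pass rates below are made up:

```python
import random

def simulate(n, a0, a1, a2, seed=0):
    """Send n multicast probes down a 2-leaf tree.
    Link 0: root -> branch point; links 1, 2: branch point -> leaves.
    a0..a2 are per-link pass (non-loss) probabilities."""
    rnd = random.Random(seed)
    obs = []
    for _ in range(n):
        root_ok = rnd.random() < a0
        obs.append((root_ok and rnd.random() < a1,
                    root_ok and rnd.random() < a2))
    return obs

def infer(obs):
    """Classical MLE for the shared link: a0 = P(A) P(B) / P(A and B),
    then the leaf links follow by division."""
    n = len(obs)
    pa = sum(a for a, _ in obs) / n
    pb = sum(b for _, b in obs) / n
    p_both = sum(a and b for a, b in obs) / n
    a0 = pa * pb / p_both
    return a0, pa / a0, pb / a0

obs = simulate(100_000, a0=0.9, a1=0.8, a2=0.7)
est = infer(obs)
```

The identity works because losses on the shared link are seen by both receivers, while leaf losses are independent; the paper's contribution is extending identifiability of this kind to general topologies via coding at interior nodes.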

    Network monitoring in multicast networks using network coding

    In this paper we show how information contained in robust network codes can be used for passive inference of possible locations of link failures or losses in a network. For distributed randomized network coding, we bound the probability of being able to distinguish among a given set of failure events, and give some experimental results for one and two link failures in randomly generated networks. We also bound the required field size and complexity for designing a robust network code that distinguishes among a given set of failure events.
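
A minimal sketch of the inference principle, assuming a toy two-link merge rather than any real topology: under random linear coding, distinct failure events almost always produce distinct receiver observations, so the receiver can passively localize the failure from what it decodes. The field size, topology, and probe symbols below are all illustrative choices, not the paper's construction:

```python
import random

P = 257  # small prime field for the random coding coefficients

def receiver_view(failed_links, coeffs):
    """Toy network: probe symbols x1, x2 travel over links 0 and 1 and
    are mixed with random coefficients at a merge node. A failed link
    contributes nothing to the received combination."""
    c0, c1 = coeffs
    x1 = x2 = 1  # fixed probe symbols
    y = 0
    if 0 not in failed_links:
        y = (y + c0 * x1) % P
    if 1 not in failed_links:
        y = (y + c1 * x2) % P
    return y

def distinguishable(seed):
    """Do the failure events {}, {0}, {1}, {0,1} all yield distinct
    receiver observations under one random draw of the code?"""
    rnd = random.Random(seed)
    coeffs = (rnd.randrange(1, P), rnd.randrange(1, P))
    events = [set(), {0}, {1}, {0, 1}]
    return len({receiver_view(e, coeffs) for e in events}) == len(events)

# Over many random code draws, almost every draw separates all events;
# a larger field makes accidental collisions rarer, mirroring the
# field-size bounds discussed in the abstract.
rate = sum(distinguishable(s) for s in range(100)) / 100
```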

    A Cooperative Network Monitoring Overlay

    This paper proposes a flexible network monitoring overlay which resorts to cooperative interaction among measurement points to monitor the quality of network services. The proposed overlay model, which relies on the definition of representative measurement points, the avoidance of measurement redundancy and a simple measurement methodology as main design goals, is able to articulate intra- and inter-area measurements efficiently. The distributed nature of measurement control and data confers to the model the required autonomy, robustness and adaptiveness to accommodate network topology evolution, routing changes or node failures. In addition to these characteristics, the avoidance of explicit addressing and routing at the overlay level, and the low overhead associated with the measurement process, constitute a step forward for deploying large-scale monitoring solutions. A Java prototype was also implemented to test the conceptual model design.

    Network Kriging

    Network service providers and customers are often concerned with aggregate performance measures that span multiple network paths. Unfortunately, forming such network-wide measures can be difficult, due to the issues of scale involved. In particular, the number of paths grows too rapidly with the number of endpoints to make exhaustive measurement practical. As a result, it is of interest to explore the feasibility of methods that dramatically reduce the number of paths measured in such situations while maintaining acceptable accuracy. We cast the problem as one of statistical prediction, in the spirit of the so-called "kriging" problem in spatial statistics, and show that end-to-end network properties may be accurately predicted in many cases using a surprisingly small set of carefully chosen paths. More precisely, we formulate a general framework for the prediction problem, propose a class of linear predictors for standard quantities of interest (e.g., averages, totals, differences) and show that linear algebraic methods of subset selection may be used to effectively choose which paths to measure. We characterize the performance of the resulting methods, both analytically and numerically. The success of our methods derives from the low effective rank of routing matrices as encountered in practice, which appears to be a new observation in its own right with potentially broad implications on network measurement generally.
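
The low-rank observation can be illustrated with a toy routing matrix in which one path's row is a linear combination of two others, so measuring just those two paths predicts the third exactly. The links, paths, and delays below are made up, and real routing matrices are only approximately low rank, which is what makes the subset-selection problem in the paper nontrivial:

```python
from fractions import Fraction as F

# Toy routing matrix: rows are paths, columns are links (1 = path uses link).
G = [
    [1, 1, 0, 0],   # p1
    [0, 0, 1, 1],   # p2
    [1, 1, 1, 1],   # p3 = p1 + p2  (low effective rank)
]
x = [2, 3, 5, 7]    # hypothetical per-link delays (ms)
# Path delays are additive over links: y = G x.
y = [sum(g * xi for g, xi in zip(row, x)) for row in G]

measured = [0, 1]   # measure only p1 and p2
target = 2          # predict p3 without measuring it

# Express row p3 as a linear combination w of the measured rows by
# solving the 2x2 normal equations (A A^T) w = A b exactly.
A = [G[i] for i in measured]
b = G[target]
M = [[sum(F(ai) * F(aj) for ai, aj in zip(A[r], A[c])) for c in range(2)]
     for r in range(2)]
v = [sum(F(ai) * F(bi) for ai, bi in zip(A[r], b)) for r in range(2)]
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
w = [(v[0] * M[1][1] - v[1] * M[0][1]) / det,
     (M[0][0] * v[1] - M[1][0] * v[0]) / det]

# The same combination of the *measured* path delays predicts the target.
y_pred = sum(wi * F(yi) for wi, yi in zip(w, [y[i] for i in measured]))
```

Because p3's row lies exactly in the span of the measured rows here, the linear predictor is exact; in practice the paper selects the measured subset so that the remaining rows are well approximated by that span.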

    The detection and tracking of mine-water pollution from abandoned mines using electrical tomography

    Increasing emphasis is being placed on the environmental and societal impact of mining, particularly in the EU, where the environmental impacts of abandoned mine sites (spoil heaps and tailings) are now subject to the legally binding Water Framework and Mine Waste Directives. Traditional sampling to monitor the impact of mining on surface waters and groundwater is laborious, expensive and often unrepresentative. In particular, sparse and infrequent borehole sampling may fail to capture the dynamic behaviour associated with important events such as flash flooding, mine-water break-out, and subsurface acid mine drainage. Current monitoring practice is therefore failing to provide the information needed to assess the socio-economic and environmental impact of mining on vulnerable ecosystems, or to give adequate early warning to allow preventative maintenance or containment. BGS has developed a tomographic imaging system known as ALERT (Automated time-Lapse Electrical Resistivity Tomography) which allows the near real-time measurement of geoelectric properties "on demand", thereby giving early warning of potential threats to vulnerable water systems. Permanent in-situ geoelectric measurements are used to provide surrogate indicators of hydrochemical and hydrogeological properties. The ALERT survey concept uses electrode arrays permanently buried in shallow trenches at the surface, but these arrays could equally be deployed in mine entries, shafts or underground workings. This sensor network is then interrogated from the office by wireless telemetry (e.g. GSM, low-power radio, internet, and satellite) to provide volumetric images of the subsurface at regular intervals. Once installed, no manual intervention is required; data is transmitted automatically according to a pre-programmed schedule and specific survey parameters, both of which may be varied remotely as conditions change (i.e. an adaptive sampling approach). The entire process from data capture to visualisation on the web portal is seamless, with no manual intervention. Examples are given where ALERT has been installed and used to remotely monitor (i) seawater intrusion in a coastal aquifer, (ii) domestic landfills and contaminated land, and (iii) vulnerable earth embankments. The full potential of the ALERT concept for monitoring mine waste has yet to be demonstrated. However, we have used manual electrical tomography surveys to characterise mine-waste pollution at an abandoned metalliferous mine in the Central Wales orefield in the UK. Hydrogeochemical sampling confirms that electrical tomography can provide a reliable surrogate for the mapping and long-term monitoring of mine-water pollution.