
    Analysis of dependence among size, rate and duration in internet flows

    In this paper we rigorously examine the evidence for dependence among data size, transfer rate and duration in Internet flows. We emphasize two statistical approaches for studying dependence: Pearson's correlation coefficient and the extremal dependence analysis method. We apply these methods to large data sets of packet traces from three networks. Our major results show that Pearson's correlation coefficients between size and duration are much smaller than one might expect. We also find that correlation coefficients between size and rate are generally small and can be strongly affected by applying thresholds to size or duration. Based on Transmission Control Protocol connection startup mechanisms, we argue that thresholds on size should be more useful than thresholds on duration in the analysis of correlations. Using extremal dependence analysis, we draw a similar conclusion, finding remarkable independence for extremal values of size and rate.
    Comment: Published at http://dx.doi.org/10.1214/09-AOAS268 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
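    The thresholding effect described in the abstract can be illustrated with a small sketch. The flows below are synthetic (a heavy-tailed size distribution and a hypothetical rate model), not the paper's packet traces, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic flows: heavy-tailed sizes, TCP-like durations.
size = rng.pareto(1.5, 10_000) * 1e4 + 40              # flow size in bytes
duration = 0.1 + size / rng.uniform(1e4, 1e6, 10_000)  # seconds
rate = size / duration                                 # bytes per second

def pearson(x, y):
    """Pearson's correlation coefficient between two samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

print("corr(size, duration):", pearson(size, duration))
print("corr(size, rate):    ", pearson(size, rate))

# Applying a threshold on size (which the paper argues is more useful
# than a duration threshold) can change the measured correlation.
big = size > 1e5
print("corr(size, rate) | size > 100 kB:", pearson(size[big], rate[big]))
```
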

    Networking - A Statistical Physics Perspective

    Efficient networking has a substantial economic and societal impact in a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve both to gain a better understanding of the properties of networking systems at the macroscopic level and to develop new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches, as they have been developed specifically to deal with non-linear large-scale systems. This paper aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, and probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications.
    Comment: (Review article) 71 pages, 14 figures
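    As one concrete instance of the diffusion-process toolkit mentioned above, the mean hitting time of a random walk on a network, a quantity that underlies several statistical-physics approaches to routing and network exploration, can be estimated by simulation. The ring topology and parameters below are hypothetical:

```python
import random

def random_walk_hitting_time(adj, start, target, trials, rng):
    """Estimate the mean number of steps a simple random walk on the
    graph `adj` (node -> list of neighbours) needs to reach `target`
    from `start`, averaged over `trials` independent walks."""
    total = 0
    for _ in range(trials):
        node, steps = start, 0
        while node != target:
            node = rng.choice(adj[node])   # diffuse to a uniform neighbour
            steps += 1
        total += steps
    return total / trials

# Small ring network with 6 nodes (hypothetical topology); the exact
# mean hitting time from node 0 to the opposite node is k*(n-k) = 9.
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(random_walk_hitting_time(adj, 0, 3, 2000, random.Random(0)))
```
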

    Neyman-Pearson Decision in Traffic Analysis

    The increase of encrypted traffic on the Internet may become a problem for network-security applications such as intrusion-detection systems, and may interfere with forensic investigations. This fact has increased awareness of traffic analysis, i.e., inferring information from communication patterns instead of their content. Deciding correctly whether a known network flow is the same as, or part of, an observed one can be extremely useful for several network-security applications such as intrusion detection and tracing anonymous connections. In many cases, the flows of interest are relayed through many nodes that re-encrypt the flow, making traffic analysis the only possible solution. There exist two well-known techniques to solve this problem: passive traffic analysis and flow watermarking. The former is undetectable but in general performs much worse than watermarking, whereas the latter can be detected and modified in such a way that the watermark is destroyed. In the first part of this dissertation we design techniques where the traffic analyst (TA) is one end of an anonymous communication and wants to deanonymize the other host, under the premise that the arrival time of the TA's packets/requests can be predicted with high confidence. This, together with the use of an optimal detector based on the Neyman-Pearson lemma, allows the TA to deanonymize the other host with high confidence even with short flows. We start by studying the forensic problem of leaving identifiable traces in the log of a Tor hidden service; in this case the predictor comes from the HTTP header. Afterwards, we propose two different methods for locating Tor hidden services: the first is based on the arrival time of the request cell, and the second uses the number of cells in certain time intervals.
In both of these methods, the predictor is based on the round-trip time and, in some cases, on the position inside a burst; hence the TA does not need access to the decrypted flow. The second part of this dissertation deals with scenarios where an accurate predictor is not feasible for the TA. Here, the traffic analysis technique is based on correlating the inter-packet delays (IPDs) using a Neyman-Pearson detector. Our method can be used as a passive analysis or as a watermarking technique. The algorithm is first made robust against adversary models that add chaff traffic, split the flows or add random delays. Afterwards, we study this scenario from a game-theoretic point of view, analyzing two different games: the first deals with the identification of independent flows, while the second decides whether a flow has been watermarked/fingerprinted or not.
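    The Neyman-Pearson IPD-correlation idea can be sketched as a likelihood-ratio test. The Gaussian jitter model and the SIGMA/TAU values below are illustrative assumptions, not the dissertation's exact models:

```python
import numpy as np

rng = np.random.default_rng(1)

SIGMA = 0.005   # assumed network-jitter std dev (seconds); illustrative
TAU = 0.02      # assumed IPD spread under H0 (independent flows); illustrative

def np_log_likelihood_ratio(ipd_in, ipd_out, sigma=SIGMA, tau=TAU):
    """Log-likelihood ratio for H1 (linked flows: out = in + Gaussian
    jitter) versus H0 (independent flows: IPD differences ~ N(0, 2*tau^2)).
    A Neyman-Pearson test thresholds this statistic so as to fix the
    false-positive rate."""
    d = np.asarray(ipd_out) - np.asarray(ipd_in)
    ll1 = -0.5 * np.sum(d**2) / sigma**2 - d.size * np.log(sigma)
    ll0 = -0.5 * np.sum(d**2) / (2 * tau**2) - d.size * np.log(np.sqrt(2) * tau)
    return ll1 - ll0

# A linked flow reproduces the input IPDs plus small jitter;
# an independent flow has unrelated IPDs.
ipd = rng.exponential(0.05, 200)
linked = ipd + rng.normal(0, SIGMA, ipd.size)
independent = rng.exponential(0.05, 200)

print(np_log_likelihood_ratio(ipd, linked))       # large positive: decide H1
print(np_log_likelihood_ratio(ipd, independent))  # large negative: decide H0
```
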

    Analysis of Buffer Starvation with Application to Objective QoE Optimization of Streaming Services

    Our purpose in this paper is to characterize buffer starvations for streaming services. The buffer is modeled as an M/M/1 queue, with bursty arrivals also considered. When the buffer is empty, the service restarts after a certain number of packets are \emph{prefetched}. With this goal, we propose two approaches to obtain the \emph{exact distribution} of the number of buffer starvations: one is based on the \emph{Ballot theorem}, and the other uses recursive equations. The Ballot theorem approach gives an explicit result. We extend this approach to the scenario with a constant playback rate using Takács' Ballot theorem. The recursive approach, though not offering an explicit result, can obtain the distribution of starvations with a non-independent and identically distributed (i.i.d.) arrival process; an ON/OFF bursty arrival process is considered in this work. We further compute the starvation probability as a function of the number of prefetched packets for a large number of files via a fluid analysis. Among many potential applications of starvation analysis, we show how to apply it to optimize the objective quality of experience (QoE) of media streaming, by exploiting the tradeoff between startup/rebuffering delay and starvations.
    Comment: 9 pages, 7 figures; IEEE Infocom 201
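    The prefetching/starvation tradeoff can be sketched with a Monte-Carlo simulation: in the M/M/1 model, each event is an arrival with probability p = lambda/(lambda+mu) or a playback departure otherwise, a geometric race. This is an illustrative sketch, not the paper's Ballot-theorem or recursive computation, and all parameters are invented:

```python
import random

def starvation_count(n_packets, prefetch, p_arrival, rng):
    """Simulate one file: after every (re)prefetch of `prefetch` packets,
    playback races against arrivals; return how many times the buffer
    empties (starves) before all n_packets are played."""
    arrived = buffer = prefetch
    played = starvations = 0
    while played < n_packets:
        if arrived >= n_packets:        # fully downloaded: no more starvation
            break
        if rng.random() < p_arrival:    # next event: a packet arrives
            arrived += 1
            buffer += 1
        else:                           # next event: a packet is played
            buffer -= 1
            played += 1
            if buffer == 0 and played < n_packets:
                starvations += 1
                refill = min(prefetch, n_packets - arrived)
                arrived += refill       # re-prefetch before resuming playback
                buffer = refill
    return starvations

rng = random.Random(0)
trials = 500
for prefetch in (5, 20, 50):
    mean = sum(starvation_count(500, prefetch, 0.49, rng)
               for _ in range(trials)) / trials
    print(f"prefetch={prefetch:3d}  mean starvations per file: {mean:.2f}")
```

Larger prefetch thresholds reduce starvations at the cost of longer startup/rebuffering delay, which is exactly the QoE tradeoff the paper optimizes.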

    Modern Random Access for Satellite Communications

    The present PhD dissertation focuses on modern random access (RA) techniques. In the first part, a slot- and frame-asynchronous RA scheme adopting replicas, successive interference cancellation and combining techniques is presented and its performance analysed. A comparison of slot-synchronous and asynchronous RA at the higher layer follows. Next, the optimization procedure for slot-synchronous RA with irregular repetitions is extended to the Rayleigh block fading channel. Finally, random access with multiple receivers is considered.
    Comment: PhD Thesis, 196 pages
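    The replica-plus-SIC mechanism can be sketched for the slot-synchronous case with a CRDSA/IRSA-style toy simulation (regular repetitions, collision-channel model, illustrative parameters):

```python
import random

def sic_decode(n_users, n_slots, replicas, rng):
    """One frame of slotted random access with repetitions and successive
    interference cancellation: each user transmits `replicas` copies in
    distinct random slots; iteratively, any slot holding a single
    undecoded user is decoded, and all of that user's replicas are
    cancelled from their slots. Returns the number of decoded users."""
    slots = [set() for _ in range(n_slots)]
    placement = []
    for u in range(n_users):
        chosen = rng.sample(range(n_slots), replicas)
        placement.append(chosen)
        for s in chosen:
            slots[s].add(u)
    decoded = set()
    progress = True
    while progress:
        progress = False
        for s in range(n_slots):
            if len(slots[s]) == 1:          # singleton slot: decodable
                u = next(iter(slots[s]))
                decoded.add(u)
                for t in placement[u]:      # interference cancellation
                    slots[t].discard(u)
                progress = True
    return len(decoded)

rng = random.Random(0)
frames = 200
for load in (0.5, 0.8):                     # offered load in users per slot
    n_slots, n_users = 100, int(load * 100)
    avg = sum(sic_decode(n_users, n_slots, 2, rng) for _ in range(frames)) / frames
    print(f"load {load}: avg decoded {avg:.1f} of {n_users}")
```
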

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided that covers, inter alia, rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by the correlation of individual temporally distributed events within a multiple data stream environment is explored, and a range of techniques is reviewed, covering model-based approaches, `programmed' AI and machine learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base, and the lack of ability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and uses this to screen events. This approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to `learn' the features of event patterns that constitute normal behaviour and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred. This approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise.
Contemporary approaches are seen to favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour: the former to capture known cases of misuse, the latter to capture unknown cases. In some systems, these mechanisms even work together, updating each other to increase detection rates and lower false positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule- or state-based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems such that learning, generalisation and adaptation are more readily facilitated.
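    The hybrid signature-plus-anomaly idea the report favours can be sketched in a few lines. The event fields, rule names and thresholds below are invented for illustration, not taken from any surveyed system:

```python
from collections import Counter

# Signature component: predicates encoding *known* misuses (hypothetical rules).
RULES = {
    "repeated_auth_failure": lambda e: e["type"] == "auth_fail" and e.get("count", 0) >= 5,
    "premium_rate_burst":    lambda e: e["type"] == "call" and e.get("dest", "").startswith("900"),
}

class HybridDetector:
    """Sketch of a hybrid scheme: rules catch known misuses (prone to
    false negatives on novel attacks), while a profile of normal event
    types flags unseen behaviour (prone to false positives)."""

    def __init__(self, normal_events):
        # Anomaly component: learn which event types occur in normal traffic.
        self.profile = Counter(e["type"] for e in normal_events)

    def check(self, event):
        for name, rule in RULES.items():        # programmed knowledge first
            if rule(event):
                return ("misuse", name)
        if self.profile[event["type"]] == 0:    # never seen in training data
            return ("anomaly", event["type"])
        return ("ok", None)

normal = [{"type": "call", "dest": "555"}, {"type": "sms"}] * 50
det = HybridDetector(normal)
print(det.check({"type": "auth_fail", "count": 7}))   # caught by a rule
print(det.check({"type": "roaming_data"}))            # unseen type: anomaly
print(det.check({"type": "sms"}))                     # normal behaviour
```

In a fuller hybrid system the two components would also update each other, e.g. confirmed anomalies being distilled into new rules.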

    Network coding-aided MAC protocols for cooperative wireless networks

    The introduction of third generation (3G) technologies has caused a vast proliferation of wireless devices and networks, generating an increasing demand for a high level of Quality of Service (QoS). The widespread adoption of mobile applications has further reinforced the user need for communication, motivating at the same time the concepts of user cooperation and data dissemination. However, this trend towards continuous exchange of information and ubiquitous connectivity is inherently restricted by the energy-greedy functionalities of high-end devices. These limitations, along with the pressure exerted on the Information and Communications Technology (ICT) industry towards energy awareness, have driven the design of novel energy-efficient schemes and algorithms. In this context, the Medium Access Control (MAC) layer plays a key role, since it is mainly responsible for channel access regulation, transmission scheduling and resource allocation, thus constituting an appropriate point at which to address the energy efficiency issues that arise due to user overcrowding. This dissertation contributes to the design, analysis and evaluation of novel MAC protocols for cooperative wireless networks. In our attempt to design energy-efficient MAC schemes, we were extensively assisted by the introduction of new techniques, such as Network Coding (NC), that intrinsically bring considerable gains in system performance. The main thesis contributions are divided into two parts. The first part presents NCCARQ, a novel NC-aided Cooperative Automatic Repeat reQuest (ARQ) MAC protocol for wireless networks. NCCARQ introduces a new access paradigm for cooperative ARQ schemes, exploiting NC benefits in bidirectional communication among wireless users.
The NCCARQ performance in terms of QoS and energy efficiency is assessed by means of analytical probabilistic models and extensive computer-based simulations, revealing the significant gains that can be achieved compared to standardized MAC solutions. In addition, the impact of realistic wireless channel conditions on the MAC protocol operation further motivated us to study the NCCARQ performance in wireless links affected by correlated shadowing, showing that channel correlation may adversely affect the distributed cooperation benefits. The second part of the thesis is dedicated to the investigation of MAC issues in wireless data dissemination scenarios. In particular, the existence of multiple source nodes in such scenarios generates conflicting situations, given the selfish behavior of wireless devices that want to maximize their battery lifetime. Bearing in mind the importance of energy efficiency, we propose game-theoretic medium access strategies, applying energy-based utility functions which inherently imply energy awareness. In addition, Random Linear NC (RLNC) techniques are adopted to eliminate the need for exchanging excessive control packets, while Analog NC (ANC) is employed to remove the impact of collisions throughout the communication. During the elaboration of this thesis, two general key conclusions have been drawn. First, there is a fundamental requirement for the implementation of new MAC protocols in order to deal effectively with state-of-the-art techniques (e.g., NC) recently introduced to enhance both the performance and the energy efficiency of the network. Second, we highlight the importance of designing novel energy-efficient MAC protocols, taking into account that traditional approaches - designed mainly to assist collision avoidance in wireless networks - tend to become obsolete. This doctoral thesis thus contributes to the design, analysis and evaluation of new cooperative MAC protocols for wireless networks.
The introduction of new techniques, such as network coding (NC), which intrinsically bring a considerable increase in system performance, assisted us extensively in the design of energy-efficient MAC protocols. The main contributions of this thesis are divided into two parts. The first part presents NCCARQ, a cooperative automatic repeat request (ARQ) protocol assisted by NC for wireless networks. The second part of the thesis focuses on the design of MAC-layer protocols in wireless data dissemination scenarios. Taking into account the importance of energy efficiency, game-theoretic medium access techniques are proposed in which the objective functions are driven by energy consumption. The proposed solutions are evaluated by means of analytical models and computer simulations.
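    The basic gain that NC-aided cooperative ARQ schemes such as NCCARQ exploit in bidirectional relaying can be shown with a minimal XOR example (packet contents are illustrative):

```python
def xor_bytes(a, b):
    """XOR two equal-length packets: the elementary network-coding operation."""
    return bytes(x ^ y for x, y in zip(a, b))

# Bidirectional relay scenario: nodes A and B each want the other's packet.
# Instead of retransmitting both packets, the relay broadcasts one coded
# packet, halving the number of transmissions for the exchange.
pkt_a = b"HELLO FROM NODE A"
pkt_b = b"GREETINGS FROM B!"
assert len(pkt_a) == len(pkt_b)       # equal lengths; padding handled elsewhere

coded = xor_bytes(pkt_a, pkt_b)       # the single relay transmission

# Each node recovers the other's packet using the packet it already knows.
assert xor_bytes(coded, pkt_a) == pkt_b
assert xor_bytes(coded, pkt_b) == pkt_a
```
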