1,091 research outputs found

    KISS: Stochastic Packet Inspection Classifier for UDP Traffic

    This paper proposes KISS, a novel Internet classification engine. Motivated by the expected rise of UDP traffic, which stems from the momentum of Peer-to-Peer (P2P) streaming applications, we propose a novel classification framework that leverages statistical characterization of payload. Statistical signatures are derived by means of a Chi-Square-like test, which extracts the protocol "format" but ignores the protocol "semantic" and "synchronization" rules. The signatures feed a decision process based either on the geometric distance among samples or on Support Vector Machines. KISS is very accurate, and its signatures are intrinsically robust to packet sampling, reordering, and flow asymmetry, so that it can be used on almost any network. KISS is tested in different scenarios, considering traditional client-server protocols, VoIP, and both traditional and new P2P Internet applications. Results are astonishing: the average True Positive percentage is 99.6%, with a worst case of 98.1%, while results are almost perfect when dealing with new P2P streaming applications.
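
    The Chi-Square-like signature lends itself to a compact sketch. Assuming 4-bit chunks of the first 12 payload bytes and a uniform reference distribution (assumptions for illustration; the paper defines its own windowing and reference), each window of packets yields one statistic per chunk position:

```python
from collections import Counter

def kiss_signature(payloads, n_bytes=12):
    """Chi-Square-like statistic per 4-bit chunk of the first payload bytes,
    computed over a window of packets. Chunk size, window handling, and the
    uniform reference are assumptions, not the paper's exact definitions."""
    n_chunks = 2 * n_bytes                       # two 4-bit chunks per byte
    counts = [Counter() for _ in range(n_chunks)]
    for p in payloads:
        for i in range(min(n_bytes, len(p))):
            counts[2 * i][p[i] >> 4] += 1        # high nibble
            counts[2 * i + 1][p[i] & 0x0F] += 1  # low nibble
    sig = []
    for c in counts:
        total = sum(c.values())
        if total == 0:
            sig.append(0.0)
            continue
        expected = total / 16.0                  # uniform reference over 16 symbols
        sig.append(sum((c[v] - expected) ** 2 / expected for v in range(16)))
    return sig  # near 0 for random-looking chunks, large for constant fields

# Hypothetical window of UDP payloads: constant first byte, varying second byte.
window = [bytes([0x80, n]) + bytes(10) for n in range(32)]
print(kiss_signature(window))
```

    In the paper these vectors then feed either a distance-based or an SVM decision stage; any off-the-shelf SVM implementation could play that role in this sketch.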

    Characterization of ISP Traffic: Trends, User Habits, and Access Technology Impact

    In recent years, the research community has increased its focus on network monitoring, which is seen as a key tool to understand the Internet and its users. Several studies have presented a deep characterization of a particular application or a particular network, from the point of view of either the ISP or the Internet user. In this paper, we take a different perspective. We focus on three European countries where we have been collecting traffic for more than a year and a half through 5 vantage points with different access technologies. This humongous amount of information allows us not only to provide precise, multiple, and quantitative measurements of "what users do with the Internet" in each country, but also to identify common and uncommon patterns and habits across countries. Considering different time scales, we start by presenting the trend of application popularity; we then focus on a one-month-long period, and further drill into a typical daily characterization of user activity. Results depict an evolving scenario, driven by the consolidation of new services such as Video Streaming and File Hosting and by the adoption of new P2P technologies. Despite the heterogeneity of the users, some common tendencies emerge that ISPs can leverage to improve their services.
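
    The application-popularity trends described above amount to aggregating labeled flow records into per-day traffic shares. A minimal sketch of that aggregation, with entirely hypothetical column names and data:

```python
import pandas as pd

# Hypothetical flow log: one row per flow, with timestamp, application label,
# and byte count (names and values invented for illustration).
flows = pd.DataFrame({
    "ts":    pd.to_datetime(["2009-03-01 10:00", "2009-03-01 18:30",
                             "2009-03-02 21:00", "2009-03-02 22:15"]),
    "app":   ["video_streaming", "p2p", "file_hosting", "video_streaming"],
    "bytes": [5_000_000, 12_000_000, 8_000_000, 9_000_000],
})

# Daily per-application traffic share: the basic view behind popularity trends.
daily = flows.groupby([flows["ts"].dt.date, "app"])["bytes"].sum().unstack(fill_value=0)
shares = daily.div(daily.sum(axis=1), axis=0)
print(shares.round(2))
```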

    Mining Unclassified Traffic Using Automatic Clustering Techniques

    In this paper we present a fully unsupervised algorithm to identify classes of traffic inside an aggregate. The algorithm leverages the K-means clustering algorithm, augmented with a mechanism to automatically determine the number of traffic clusters. The signatures used for clustering are statistical representations of the application-layer protocols. The proposed technique is extensively tested on UDP traffic traces collected from operational networks. Performance tests show that it can cluster the traffic into a few tens of pure clusters, achieving an accuracy above 95%. Results are promising and suggest that the proposed approach might effectively be used for automatic traffic monitoring, e.g., to identify the birth of new applications and protocols, or the presence of anomalous or unexpected traffic.
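
    A minimal sketch of the overall idea follows, assuming a silhouette-based rule for picking the number of clusters (an assumption standing in for the paper's own mechanism, which is not reproduced here):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def auto_kmeans(X, k_max=50, random_state=0):
    """Run K-means for growing k and keep the clustering with the best
    silhouette score. This stopping rule is an assumption; the paper
    defines its own mechanism for choosing the number of clusters."""
    best = (-1.0, None, None)                    # (score, k, model)
    for k in range(2, k_max + 1):
        model = KMeans(n_clusters=k, n_init=10, random_state=random_state).fit(X)
        score = silhouette_score(X, model.labels_)
        if score > best[0]:
            best = (score, k, model)
    return best[1], best[2]

# Placeholder feature matrix standing in for statistical payload signatures.
X = np.random.rand(300, 24)
k, model = auto_kmeans(X, k_max=10)
print("chosen number of clusters:", k)
```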

    40 Gbps Access for Metro networks: Implications in terms of Sustainability and Innovation from an LCA Perspective

    In this work, the implications of new technologies, more specifically new optical FTTH technologies, are studied from both functional and non-functional perspectives. In particular, some direct impacts are listed in the form of the abandonment of non-functional technologies, such as micro-registration, which were implicitly required for a functioning operation before the arrival of the new high-bandwidth access technologies. It is shown that abandoning such non-functional best practices, which lie mainly at the management level of ICT, immediately results in additional consumption and environmental footprint, and also risks 'missing' other new innovations. Therefore, unconstrained deployment of these access technologies is not aligned with a sustainable ICT picture unless it is regulated. An approach to pricing best practices, covering both functional and non-functional technologies, is proposed in order to develop a regulation and policy framework for sustainable broadband access.
    Comment: 10 pages, 6 tables, 1 figure. Accepted for presentation at the ICT4S'15 Conference.

    Dynamic Resource Management in Clouds: A Probabilistic Approach

    Dynamic resource management has become an active area of research in the Cloud Computing paradigm. The cost of resources varies significantly depending on their configuration, so efficient resource management is of prime interest to both Cloud Providers and Cloud Users. In this work we propose a probabilistic resource provisioning approach that can be exploited as the input of a dynamic resource management scheme. Using a Video on Demand use case to justify our claims, we propose an analytical model, inspired by standard epidemic-spreading models, to represent sudden and intense workload variations. We show that the resulting model satisfies a Large Deviation Principle that statistically characterizes rare, extreme events, such as those produced by "buzz/flash crowd effects" that may cause workload overflow in the VoD context. This analysis provides valuable insight into the abnormal behaviors systems may exhibit. We exploit the information obtained from the Large Deviation Principle to define policies (Service Level Agreements) for the proposed Video on Demand use case. We believe these policies for elastic resource provisioning and usage may be of interest to all stakeholders in the emerging context of cloud networking.
    Comment: IEICE Transactions on Communications (2012). arXiv admin note: substantial text overlap with arXiv:1209.515
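
    To make the modeling idea concrete, the following sketch simulates buzz-like workload surges with a discrete-time SIR-style epidemic model and estimates the tail probability of the peak workload by Monte Carlo; model form, parameters, and threshold are all illustrative, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def buzz_peak(n=10_000, beta=0.3, gamma=0.1, i0=5, steps=200):
    """Discrete-time SIR-style sketch of a buzz: 'infected' = active viewers.
    Model form and parameters are illustrative, not the paper's exact model."""
    s, i, peak = n - i0, i0, i0
    for _ in range(steps):
        new_viewers = rng.binomial(s, 1.0 - np.exp(-beta * i / n))  # contagion
        departures = rng.binomial(i, gamma)                          # churn
        s, i = s - new_viewers, i + new_viewers - departures
        peak = max(peak, i)
    return peak

# Monte Carlo estimate of a rare-event tail probability: the kind of
# quantity a Large Deviation Principle characterizes analytically.
peaks = np.array([buzz_peak() for _ in range(500)])
threshold = 3000
print("P(peak > %d) ~ %.3f" % (threshold, (peaks > threshold).mean()))
```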

    CHORUS Deliverable 2.2: Second report - identification of multi-disciplinary key issues for gap analysis toward EU multimedia search engines roadmap

    After addressing the state of the art during the first year of Chorus and establishing the existing landscape in multimedia search engines, we have identified and analyzed gaps within the European research effort during our second year. In this period we focused on three directions, namely technological issues, user-centred issues and use cases, and socio-economic and legal aspects. These were assessed through two central studies: firstly, a concerted vision of the functional breakdown of a generic multimedia search engine, and secondly, representative use-case descriptions with a related discussion of the requirements for technological challenges. Both studies were carried out in cooperation and consultation with the community at large through EC concertation meetings (multimedia search engines cluster), several meetings with our Think-Tank, presentations at international conferences, and surveys addressed to EU project coordinators as well as coordinators of national initiatives. Based on the feedback obtained, we identified two types of gaps, namely core technological gaps that involve research challenges, and "enablers", which are not necessarily technical research challenges but have an impact on innovation progress. New socio-economic trends are presented, as well as emerging legal challenges.

    Advanced solutions for quality-oriented multimedia broadcasting

    Multimedia content is increasingly being delivered via different types of networks to viewers in a variety of locations and contexts, using a variety of devices. The ubiquitous nature of multimedia services comes at a cost, however. The successful delivery of multimedia services requires overcoming numerous technological challenges, many of which directly affect the quality of the multimedia experience. For example, due to dynamically changing requirements and networking conditions, the delivery of multimedia content has traditionally adopted a best-effort approach. However, this approach has often negatively affected the quality of multimedia-based services as perceived by end-users. Yet the quality of multimedia content is a vital issue for the continued acceptance and proliferation of these services. Indeed, end-users are becoming increasingly quality-aware in their expectations of the multimedia experience and demand an ever-widening spectrum of rich multimedia-based services. As a consequence, there is a continuous and extensive research effort, by both industry and academia, to find solutions for improving the quality of multimedia content delivered to users; in addition, international standards bodies, such as the International Telecommunication Union (ITU), are renewing their efforts on the standardization of multimedia technologies. Research has attempted to find solutions in very different directions to improve the quality of rich media content delivered over various network types. It is in this context that this special issue of the IEEE Transactions on Broadcasting on broadcast multimedia quality illustrates some of these avenues and presents some of the most significant research results obtained by various teams of researchers from many countries. This special issue provides an example, albeit inevitably limited, of the richness and breadth of current research on multimedia broadcasting services. The research issues addressed in this special issue include, among others, factors that influence user-perceived quality, encoding-related quality assessment and control, transmission- and coverage-based solutions, and objective quality measurements.
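
    Among the objective quality measurements mentioned above, PSNR is the classic example; a minimal sketch, with toy data standing in for real video frames:

```python
import numpy as np

def psnr(reference, degraded, max_val=255.0):
    """Peak signal-to-noise ratio between two frames (8-bit by default)."""
    mse = np.mean((reference.astype(np.float64) - degraded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")          # identical frames
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy frames: a reference and a noisy version of it (illustrative data only).
ref = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
noisy = np.clip(ref + np.random.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(ref, noisy):.1f} dB")
```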