385 research outputs found

    Intelligent Management and Efficient Operation of Big Data

    This chapter details how Big Data can be used and implemented in networking and computing infrastructures. Specifically, it addresses three main aspects: the timely extraction of relevant knowledge from heterogeneous, often unstructured, large data sources; the enhancement of the performance of the processing and networking (cloud) infrastructures that are the foundational pillars of Big Data applications and services; and novel ways to efficiently manage network infrastructures through high-level composed policies that support the transmission of large amounts of data with distinct requirements (video vs. non-video). A case study involving an intelligent management solution to route data traffic with diverse requirements in a wide-area Internet Exchange Point is presented, discussed in the context of Big Data, and evaluated. Comment: In book Handbook of Research on Trends and Future Directions in Big Data and Web Intelligence, IGI Global, 201
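
    The chapter's case study hinges on routing traffic classes with different requirements via high-level policies. As a loose, hypothetical illustration of that idea only (the policy fields, route names, and first-match semantics below are invented for this sketch, not taken from the chapter):

```python
# Hypothetical sketch: high-level policies mapping traffic classes to routes.
# All field names and route names are invented for illustration.
from dataclasses import dataclass

@dataclass
class Flow:
    dst_port: int
    is_video: bool       # e.g. derived from DPI or port heuristics
    volume_mbps: float

POLICIES = [
    # (predicate, route) pairs, evaluated in order; first match wins.
    (lambda f: f.is_video,          "low-latency-path"),
    (lambda f: f.volume_mbps > 100, "bulk-transfer-path"),
    (lambda f: True,                "best-effort-path"),   # catch-all
]

def route_for(flow: Flow) -> str:
    """Return the route of the first policy whose predicate matches."""
    for predicate, route in POLICIES:
        if predicate(flow):
            return route
    raise RuntimeError("unreachable: catch-all policy always matches")

print(route_for(Flow(dst_port=443, is_video=True, volume_mbps=8.0)))
# -> low-latency-path
```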

    Big Data for Traffic Monitoring and Management

    The last two decades witnessed tremendous advances in Information and Communications Technologies. Besides improvements in computational power and storage capacity, communication networks nowadays carry an amount of data that was not envisaged only a few years ago. Together with their pervasiveness, network complexity has increased at the same pace, leaving operators and researchers with few instruments to understand what happens in their networks and, on the global scale, on the Internet. Fortunately, recent advances in data science and machine learning come to the rescue of network analysts and allow analyses with a level of complexity and spatial/temporal scope not possible only 10 years ago.

    In my thesis, I take the perspective of an Internet Service Provider (ISP) and illustrate the challenges and possibilities of analyzing the traffic coming from modern operational networks. I make use of big data and machine learning algorithms and apply them to datasets coming from passive measurements of ISP and university campus networks. The marriage between data science and network measurements is complicated by the complexity of machine learning algorithms and by the intrinsic multi-dimensionality and variability of this kind of data. As such, my work proposes and evaluates novel techniques, inspired by popular machine learning approaches but carefully tailored to operate on network traffic.

    In this thesis, I first provide a thorough characterization of Internet traffic from 2013 to 2018. I show the most important trends in the composition of traffic and users' habits across those five years, and describe how the network infrastructure of the Internet's big players changed in order to support faster and larger traffic. Then, I show the challenges in classifying network traffic, with particular attention to encryption and to the convergence of the Internet around a few big players. To overcome the limitations of classical approaches, I propose novel algorithms for traffic classification and management leveraging machine learning techniques and, in particular, big data approaches. Exploiting temporal correlation among network events, and benefiting from large datasets of operational traffic, my algorithms learn common traffic patterns of web services and use them for (i) traffic classification and (ii) fine-grained traffic management. My proposals are always validated in experimental environments and then deployed in real operational networks, from which I report the most interesting findings I obtain.

    I also focus on the Quality of Experience (QoE) of web users, as their satisfaction represents the final objective of computer networks. Again, I show that, using big data approaches, the network can achieve visibility into the quality of users' web browsing. In general, the algorithms I propose help ISPs gain a detailed view of the traffic that flows in their network, allowing fine-grained traffic classification and management, and real-time monitoring of users' QoE.
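
    The thesis classifies traffic with machine learning over flow-level features. A minimal sketch of that general pipeline with scikit-learn is shown below; the feature names, labels, and data are synthetic stand-ins, not the thesis's actual features or algorithms (with random labels, accuracy stays near chance; only the pipeline shape matters here).

```python
# Minimal sketch of supervised flow classification with scikit-learn.
# Feature names and data are synthetic, purely for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic flows: [duration_s, bytes_up, bytes_down, mean_pkt_size]
X = rng.random((1000, 4)) * [60, 1e6, 1e7, 1500]
y = rng.integers(0, 3, size=1000)   # three fake service labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```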

    Intrusion Detection and Countermeasure of Virtual Cloud Systems - State of the Art and Current Challenges

    Clouds are distributed, Internet-based platforms that provide highly resilient and scalable environments for enterprises to use in a multitude of ways. Cloud computing offers enterprises technology innovation that business leaders and IT infrastructure managers can choose to apply based on how, and to what extent, it helps them fulfil their business requirements. It is crucial that all technical consultants have a rigorous understanding of the ramifications of cloud computing, as its influence is likely to spread across the complete IT landscape. Security is one of the major concerns of practical interest to decision makers when they are making critical strategic and operational decisions. Distributed Denial of Service (DDoS) attacks have become more frequent and effective over the past few years, since the widely publicised DDoS attacks on the financial services industry that came to light in September and October 2012 and resurfaced in the past two years. In this paper, we introduce advanced cloud security technologies and practices as a series of concepts and technology architectures, from an industry-centric point of view. This is followed by a classification of intrusion detection and prevention mechanisms that can be part of an overall strategy to help understand, identify, and mitigate potential DDoS attacks on business networks. The paper establishes solid coverage of security issues related to DDoS and virtualisation, with a focus on structure, clarity, and well-defined blocks for mainstream cloud computing security solutions and platforms. In doing so, we aim to provide industry technologists, who may not necessarily be cloud or security experts, with an effective tool to help them understand the security implications associated with cloud adoption in their transition towards more knowledge-based systems.
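
    To make the intrusion detection idea concrete, here is a toy anomaly-based detector of the kind such classifications cover: it flags a source whose request rate in a sliding window exceeds a threshold. The window size, threshold, and IP are arbitrary choices for this sketch, not values from the paper.

```python
# Toy anomaly-based flood detector: flag a source when its request rate
# inside a sliding time window exceeds a threshold. Window and threshold
# values are arbitrary, chosen only for illustration.
import time
from collections import defaultdict, deque

WINDOW_S = 10     # sliding window length, seconds
THRESHOLD = 100   # requests allowed per window per source

history = defaultdict(deque)   # src_ip -> timestamps of recent requests

def observe(src_ip, now=None):
    """Record a request; return True if src_ip looks like a flood source."""
    now = time.time() if now is None else now
    q = history[src_ip]
    q.append(now)
    # Drop timestamps that fell out of the window.
    while q and q[0] < now - WINDOW_S:
        q.popleft()
    return len(q) > THRESHOLD

# Simulate a burst: 200 requests from one source within ~2 seconds.
flagged = any(observe("203.0.113.7", now=i * 0.01) for i in range(200))
print("flagged:", flagged)   # True
```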

    Five Years at the Edge: Watching Internet from the ISP Network

    The Internet and the way people use it are constantly changing. Knowing the traffic is crucial for operating the network, understanding users' needs, and ultimately improving applications. Here, we provide an in-depth longitudinal view of Internet traffic over the last five years (from 2013 to 2017). We take the point of view of a nationwide ISP and analyze rich flow-level measurements to pinpoint and quantify trends. We evaluate the provider's costs in terms of traffic consumption by users and services. We show that an ordinary broadband subscriber nowadays downloads more than twice as much as they did five years ago. Bandwidth-hungry video services drive this change, while social messaging applications boom (and vanish) at an incredible pace. We study how protocols and service infrastructures evolve over time, highlighting unpredictable events that may hamper traffic management policies. In the rush to bring servers closer and closer to users, we witness the birth of the sub-millisecond Internet, with caches located directly at ISP edges. The picture we take shows a lively Internet that constantly evolves and suddenly changes.
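
    Per-subscriber consumption trends of this kind come from aggregating flow-level records. A rough sketch of that aggregation with pandas follows; the column names and sample values are invented for illustration, not the study's dataset.

```python
# Sketch: mean per-subscriber download volume per year from flow records.
# Column names and sample data are invented for illustration.
import pandas as pd

flows = pd.DataFrame({
    "subscriber": ["a", "a", "b", "b", "b"],
    "year":       [2013, 2017, 2013, 2017, 2017],
    "bytes_down": [2e9, 5e9, 1e9, 2.5e9, 0.5e9],
})

# Total bytes per subscriber per year, then the mean across subscribers.
per_year = (flows.groupby(["year", "subscriber"])["bytes_down"].sum()
                 .groupby("year").mean())
print(per_year)
print("growth factor:", per_year[2017] / per_year[2013])
```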

    Strengthening Privacy and Cybersecurity through Anonymization and Big Data

    The abstract is in the attachment.

    A roadmap towards improving managed security services from a privacy perspective

    Published version of an article in the journal Ethics and Information Technology. Also available from the publisher at: http://dx.doi.org/10.1007/s10676-014-9348-3

    This paper proposes a roadmap for how privacy leakages from outsourced managed security services using intrusion detection systems can be controlled. The paper first analyses the risk of leaking private or confidential information from signature-based intrusion detection systems. It then discusses how the situation can be improved by developing adequate privacy enforcement methods and privacy leakage metrics in order to control and reduce the leakage of private and confidential information over time. Such metrics should allow for quantifying how much information is leaking, where these information leakages are, and what these leakages mean. This includes adding enforcement mechanisms ensuring that operation on sensitive information is transparent and auditable. The data controller or external quality assurance organisations can then verify or certify that the security operation runs in a privacy-friendly manner. The roadmap furthermore outlines how privacy-enhanced intrusion detection systems should be implemented, by initially providing privacy-enhanced alarm handling and then gradually extending support for privacy-enhancing operation to other areas such as digital forensics, exchange of threat information, and big-data-analytics-based attack detection.
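
    The paper calls for metrics that quantify how much private information leaks from alarm data. One conceivable starting point, not the paper's own metric, is the Shannon entropy of a sensitive field across exported alarms: verbatim values carry many bits, anonymised buckets far fewer. The field values below are invented for the sketch.

```python
# Illustrative leakage measure: Shannon entropy (bits) of a sensitive
# field across exported alarms. Not the paper's metric; just one
# conceivable way to quantify "how much information is leaking".
import math
from collections import Counter

def field_entropy(values):
    """Shannon entropy, in bits, of the empirical value distribution."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Alarms exposing internal source IPs verbatim leak more bits than
# alarms that bucket the same IPs into a few anonymised groups.
raw = ["10.0.0.%d" % i for i in range(16)]
anonymised = ["subnet-A"] * 8 + ["subnet-B"] * 8
print(field_entropy(raw))         # 4.0 bits
print(field_entropy(anonymised))  # 1.0 bit
```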