
    A Survey of Methods for Encrypted Traffic Classification and Analysis

    With the widespread use of encrypted data transport, network traffic encryption is becoming the standard nowadays. This presents a challenge for traffic measurement, especially for analysis and anomaly detection methods that depend on the type of network traffic. In this paper, we survey existing approaches for the classification and analysis of encrypted traffic. First, we describe the most widespread encryption protocols used throughout the Internet. We show that the initiation of an encrypted connection and the protocol structure give away a lot of information for encrypted traffic classification and analysis. Then, we survey payload- and feature-based classification methods for encrypted traffic and categorize them using an established taxonomy. The advantage of some of the described classification methods is the ability to recognize the encrypted application protocol in addition to the encryption protocol. Finally, we make a comprehensive comparison of the surveyed feature-based classification methods and present their weaknesses and strengths.
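
    The observation that the unencrypted connection initiation leaks classification-relevant information can be illustrated concretely: a TLS ClientHello is sent in the clear and typically carries the target hostname in the Server Name Indication (SNI) extension. Below is a minimal sketch of an SNI extractor in Python; the byte offsets follow the TLS record and handshake layout, but the function name and its simplifying assumptions (a single unfragmented record, no validation of other extensions) are ours, not the survey's.

```python
import struct

def extract_sni(client_hello):
    """Pull the server name out of a raw TLS ClientHello record.

    Minimal sketch: assumes one unfragmented record and skips all
    bounds checking a production parser would need.
    """
    # TLS record header: content type (0x16 = handshake), version, length
    if len(client_hello) < 5 or client_hello[0] != 0x16:
        return None
    pos = 5
    # Handshake header: type (0x01 = ClientHello) + 3-byte length
    if client_hello[pos] != 0x01:
        return None
    pos += 4
    pos += 2 + 32                            # client version + random
    pos += 1 + client_hello[pos]             # session ID (length-prefixed)
    cs_len = struct.unpack_from("!H", client_hello, pos)[0]
    pos += 2 + cs_len                        # cipher suites
    pos += 1 + client_hello[pos]             # compression methods
    ext_total = struct.unpack_from("!H", client_hello, pos)[0]
    pos += 2
    end = pos + ext_total
    while pos + 4 <= end:
        ext_type, ext_len = struct.unpack_from("!HH", client_hello, pos)
        pos += 4
        if ext_type == 0x0000:               # server_name extension
            # list length (2) + name type (1) + name length (2) + name
            name_len = struct.unpack_from("!H", client_hello, pos + 3)[0]
            return client_hello[pos + 5 : pos + 5 + name_len].decode()
        pos += ext_len
    return None
```

    A classifier can use the extracted name directly as a label source, which is one reason the handshake is such a rich feature source despite the payload being encrypted.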

    Profiling and Identification of Web Applications in Computer Network

    Characterising network traffic is a critical step for detecting network intrusion or misuse. The traditional way to identify the application associated with a set of traffic flows relies on IANA-assigned port numbers and Deep Packet Inspection (DPI), but it is undermined by the use of dynamic ports and encryption. The research community has proposed models for traffic classification that determined the most important requirements and recommendations for a successful approach. The suggested alternatives can be categorised into four techniques: port-based, packet payload based, host behavioural, and statistical-based. An increasing number of Internet applications nowadays frequently use dynamic port assignments and encrypted data traffic, rendering port-based and payload-based techniques ineffective for real-time traffic identification. In recent years, two other techniques have been introduced, focusing on host behaviour and statistical methods, to avoid these limitations. The former is based on the idea that hosts generate different communication patterns at the transport layer; by extracting these behavioural patterns, activities and applications can be classified. However, it cannot identify specific application names, classifying both Yahoo and Gmail simply as email. Studies have therefore focused on statistical-feature approaches that identify the traffic associated with applications using machine learning algorithms. This method relies on characteristics of IP flows, minimising the overhead limitations associated with the other schemes. The classification accuracy of statistical flow-based approaches, however, depends on the discriminative power of the traffic features used. NetFlow represents the de-facto standard for monitoring and analysing network traffic, but the information it provides is not enough to describe application behaviour.
The primary challenge is to fully describe the activity within and among network flows in order to understand application usage and user behaviour. This thesis proposes novel features that precisely describe a web application's behaviour in order to segregate various user activities. Extracting the most discriminative features that characterise web applications is key to gaining higher accuracy without being biased by either users or network circumstances. This work investigates novel features that characterise the behaviour of an application based on the timing of packet and flow arrivals. As part of describing the application behaviour, the research considered the on/off data transfer pattern, a defining characteristic of many typical applications, and the amount of data transferred or exchanged. Furthermore, the research considered the timing and patterns of user events as part of a network application session. Using an extended set of traffic features extracted from traffic captures, a supervised machine learning classifier was developed. To this effect, the present work customised the popular tcptrace utility to generate classification features based on traffic burstiness and periods of inactivity for everyday Internet usage. A C5.0 decision tree classifier was applied using the proposed features to eleven different Internet applications generated by ten users. Overall, the newly proposed features achieved a significant level of accuracy (~98%) in classifying the respective applications. Afterwards, uncontrolled data collected from a real environment for a group of 20 users accessing different applications was used to evaluate the proposed features. The evaluation tests indicated that the method has an accuracy of 87% in identifying the correct network application.
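
    The timing-derived features described above (inter-arrival times, burstiness, idle periods) can be sketched in a few lines. The helper below is a hypothetical illustration, not the thesis's actual tcptrace customisation: it derives simple burst/idle statistics from one flow's packet timestamps, and the `idle_threshold` parameter and feature names are assumptions of ours.

```python
from statistics import mean, pstdev

def flow_timing_features(timestamps, idle_threshold=1.0):
    """Derive burst/idle features from one flow's packet arrival times.

    A gap longer than `idle_threshold` seconds ends the current burst
    and counts as an idle period (the on/off transfer pattern).
    """
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    bursts, idle_periods, current = [], [], 1
    for g in gaps:
        if g > idle_threshold:
            idle_periods.append(g)   # flow went quiet: close the burst
            bursts.append(current)
            current = 1
        else:
            current += 1             # still inside the same burst
    bursts.append(current)
    return {
        "duration": timestamps[-1] - timestamps[0],
        "mean_iat": mean(gaps) if gaps else 0.0,   # inter-arrival time
        "std_iat": pstdev(gaps) if gaps else 0.0,
        "burst_count": len(bursts),
        "mean_burst_pkts": mean(bursts),
        "idle_time": sum(idle_periods),
    }
```

    Feature vectors like this, computed per flow, are what a decision-tree classifier would then be trained on.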

    Resilient power grid for smart city

    The modern power grid has a fundamental role in the operation of smart cities. However, high-impact, low-probability extreme events bring severe challenges to the security of the urban power grid. With an increasing focus on these threats, the resilience of the urban power grid has become a priority topic for the modern smart city. A resilient power grid can resist, adapt to, and recover in a timely manner from disruptions. It has four characteristics, namely anticipation, absorption, adaptation, and recovery. This paper aims to systematically investigate the development of the resilient power grid for the smart city. Firstly, this paper reviews the categories of high-impact, low-probability extreme events that influence the power grid, which can be divided into extreme weather and natural disasters, human-made malicious attacks, and social crises. Then, resilience evaluation frameworks and quantification metrics are discussed. In addition, the paper summarizes various existing resilience enhancement strategies, based on microgrids, active distribution networks, integrated and multi-energy systems, distributed energy resources and flexible resources, and cyber-physical systems, as well as resilience enhancement methods including probabilistic forecasting and analysis, artificial-intelligence-driven methods, and other cutting-edge technologies. Finally, this paper presents some possible further directions and developments for urban power grid resilience research, focusing on the power-electronized urban distribution network, flexible distributed resource aggregation, cyber-physical-social systems, multi-energy systems, intelligent electrical transportation, and artificial intelligence and Big Data technology.
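
    Among the quantification metrics such surveys discuss, a common one in the resilience literature is the area between the nominal and the observed performance curves over the disruption-and-recovery window (often called the resilience trapezoid). The sketch below is an illustrative implementation under that assumption, not a metric taken from the paper itself; the function name and the normalised performance scale are ours.

```python
def resilience_loss(times, performance, nominal=1.0):
    """Area between the nominal and the observed performance curves,
    integrated with the trapezoidal rule.

    `performance` is the grid's delivered performance at each time
    point (1.0 = fully nominal). A larger area means a deeper or
    longer outage, i.e. worse resilience.
    """
    loss = 0.0
    for i in range(1, len(times)):
        gap0 = nominal - performance[i - 1]   # shortfall at segment start
        gap1 = nominal - performance[i]       # shortfall at segment end
        loss += (gap0 + gap1) / 2.0 * (times[i] - times[i - 1])
    return loss
```

    Enhancement strategies such as microgrid islanding or distributed resource dispatch can then be compared by how much they shrink this area for the same simulated event.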

    The Strategic Supply Chain Management in the Digital Era, Tactical vs Strategic

    The perspective of procurement and supply chain management is changing dramatically: traditionally seen as a support function, procurement is now receiving increased attention and investment as an essential contributor to strategic success and as a business enabler. While an end-to-end digital supply chain is an opportunity, as it unleashes the next level of strategic growth and involves minimal investment in infrastructure, it remains a challenge to optimize and transform. Furthermore, recent pandemic and geopolitical disruptions, namely Covid-19, the Russia-Ukraine war, Brexit, and the US-China trade war, have structurally changed the global economy and revealed a new risk landscape that will result in the re-introduction of buffers and boundaries across industries and a partial return to regionalization, with a degree of de-globalization in which existing just-in-time strategies are being replaced by just-in-case strategies.

    Selected Papers from the 5th International Electronic Conference on Sensors and Applications

    This Special Issue comprises selected papers from the proceedings of the 5th International Electronic Conference on Sensors and Applications, held on 15–30 November 2018 on sciforum.net, an online platform for hosting scholarly e-conferences and discussion groups. In this 5th edition of the electronic conference, contributors were invited to provide papers and presentations from the field of sensors and applications at large, resulting in a wide variety of excellent submissions and topic areas. Papers that attracted the most interest on the web, or that provided a particularly innovative contribution, were selected for publication in this collection. These peer-reviewed papers are published with the aim of rapid and wide dissemination of research results, developments, and applications. We hope this conference series will grow rapidly in the future and become recognized as a new venue by which to (electronically) present new developments related to the field of sensors and their applications.

    Square dancing: official magazine of the Sets in Order American Square Dance Society.

    Published monthly for and by Square Dancers and for the general enjoyment of all.

    Warez

    When most people think of piracy, they think of BitTorrent and The Pirate Bay. These public manifestations of piracy, though, conceal an elite worldwide, underground, organized network of pirate groups who specialize in obtaining media (music, videos, games, and software) before its official sale date and then racing against one another to release the material for free. Warez: The Infrastructure and Aesthetics of Piracy is the first scholarly research book about this underground subculture, which began life on pre-internet-era Bulletin Board Systems and moved to internet File Transfer Protocol servers ("topsites") in the mid- to late 1990s. The "Scene," as it is known, is highly illegal in almost every aspect of its operations. The term "Warez" itself refers to pirated media and is a derivative of "software." Taking a deep dive into the documentary evidence produced by the Scene itself, Warez describes the operations and infrastructures of an underground culture with its own norms and rules of participation, its own forms of sociality, and its own artistic forms. Even though forms of digital piracy are often framed within ideological terms of equal access to knowledge and culture, Eve uncovers in the Warez Scene a culture of competitive ranking and one-upmanship that is at odds with the often communalist interpretations of piracy. Broad in scope and novel in its approach, Warez is indispensable reading for anyone interested in recent developments in digital culture, access to knowledge and culture, and the infrastructures that support our digital age.