38 research outputs found

    Special issue on entropy-based applied cryptography and enhanced security for ubiquitous computing

    Full text link
    Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. In this special issue, we mainly selected and discussed papers on core theories based on graph theory for solving computational problems in cryptography and security, as well as practical technologies, applications, and services for Ubi-comp, including secure encryption techniques; identity and authentication; credential cloning attacks and countermeasures; a switching generator resistant to algebraic and side-channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions; information hiding and watermarking; secret sharing; message authentication; detection and modelling of cyber attacks with Petri nets; and quantum flows for secret key distribution.
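    To make the entropy-as-unpredictability idea above concrete, the following minimal Python sketch computes the empirical Shannon entropy of a byte string; the function name shannon_entropy_bits and the example inputs are illustrative and not taken from any paper in this issue.

    import math
    import os
    from collections import Counter

    def shannon_entropy_bits(data: bytes) -> float:
        """Empirical Shannon entropy of a byte string, in bits per byte."""
        counts = Counter(data)
        total = len(data)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # A key drawn uniformly at random approaches the maximum of 8 bits per byte,
    # while a repetitive, low-entropy "key" is far more predictable.
    print(shannon_entropy_bits(os.urandom(4096)))   # close to 8.0
    print(shannon_entropy_bits(b"password" * 512))  # well below 8.0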

    On the Optimality of Virtualized Security Function Placement in Multi-Tenant Data Centers

    Get PDF
    Security and service protection against cyber attacks remain among the primary challenges for virtualized, multi-tenant Data Centers (DCs), for reasons that vary from lack of resource isolation to the monolithic nature of legacy middleboxes. Although security is currently considered a property of the underlying infrastructure, diverse services require protection against different threats and at timescales on par with those of service deployment and elastic resource provisioning. We address the resource allocation problem of deploying customised security services over a virtualized, multi-tenant DC. We formulate the problem as an Integer Linear Program (ILP), an instance of the NP-hard variable-size, variable-cost bin packing problem, with the objective of maximising the residual resources after allocation. We propose a modified version of the Best Fit Decreasing (BFD) algorithm to solve the problem in polynomial time and show that BFD optimises the objective function up to 80% more than other algorithms.
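    A minimal sketch of the Best Fit Decreasing idea outlined above, under a simplified single-resource model (the paper's formulation uses variable-size, variable-cost bins and maximises residual resources after allocation); the Bin class and best_fit_decreasing function are illustrative names, not the authors' implementation.

    from dataclasses import dataclass, field

    @dataclass
    class Bin:
        capacity: float                      # e.g. residual CPU share on a server
        items: list = field(default_factory=list)

        @property
        def residual(self) -> float:
            return self.capacity - sum(self.items)

    def best_fit_decreasing(demands, bins):
        """Place each demand into the feasible bin that leaves the least slack."""
        for d in sorted(demands, reverse=True):                     # largest first
            feasible = [b for b in bins if b.residual >= d]
            if not feasible:
                raise ValueError(f"demand {d} cannot be placed")
            tightest = min(feasible, key=lambda b: b.residual - d)  # best fit
            tightest.items.append(d)
        return bins

    bins = best_fit_decreasing([0.5, 0.2, 0.7, 0.1], [Bin(1.0), Bin(1.0)])
    print([b.residual for b in bins])   # residual capacity left on each server

    Sorting the demands in decreasing order before applying the best-fit rule is what gives BFD its good practical packing quality compared with plain first-fit or best-fit.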

    Preface

    Get PDF

    A Generalized Renyi Joint Entropy Method for the Detection of DDoS Attacks in IoT

    Get PDF
    The Internet of Things (IoT) connects smart devices to the Internet and gathers more information than other systems. Since many different types of objects are connected, the privacy and security of users must be ensured. Because of its decentralised nature, IoT is prone to different types of attacks, both active and passive. Since the Internet is the backbone of IoT, the security issues present in the Internet also affect the Internet of Things. Distributed Denial of Service (DDoS) is a major and critical threat of this type: it reduces the performance of the whole network and can even break communication entirely. For this reason, much research has been carried out on detecting DDoS attacks. Entropy-based approaches to identifying DDoS attacks in the Internet of Things are discussed in this research. The new approach is based on the GRJE method, which stands for generalised Renyi joint entropy. Renyi joint entropy is used in the suggested approach to analyse network traffic flow. The suggested method is implemented and evaluated against other methods on a number of criteria. Results from an analysis of the suggested system's effectiveness in NS2 are reported in this study.
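    The sketch below is a generic Renyi-entropy traffic indicator in the spirit of the approach described above, not the paper's exact GRJE formulation: it computes the order-alpha Renyi entropy of the destination-IP distribution in a traffic window and flags the window when the entropy collapses. The threshold value and the sample window are illustrative placeholders.

    import math
    from collections import Counter

    def renyi_entropy(counts, alpha: float) -> float:
        """Renyi entropy of order alpha (alpha != 1) of an empirical distribution."""
        total = sum(counts)
        return math.log2(sum((c / total) ** alpha for c in counts)) / (1.0 - alpha)

    def window_is_suspicious(dst_ips, alpha=2.0, threshold=2.0) -> bool:
        """Flag a traffic window whose destination-IP entropy drops sharply.

        During a flooding attack the traffic concentrates on one victim, so the
        destination-IP distribution becomes highly skewed and its entropy falls.
        The threshold here is illustrative; in practice it would be calibrated
        from baseline traffic.
        """
        return renyi_entropy(Counter(dst_ips).values(), alpha) < threshold

    window = ["10.0.0.7"] * 950 + [f"10.0.1.{i}" for i in range(50)]
    print(window_is_suspicious(window))  # True: one destination dominates the window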

    Detection of Anomalies in Telecommunication Traffic Using Statistical Methods

    Get PDF
    Anomaly detection is an important task in many areas of human life, and many statistical methods are used to detect anomalies. In this paper, statistical methods of data analysis such as survival analysis, time-series (fractal) analysis, a classification method (decision trees), cluster analysis, and an entropy method were chosen to detect anomalies, and a description of the selected methods is given. To analyse anomalies, traffic and attack traces were taken from an open dataset; more than 3 million packets from the dataset were used to evaluate the described methods. The dataset contained legitimate traffic (75%) and attacks (25%). Simulation modelling of the selected statistical methods was performed on network traffic traces of telecommunication networks using different protocols; the simulations were implemented as programs written in the Python programming language. DDoS attacks, UDP flood, TCP SYN, ARP attacks, and HTTP flood were chosen as anomalies. A comparative analysis of the performance of these methods in detecting anomalies (attacks) was carried out on parameters such as the probability of anomaly detection, the probability of false-positive detection, and the running time of each method. Experimental results showed the performance of each method. The decision tree method is the best in terms of anomaly identification probability, number of false positives, and anomaly detection time. The entropy analysis method is slightly slower and gives slightly more false positives. Next is the cluster analysis method, which is somewhat worse at detecting anomalies. The fractal analysis method showed a lower probability of detecting anomalies, a higher probability of false positives, and a longer running time. The worst was the survival analysis method.
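    As a concrete illustration of the best-performing method from this comparison, the sketch below trains a decision-tree classifier on synthetic per-packet features (packet size, inter-arrival time, protocol id). The features, labels, and 75/25 split are placeholders mirroring the description above, not the open dataset used in the paper; scikit-learn is assumed to be available.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    n = 10_000

    # Synthetic legitimate traffic (75%) and attack traffic (25%): columns are
    # packet size, inter-arrival time, and protocol id.
    legit = np.column_stack([rng.normal(600, 200, int(0.75 * n)),
                             rng.exponential(0.05, int(0.75 * n)),
                             rng.integers(0, 3, int(0.75 * n))])
    attack = np.column_stack([rng.normal(120, 30, int(0.25 * n)),
                              rng.exponential(0.002, int(0.25 * n)),
                              rng.integers(0, 3, int(0.25 * n))])
    X = np.vstack([legit, attack])
    y = np.concatenate([np.zeros(len(legit)), np.ones(len(attack))])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_tr, y_tr)
    print(f"held-out detection accuracy: {clf.score(X_te, y_te):.3f}")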

    In-Network Placement of Security VNFs in Multi-Tenant Data Centers

    Get PDF
    Middleboxes are typically hardware-accelerated appliances such as firewalls, proxies, WAN optimizers, and NATs that play an important role in service provisioning over today’s Data Centers. We focus on the placement of virtualised security services in multi-tenant Data Centers. Customised security services are provided to tenants as software VNF modules co-located with switches in the network. Our placement formulation satisfies the allocation constraints while maintaining efficient management of the infrastructure resources. We propose a Constraint Programming (CP) formulation and a CPLEX implementation. We also formulate a heuristic-based algorithm to solve larger instances of the placement problem. Extensive evaluation of the algorithms has been conducted, demonstrating that the VNF approach provides more than 50% reduction in resource consumption compared to other heuristic algorithms.
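    A toy version of the placement problem described above, under a much simplified model: assign each security VNF to exactly one switch without exceeding switch capacity, while minimising total hosting cost. The paper uses a Constraint Programming formulation solved with CPLEX; this sketch instead uses the open-source PuLP package, and the VNF demands, switch capacities, and costs are made-up illustrative numbers.

    from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum, value

    vnfs = {"fw": 4, "ids": 6, "nat": 2}      # VNF -> resource demand (illustrative)
    switches = {"s1": 8, "s2": 8}             # switch -> capacity (illustrative)
    cost = {"s1": 1.0, "s2": 1.5}             # per-unit hosting cost (illustrative)

    prob = LpProblem("vnf_placement", LpMinimize)
    x = {(v, s): LpVariable(f"x_{v}_{s}", cat=LpBinary)
         for v in vnfs for s in switches}

    for v in vnfs:                                   # place each VNF exactly once
        prob += lpSum(x[v, s] for s in switches) == 1
    for s, cap in switches.items():                  # respect switch capacity
        prob += lpSum(vnfs[v] * x[v, s] for v in vnfs) <= cap
    prob += lpSum(cost[s] * vnfs[v] * x[v, s] for v in vnfs for s in switches)

    prob.solve()
    placement = {v: s for (v, s), var in x.items() if value(var) > 0.5}
    print(placement)   # e.g. {'fw': 's2', 'ids': 's1', 'nat': 's1'}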