    Mining Historical Data towards Interference Management in Wireless SDNs

    WiFi interference is often reduced through planning, macroscopic self-organization (e.g. channel switching), or network management. In this paper, we explore the use of historical data to automatically predict traffic bottlenecks and make rapid decisions in a wireless (WiFi-like) network on a smaller scale. This is now possible with software-defined networks (SDN), whose controllers can have a global view of traffic flows in a network. Models such as classification trees can be used to quickly decide how to manage network resources based on the quality needs, service level agreements, or other criteria provided by a network administrator. The objective of this paper is to use data generated by simulation tools to see whether such classification models can be developed, and to evaluate their efficacy. For this purpose, extensive simulation data were collected, and data mining techniques were then used to develop QoS prediction trees. Such trees can predict the maximum delay that results from specific traffic situations with specific parameters. We evaluated these decision/classification trees by placing them in an SDN controller. Because OpenFlow cannot directly provide the information needed to manage wireless networks, we used the POX messenger component to set up an agent on each AP for adjusting the network. Finally, we explored the possibility of updating the tree using feedback that the controller receives from hosts. Our results show that such trees are effective and can be used to manage the network and decrease the maximum packet delay.
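As a rough, hypothetical illustration of the idea described above (the feature names, the labelling rule, and the simulated data below are invented for the sketch, not the authors' dataset), a classification tree can be trained offline on simulated traffic and then queried by a controller to predict whether a traffic situation will violate a delay target:

```python
# Hedged sketch: train a classification tree on simulated per-AP traffic
# features to predict whether maximum delay will exceed an SLA threshold.
# All feature names and the labelling rule are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500
load = rng.uniform(0, 50, n)        # offered load per AP (Mbps)
clients = rng.integers(1, 30, n)    # associated clients
retries = rng.uniform(0, 0.3, n)    # MAC retry rate
X = np.column_stack([load, clients, retries])
# Synthetic ground truth: delay breaches the SLA when the AP is congested.
y = (load * (1 + retries) + clients > 45).astype(int)

# A shallow tree keeps per-flow decisions fast and auditable in a controller.
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
print(f"training accuracy: {tree.score(X, y):.2f}")
```

In deployment, the controller would evaluate the tree per AP or per flow and trigger an action (e.g. a channel switch via the per-AP agent) whenever the "delay breach" class is predicted.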

    A Survey of Machine Learning Techniques for Video Quality Prediction from Quality of Delivery Metrics

    A growing number of video streaming networks are incorporating machine learning (ML) applications. The growth of video streaming services places enormous pressure on network and video content providers who need to proactively maintain high levels of video quality. ML has been applied to predict the quality of video streams. Quality of delivery (QoD) measurements, which capture the end-to-end performance of network services, have been leveraged in video quality prediction. The drive for end-to-end encryption, for privacy and digital rights management, has brought about a lack of visibility for operators who desire insights from video quality metrics. In response, numerous solutions have been proposed to tackle the challenge of video quality prediction from QoD-derived metrics. This survey provides a review of studies that focus on ML techniques for predicting video quality from QoD metrics in video streaming services. In the context of video quality measurements, we focus on QoD metrics, which are not tied to a particular type of video streaming service. Unlike previous reviews in the area, this contribution considers papers published between 2016 and 2021. Approaches for predicting video quality from QoD are grouped under the following headings: (1) video quality prediction under QoD impairments, (2) prediction of video quality from encrypted video streaming traffic, (3) predicting the video quality in HAS applications, (4) predicting the video quality in SDN applications, (5) predicting the video quality in wireless settings, and (6) predicting the video quality in WebRTC applications. Throughout the survey, some research challenges and directions in this area are discussed, including (1) machine learning over deep learning; (2) adaptive deep learning for improved video delivery; (3) computational cost and interpretability; (4) self-healing networks and failure recovery.
The survey findings reveal that traditional ML algorithms are the most widely adopted models for solving video quality prediction problems. These algorithms hold considerable potential because they are well understood, easy to deploy, and have lower computational requirements than deep learning techniques.
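As a toy illustration of that finding (the QoD feature values and quality labels below are invented, not survey data), even a from-scratch nearest-neighbour classifier can map QoD-style measurements to a video quality class:

```python
# Minimal 1-nearest-neighbour sketch: classify video quality from
# QoD-style metrics (throughput Mbps, RTT ms, loss %). The training
# points and labels are illustrative assumptions.
def nearest_neighbour(train, query):
    """Return the label of the training point closest to the query."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda item: sq_dist(item[0], query))[1]

train = [
    ((25.0, 20.0, 0.1), "good"),   # high throughput, low RTT and loss
    ((8.0, 80.0, 1.0), "fair"),
    ((1.5, 200.0, 5.0), "poor"),   # starved connection
]
print(nearest_neighbour(train, (20.0, 30.0, 0.2)))  # → good
```

A real model would normalize the features first, since RTT in milliseconds otherwise dominates the squared-distance computation.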

    A Cognitive Routing framework for Self-Organised Knowledge Defined Networks

    This study investigates the applicability of machine learning methods to routing protocols for achieving rapid convergence in self-organized knowledge-defined networks. The research explores the constituents of the Self-Organized Networking (SON) paradigm for 5G and beyond, aiming to design a routing protocol that complies with the SON requirements. Further, it also exploits a contemporary discipline called Knowledge-Defined Networking (KDN) to extend the routing capability by calculating the “Most Reliable” path rather than the shortest one. The research identifies the potential key areas and possible techniques to meet the objectives by surveying the state of the art of the relevant fields, such as QoS-aware routing, hybrid SDN architectures, intelligent routing models, and service migration techniques. The design phase focuses primarily on the mathematical modelling of the routing problem and approaches the solution by optimizing at the structural level. The work contributes the Stochastic Temporal Edge Normalization (STEN) technique, which fuses link and node utilization for cost calculation; MRoute, a hybrid routing algorithm for SDN that leverages STEN to provide constant-time convergence; and Most Reliable Route First (MRRF), which uses a Recurrent Neural Network (RNN) to approximate route reliability as its routing metric. Additionally, the research outcomes include a cross-platform SDN integration framework (SDN-SIM) and a secure migration technique for containerized services in a Multi-access Edge Computing environment using Distributed Ledger Technology. The research work now eyes the development of 6G standards and their compliance with Industry 5.0, enhancing the present outcomes in light of Deep Reinforcement Learning and Quantum Computing.
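The "most reliable rather than shortest" idea can be sketched independently of the thesis' actual algorithms: if every link has an estimated reliability in (0, 1] (in MRRF these estimates would come from the RNN; the values below are made up), the most reliable path maximizes the product of link reliabilities, which a negative-log transform reduces to an ordinary shortest-path problem:

```python
# Illustrative sketch: Dijkstra's algorithm over -log(link reliability)
# finds the path that maximizes the product of link reliabilities.
import heapq
import math

def most_reliable_path(graph, src, dst):
    """graph: {node: {neighbor: reliability}}; returns (reliability, path)."""
    dist = {src: 0.0}              # accumulated -log reliability
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, math.inf):
            continue               # stale queue entry
        for v, rel in graph[u].items():
            nd = d - math.log(rel)
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return math.exp(-dist[dst]), path[::-1]

# The direct hop A-D is shortest but far less reliable than A-C-D.
graph = {
    "A": {"D": 0.5, "C": 0.99},
    "C": {"D": 0.99},
    "D": {},
}
rel, path = most_reliable_path(graph, "A", "D")
print(path, round(rel, 4))  # → ['A', 'C', 'D'] 0.9801
```

The -log transform works because reliabilities in (0, 1] map to non-negative additive costs, so Dijkstra's non-negativity requirement is satisfied.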

    Internet of Underwater Things and Big Marine Data Analytics -- A Comprehensive Survey

    The Internet of Underwater Things (IoUT) is an emerging communication ecosystem developed for connecting underwater objects in maritime and underwater environments. The IoUT technology is intricately linked with intelligent boats and ships, smart shores and oceans, automatic marine transportation, positioning and navigation, underwater exploration, disaster prediction and prevention, as well as with intelligent monitoring and security. The IoUT has an influence at various scales, ranging from a small scientific observatory to a mid-sized harbor, and to global oceanic trade. The network architecture of IoUT is intrinsically heterogeneous and should be sufficiently resilient to operate in harsh environments. This creates major challenges in terms of underwater communications, whilst relying on limited energy resources. Additionally, the volume, velocity, and variety of data produced by sensors, hydrophones, and cameras in IoUT is enormous, giving rise to the concept of Big Marine Data (BMD), which has its own processing challenges. Hence, conventional data processing techniques will falter, and bespoke Machine Learning (ML) solutions have to be employed for automatically learning the specific BMD behavior and features, facilitating knowledge extraction and decision support. The motivation of this paper is to comprehensively survey the IoUT, BMD, and their synthesis. It also aims to explore the nexus of BMD with ML. We set out from underwater data collection and then discuss the family of IoUT data communication techniques with an emphasis on the state-of-the-art research challenges. We then review the suite of ML solutions suitable for BMD handling and analytics. We treat the subject deductively from an educational perspective, critically appraising the material surveyed. Comment: 54 pages, 11 figures, 19 tables; IEEE Communications Surveys & Tutorials, peer-reviewed academic journal.

    Modeling the Abnormality: Machine Learning-based Anomaly and Intrusion Detection in Software-defined Networks

    Modern software-defined networks (SDN) provide additional control and optimal functionality over large-scale computer networks. Due to the rise in networking applications, cyber attacks have also increased progressively. Modern cyber attacks wreak havoc on large-scale SDNs, many of which are part of critical national infrastructures. Artifacts of these attacks may present as network anomalies within the core network or edge anomalies in the SDN edge. As protection, intrusion and anomaly detection must be implemented in both the edge and the core. In this dissertation, we investigate and create novel network intrusion and anomaly detection techniques that can handle the next generation of network attacks. We collect and use new network metrics and statistics to perform network intrusion detection. We demonstrated that machine learning models like Random Forest classifiers effectively use network port statistics to differentiate between normal and attack traffic with up to 98% accuracy. These collected metrics are augmented to create a new open-sourced dataset that mitigates class imbalance. The developed dataset outperforms other contemporary datasets with an FÎŒ score of 94% and a minimum F score of 86%. We also propose SDN intrusion detection approaches that provide high confidence scores and explainability, offering additional insights while remaining implementable in a real-time environment. Through this, we observed that network byte and packet transmissions and their robust statistics can be significant indicators of the prevalence of an attack. Additionally, we propose an anomaly detection technique for time-series SDN edge devices. We observe that precision and recall scores are inversely correlated as Δ increases, and Δ = 6.0 yielded the best F score. Results also highlight that the best performance was achieved from data that had been moderately smoothed (0.4 ≀ α ≀ 0.8), compared to intensely smoothed or non-smoothed data. In addition, we investigated and analyzed the impact that adversarial attacks can have on machine learning-based network intrusion detection systems for SDN. Results show that the proposed attacks cause substantial deterioration of classifier performance in single SDNs, with some classifiers deteriorating by up to ≈60%. Finally, we proposed an adversarial attack detection framework for multi-controller SDN setups that uses inherent network architecture features to make decisions. Results indicate efficient detection performance achieved by the framework in determining and localizing the presence of adversarial attacks. However, the performance begins to deteriorate when more than 30% of the SDN controllers have become compromised. The work performed in this dissertation has provided multiple contributions to the network security research community, such as providing equitable open-sourced SDN datasets, promoting the usage of core network statistics for intrusion detection, proposing robust anomaly detection techniques for time-series data, and analyzing how adversarial attacks can compromise the machine learning algorithms that protect our SDNs. The results of this dissertation can catalyze future developments in network security.
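A minimal sketch of the kind of smoothed time-series detector described above (the baseline-update rule, the initial residual scale, and the toy byte-count series are our assumptions, not the dissertation's exact method): exponentially smooth the series with factor α and flag points whose deviation from the baseline exceeds Δ times a running residual scale:

```python
# Hedged sketch of EWMA-based anomaly detection for SDN edge telemetry.
# alpha controls smoothing (larger = less smoothing); delta is the
# detection threshold, echoing the α and Δ parameters discussed above.
def detect_anomalies(series, alpha=0.6, delta=6.0):
    s = series[0]                  # EWMA baseline
    resid_scale = 1.0              # running scale of normal residuals (assumed start)
    anomalies = []
    for i, x in enumerate(series[1:], start=1):
        r = abs(x - s)
        if r > delta * resid_scale:
            anomalies.append(i)    # flag, and keep the spike out of the baseline
        else:
            s = alpha * x + (1 - alpha) * s
            resid_scale = 0.9 * resid_scale + 0.1 * max(r, 1e-9)
    return anomalies

# Toy per-interval byte counts with one injected spike.
traffic = [100, 102, 99, 101, 100, 400, 101, 100, 98, 102]
print(detect_anomalies(traffic))  # → [5]
```

Skipping the baseline update on flagged points prevents a single spike from contaminating the smoothed estimate and triggering spurious alarms on the samples that follow it.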

    Machine Learning Meets Communication Networks: Current Trends and Future Challenges

    The growing network density and unprecedented increase in network traffic, caused by the massively expanding number of connected devices and online services, require intelligent network operations. Machine Learning (ML) has been applied in this regard in different types of networks and networking technologies to meet the requirements of future communicating devices and services. In this article, we provide a detailed account of current research on the application of ML in communication networks and shed light on future research challenges. Research on the application of ML in communication networks is described for: i) the three layers, i.e., physical, access, and network layers; and ii) novel computing and networking concepts such as Multi-access Edge Computing (MEC), Software Defined Networking (SDN), and Network Functions Virtualization (NFV), together with a brief overview of ML-based network security. Important future research challenges are identified and presented to help spur further research in key areas in this direction.

    JTIT

    Quarterly journal

    Review and analysis of networking challenges in cloud computing

    Cloud Computing offers virtualized computing, storage, and networking resources, over the Internet, to organizations and individual users in a completely dynamic way. These cloud resources are cheaper, easier to manage, and more elastic than sets of local, physical ones. This encourages customers to outsource their applications and services to the cloud. The migration of both data and applications outside the administrative domain of customers into a shared environment imposes transversal, functional problems across distinct platforms and technologies. This article provides a contemporary discussion of the most relevant functional problems associated with the current evolution of Cloud Computing, mainly from the network perspective. The paper also gives a concise description of Cloud Computing concepts and technologies. It starts with a brief history of cloud computing, tracing its roots. Then, architectural models of cloud services are described, and the most relevant products for Cloud Computing are briefly discussed along with a comprehensive literature review. The paper highlights and analyzes the most pertinent and practical network issues of relevance to the provision of high-assurance cloud services through the Internet, including security. Finally, trends and future research directions are also presented.