
    FedEdge AI-TC: A Semi-supervised Traffic Classification Method based on Trusted Federated Deep Learning for Mobile Edge Computing

    As a typical entity of MEC (Mobile Edge Computing), the 5G CPE (Customer Premise Equipment)/HGU (Home Gateway Unit) has proven to be a promising alternative to the traditional Smart Home Gateway. Network Traffic Classification (TC) is a vital method for service quality assurance and security management in communication networks, and it has become a crucial functional entity in 5G CPE/HGU. In recent years, many researchers have applied Machine Learning or Deep Learning (DL) to TC, namely AI-TC, to improve its performance. However, AI-TC faces challenges, including data dependency, resource-intensive traffic labeling, and user privacy concerns. The limited computing resources of 5G CPE further complicate efficient classification. Moreover, the "black box" nature of AI-TC models raises transparency and credibility issues. The paper proposes the FedEdge AI-TC framework, leveraging Federated Learning (FL) for reliable Network TC in 5G CPE. FL preserves privacy through local training, iterative exchange of model parameters, and centralized aggregation. A semi-supervised TC algorithm based on a Variational Auto-Encoder (VAE) and a convolutional neural network (CNN) reduces data dependency while maintaining accuracy. To enable lightweight model deployment, the paper introduces XAI-Pruning, a model compression method guided by DL model interpretability. Experimental evaluation demonstrates that FedEdge AI-TC outperforms benchmark methods in classification accuracy and efficiency. The framework enhances user privacy and model credibility, offering a comprehensive solution for dependable and transparent Network TC in 5G CPE, thus improving service quality and security.
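    Below is a minimal sketch of the semi-supervised VAE+CNN idea this abstract describes: pretrain a variational auto-encoder on unlabeled traffic, then train a small CNN classifier on the learned latent features using the scarce labeled flows. This is an illustration under assumed details, not the paper's implementation: the input shape (flows rendered as 784-byte vectors scaled to [0, 1]), all layer sizes, and the names FlowVAE and LatentCNN are invented here.

```python
# Hedged sketch of the semi-supervised VAE+CNN approach (assumed details,
# not the FedEdge AI-TC architecture). Requires PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FlowVAE(nn.Module):
    """VAE pretrained on unlabeled flow-byte vectors scaled to [0, 1]."""
    def __init__(self, in_dim=784, latent_dim=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, latent_dim)
        self.logvar = nn.Linear(256, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                 nn.Linear(256, in_dim), nn.Sigmoid())

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: sample z = mu + sigma * eps.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.dec(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Standard VAE objective: reconstruction term + KL divergence.
    bce = F.binary_cross_entropy(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld

class LatentCNN(nn.Module):
    """1-D CNN classifier over the VAE latent vector (assumed design)."""
    def __init__(self, latent_dim=32, n_classes=10):
        super().__init__()
        self.conv = nn.Sequential(nn.Conv1d(1, 16, 3, padding=1), nn.ReLU(),
                                  nn.MaxPool1d(2))
        self.fc = nn.Linear(16 * (latent_dim // 2), n_classes)

    def forward(self, z):
        h = self.conv(z.unsqueeze(1))   # (B, latent) -> (B, 16, latent/2)
        return self.fc(h.flatten(1))    # class logits

# Usage idea: pretrain FlowVAE on unlabeled flows with vae_loss, then train
# LatentCNN with cross-entropy on vae.mu(vae.enc(x)) for the labeled subset.
```

    The semi-supervised benefit comes from the split: the VAE learns a traffic representation from plentiful unlabeled flows, so the classifier needs far fewer labels, which matches the abstract's goal of reducing data dependency.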

    Deep Learning for Network Traffic Monitoring and Analysis (NTMA): A Survey

    Modern communication systems and networks, e.g., the Internet of Things (IoT) and cellular networks, generate a massive, heterogeneous amount of traffic data. In such networks, traditional network management techniques for monitoring and data analytics face challenges such as accuracy and the effective processing of big data in real time. Moreover, the pattern of network traffic, especially in cellular networks, exhibits very complex behavior owing to factors such as device mobility and network heterogeneity. Deep learning has been employed effectively in big data systems to facilitate analytics and knowledge discovery by recognizing hidden and complex patterns. Motivated by these successes, researchers in the field of networking have applied deep learning models to Network Traffic Monitoring and Analysis (NTMA) applications, e.g., traffic classification and prediction. This paper provides a comprehensive review of applications of deep learning in NTMA. We first provide the fundamental background relevant to the review. Then we examine the confluence of deep learning and NTMA and review deep learning techniques proposed for NTMA applications. Finally, we discuss key challenges, open issues, and future research directions for using deep learning in NTMA applications.

    IoT-based Secure Data Transmission Prediction using Deep Learning Model in Cloud Computing

    The security of Internet of Things (IoT) networks has become highly significant owing to the growing number of IoT devices and the rise in data transfer across cloud networks. Here, we propose a Generative Adversarial Network (GAN)-based method for predicting secure data transmission in IoT-based systems using cloud computing. We evaluated our model on the UNSW-NB15 dataset and compared it with other machine-learning (ML) methods, including decision trees (DT), random forests (RF), and support vector machines (SVM). The results show that the proposed GAN model outperformed these baselines in terms of precision, recall, F1 score, and area under the receiver operating characteristic curve (AUC-ROC). On the test set, the GAN model achieved an accuracy of 98.07%, with a precision of 98.45%, a recall of 98.19%, an F1 score of 98.32%, and an AUC-ROC of 0.998. These results demonstrate the effectiveness of the proposed GAN model in predicting secure data transmission in cloud-based IoT systems, a crucial step toward guaranteeing the confidentiality of IoT networks.
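    As a short sketch of how the metrics reported in this abstract (accuracy, precision, recall, F1, AUC-ROC) can be computed on a binary train/test split with scikit-learn: the classifier below is a stand-in logistic regression on synthetic placeholder data, not the paper's GAN, and the feature and label construction is purely illustrative.

```python
# Hedged sketch: computing accuracy/precision/recall/F1/AUC-ROC for a binary
# intrusion-detection-style task. Stand-in model and synthetic data, not the
# paper's GAN or the real UNSW-NB15 features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 20))                                      # placeholder features
y = (X[:, :3].sum(axis=1) + rng.normal(size=2000) > 0).astype(int)   # placeholder labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

y_pred = clf.predict(X_te)                 # hard labels for accuracy/P/R/F1
y_score = clf.predict_proba(X_te)[:, 1]    # probabilities for AUC-ROC

print(f"accuracy : {accuracy_score(y_te, y_pred):.4f}")
print(f"precision: {precision_score(y_te, y_pred):.4f}")
print(f"recall   : {recall_score(y_te, y_pred):.4f}")
print(f"F1       : {f1_score(y_te, y_pred):.4f}")
print(f"AUC-ROC  : {roc_auc_score(y_te, y_score):.4f}")
```

    Note that AUC-ROC is computed from the model's scores rather than its hard predictions, which is why the abstract reports it on a 0-1 scale (0.998) alongside the percentage-based metrics.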