
    DataNet: Deep Learning Based Encrypted Network Traffic Classification in SDN Home Gateway

    A smart home network will support various smart devices and applications, e.g., home automation devices, e-health devices, regular computing devices, and so on. Most devices in a smart home access the Internet through a home gateway (HGW). In this paper, we propose a software-defined network (SDN)-HGW framework to better manage distributed smart home networks and support the SDN controller of the core network. The SDN controller enables efficient network quality-of-service management based on real-time traffic monitoring and resource allocation of the core network. However, it cannot provide network management in distributed smart homes. Our proposed SDN-HGW extends this control to the access network, i.e., the smart home network, for better end-to-end network management. Specifically, the proposed SDN-HGW can achieve distributed application awareness by classifying data traffic in a smart home network. Most existing traffic classification solutions, e.g., deep packet inspection, cannot provide real-time application awareness for encrypted data traffic. To tackle this issue, we develop encrypted data classifiers (denoted as DataNets) based on three deep learning schemes, i.e., multilayer perceptron, stacked autoencoder, and convolutional neural networks, using an open data set that has over 200,000 encrypted data samples from 15 applications. A data preprocessing scheme is proposed to process the raw data packets and the tested data set so that the DataNets can be created. The experimental results show that the developed DataNets can be applied to enable distributed application-aware SDN-HGW in future smart home networks.
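
    The abstract names three DataNet variants (multilayer perceptron, stacked autoencoder, and convolutional neural network) without giving their architectures. The sketch below is a minimal, hypothetical MLP packet classifier of the kind described; the choice of PyTorch, the fixed 1480-byte packet representation, and the layer widths are illustrative assumptions rather than the paper's actual DataNet.

```python
# Minimal sketch (not the paper's exact DataNet): an MLP mapping a fixed-length
# packet byte vector to one of 15 application classes. Input length and layer
# widths are illustrative assumptions.
import torch
import torch.nn as nn

NUM_CLASSES = 15           # applications in the open data set
PACKET_LEN = 1480          # assumed fixed-length packet representation

class DataNetMLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(PACKET_LEN, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, NUM_CLASSES),      # logits over the 15 applications
        )

    def forward(self, x):                     # x: (batch, PACKET_LEN), bytes scaled to [0, 1]
        return self.net(x)

model = DataNetMLP()
packets = torch.rand(32, PACKET_LEN)          # placeholder for preprocessed packets
predicted_app = model(packets).argmax(dim=1)  # per-packet application label
```

    A stacked-autoencoder or CNN variant would swap out the fully connected stack while keeping the same input/output interface.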

    Cognition-Based Networks: A New Perspective on Network Optimization Using Learning and Distributed Intelligence

    IEEE Access, vol. 3, 2015, article no. 7217798, pp. 1512-1530 (open access). M. Zorzi, A. Zanella, A. Testolin, M. De Filippo De Grazia, and M. Zorzi; Department of Information Engineering and Department of General Psychology, University of Padua, Padua, Italy; IRCCS San Camillo Foundation, Venice-Lido, Italy.
    In response to the new challenges in the design and operation of communication networks, and taking inspiration from how living beings deal with complexity and scalability, in this paper we introduce an innovative system concept called COgnition-BAsed NETworkS (COBANETS). The proposed approach develops around the systematic application of advanced machine learning techniques and, in particular, unsupervised deep learning and probabilistic generative models for system-wide learning, modeling, optimization, and data representation. Moreover, in COBANETS, we propose to combine this learning architecture with emerging network virtualization paradigms, which make it possible to actuate automatic optimization and reconfiguration strategies at the system level, thus fully unleashing the potential of the learning approach. Compared with past and current research efforts in this area, the technical approach outlined in this paper is deeply interdisciplinary and more comprehensive, calling for the synergistic combination of the expertise of computer scientists, communications and networking engineers, and cognitive scientists, with the ultimate aim of breaking new ground through a profound rethinking of how the modern understanding of cognition can be used in the management and optimization of telecommunication networks.
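
    As a rough illustration of the unsupervised representation learning that the COBANETS concept builds on, the sketch below defines an autoencoder that compresses per-node network indicators into a compact latent state and measures a reconstruction loss; the feature count, layer sizes, and choice of PyTorch are assumptions made only for illustration and are not details from the paper.

```python
# Minimal sketch, assuming autoencoder-style unsupervised learning of a compact
# network-state representation; all dimensions are illustrative.
import torch
import torch.nn as nn

N_FEATURES = 20                                # assumed number of per-node indicators

class StateAutoencoder(nn.Module):
    def __init__(self, latent_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(N_FEATURES, 16), nn.ReLU(),
                                     nn.Linear(16, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 16), nn.ReLU(),
                                     nn.Linear(16, N_FEATURES))

    def forward(self, x):
        z = self.encoder(x)                    # compact representation of network state
        return self.decoder(z), z

model = StateAutoencoder()
kpis = torch.rand(64, N_FEATURES)              # placeholder network measurements
reconstruction, latent = model(kpis)
loss = nn.functional.mse_loss(reconstruction, kpis)   # unsupervised training objective
```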

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex and compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in the compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to help readers clarify the motivation and methodology of the various ML algorithms so that they can be invoked for hitherto unexplored services and scenarios in future wireless networks. Comment: 46 pages, 22 figures
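
    As a toy illustration of the reinforcement-learning branch the survey covers, the sketch below applies a stateless (bandit-style) value update with epsilon-greedy exploration to a hypothetical channel-selection task; the environment, reward model, and hyperparameters are invented for illustration and are not taken from the article.

```python
# Toy reinforcement-learning sketch: learn which wireless channel to transmit on.
# Environment and parameters are invented for illustration.
import random

N_CHANNELS = 4
q = [0.0] * N_CHANNELS                 # estimated value of each channel
alpha, eps = 0.1, 0.2                  # learning rate, exploration probability
busy_prob = [0.8, 0.3, 0.5, 0.1]       # hidden interference level per channel

for step in range(5000):
    if random.random() < eps:                                 # explore
        a = random.randrange(N_CHANNELS)
    else:                                                     # exploit best estimate
        a = max(range(N_CHANNELS), key=lambda c: q[c])
    reward = 0.0 if random.random() < busy_prob[a] else 1.0   # 1 = successful transmission
    q[a] += alpha * (reward - q[a])                           # incremental value update

print("learned channel preferences:", [round(v, 2) for v in q])
```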

    Intelligent Management and Efficient Operation of Big Data

    This chapter details how Big Data can be used and implemented in networking and computing infrastructures. Specifically, it addresses three main aspects: the timely extraction of relevant knowledge from heterogeneous, and very often unstructured, large data sources; the enhancement of the performance of the processing and networking (cloud) infrastructures that are the most important foundational pillars of Big Data applications or services; and novel ways to efficiently manage network infrastructures with high-level composed policies that support the transmission of large amounts of data with distinct requirements (video vs. non-video). A case study involving an intelligent management solution to route data traffic with diverse requirements in a wide-area Internet Exchange Point is presented, discussed in the context of Big Data, and evaluated. Comment: In the book Handbook of Research on Trends and Future Directions in Big Data and Web Intelligence, IGI Global, 201
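
    A minimal sketch of the kind of high-level, composed policy the chapter describes for traffic with distinct requirements (video vs. non-video); the class names, path labels, and bandwidth shares are hypothetical and only illustrate how such a policy might be expressed.

```python
# Hypothetical policy table: traffic class -> (preferred path, reserved bandwidth share).
from dataclasses import dataclass

@dataclass
class Flow:
    app_class: str        # e.g., "video", "bulk", "web"
    rate_mbps: float

POLICY = {
    "video": ("low-latency-path", 0.5),
    "bulk":  ("high-capacity-path", 0.2),
}
DEFAULT = ("best-effort-path", 0.0)

def place_flow(flow: Flow) -> dict:
    """Apply the high-level policy to a single flow."""
    path, share = POLICY.get(flow.app_class, DEFAULT)
    return {"path": path, "reserved_share": share, "rate_mbps": flow.rate_mbps}

print(place_flow(Flow("video", 25.0)))   # video gets the low-latency path
print(place_flow(Flow("web", 2.0)))      # unclassified traffic falls back to best effort
```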

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these network-generated data and to make decisions pertaining to the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches for performing network-data analysis and enabling automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth in network complexity that optical networks have faced in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) enabled by the use of coherent transmission/reception technologies, advanced digital signal processing, and the compensation of nonlinear effects in optical fiber propagation. In this paper, we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature dealing with the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy: to stimulate further work in this area, we conclude the paper by proposing possible new research directions.
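
    One use case consistent with the signal-quality indicators mentioned above is estimating the quality of transmission of a lightpath with supervised learning. The sketch below fits a simple least-squares model on synthetic data; the features, coefficients, and SNR values are synthetic, chosen only for illustration, and do not come from the paper.

```python
# Sketch: supervised regression of lightpath SNR from path parameters.
# All data here are synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
# Features per lightpath: [length_km, num_spans, launch_power_dBm]
X = rng.uniform([100, 2, -2], [3000, 40, 3], size=(500, 3))
true_w = np.array([-0.004, -0.1, 0.8])                 # synthetic ground-truth effect sizes
snr_db = 25 + X @ true_w + rng.normal(0, 0.3, 500)     # noisy "measured" SNR

A = np.hstack([X, np.ones((500, 1))])                  # add bias column
w, *_ = np.linalg.lstsq(A, snr_db, rcond=None)         # least-squares fit

candidate = np.array([1200, 15, 1.0, 1.0])             # new lightpath features + bias term
print("predicted SNR (dB):", round(float(candidate @ w), 2))
```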

    DDOS ATTACK DETECTION USING HYBRID (CNN AND LSTM) ML MODEL

    LSTM (Long Short-Term Memory) and CNN (Convolutional Neural Network) are two types of deep learning algorithms; by combining their strengths, researchers have developed deep learning models that can effectively detect attacks on SDNs (Software-Defined Networks), including Distributed Denial of Service (DDoS). These models analyze both the temporal and spatial characteristics of network traffic, resulting in precise identification of malicious traffic. In this research, a hybrid model composed of a CNN and an LSTM is used to detect DDoS attacks in an SDN network. The CNN component identifies spatial patterns in network traffic, such as the characteristics of individual packets, while the LSTM component captures temporal patterns in traffic over time, such as the timing and frequency of traffic bursts. The proposed model is trained on a labeled network traffic dataset, with one class representing normal traffic and another representing DDoS attack traffic. During training, the model adjusts its weights and biases to minimize the difference between its predicted output and the actual output for each input sample. Once trained, the hybrid model classifies incoming network traffic in the dataset as either normal or malicious, with an initial accuracy of 78.18% and a loss of 39.77% at the first epoch, reaching an accuracy of 99.99% and a loss of 9.29×10⁻⁵ by epoch 500. The hybrid CNN-LSTM model for DDoS detection is implemented on the Python Anaconda platform with an ETA of 28 ms/step.
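
    The abstract describes the hybrid architecture but not its exact layers. Below is a minimal, hypothetical PyTorch sketch of a CNN + LSTM flow classifier of the kind described: a 1-D convolution extracts per-packet (spatial) features, an LSTM models the packet sequence (temporal), and a linear head scores the flow as normal or DDoS. The sequence length, feature count, and layer widths are assumptions, not the paper's model.

```python
# Minimal hybrid CNN + LSTM sketch for flow-level DDoS detection (illustrative sizes).
import torch
import torch.nn as nn

SEQ_LEN, N_FEATURES = 20, 32               # packets per flow, features per packet (assumed)

class CNNLSTMDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(N_FEATURES, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.lstm = nn.LSTM(input_size=64, hidden_size=32, batch_first=True)
        self.head = nn.Linear(32, 1)       # single logit: DDoS vs. normal

    def forward(self, x):                  # x: (batch, SEQ_LEN, N_FEATURES)
        h = self.cnn(x.transpose(1, 2))    # per-packet spatial features: (batch, 64, SEQ_LEN)
        out, _ = self.lstm(h.transpose(1, 2))  # temporal modeling over the packet sequence
        return self.head(out[:, -1])       # score taken from the last time step

model = CNNLSTMDetector()
flows = torch.rand(8, SEQ_LEN, N_FEATURES) # placeholder flow tensors
scores = torch.sigmoid(model(flows))       # probability that each flow is a DDoS flow
```

    In practice such a detector would be trained with a binary cross-entropy loss on the labeled normal/DDoS flows, matching the two-class setup described above.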

    Wireless Communications in the Era of Big Data

    The rapidly growing wave of wireless data services is pushing against the boundary of our communication networks' processing power. The pervasive and exponentially increasing data traffic presents imminent challenges to all aspects of wireless system design, such as spectrum efficiency, computing capabilities, and fronthaul/backhaul link capacity. In this article, we discuss the challenges and opportunities in the design of scalable wireless systems to embrace such a "big data" era. On the one hand, we review state-of-the-art networking architectures and signal processing techniques adaptable for managing big data traffic in wireless networks. On the other hand, instead of viewing mobile big data as an unwanted burden, we introduce methods to capitalize on the vast data traffic to build a big-data-aware wireless network with better wireless service quality and new mobile applications. We highlight several promising future research directions for wireless communications in the mobile big data era. Comment: This article has been accepted and will appear in IEEE Communications Magazine.