
    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years involving Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. For future networks to overcome the limitations and address the issues of current cellular systems, it is clear that more intelligence needs to be deployed, so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks but also a classification of each paper in terms of its learning solution, with examples. The authors also classify each paper in terms of its self-organizing use-case and discuss how each proposed solution performed. In addition, a comparison of the most commonly found ML algorithms in terms of certain SON metrics is performed, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work provides future research directions and new paradigms that the use of more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, to fully enable the concept of SON in the near future.
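    As a concrete illustration of the kind of ML technique such surveys cover, the sketch below applies tabular Q-learning to a toy SON use case (mobility load balancing via a handover offset). The state discretization, action set, and reward are invented for illustration and are not taken from the paper.

```python
# Minimal tabular Q-learning sketch for a SON use case (mobility load
# balancing): an agent nudges a cell's handover offset to even out load.
# States, actions, and reward are illustrative toys, not from the survey.
import random

ACTIONS = [-1, 0, +1]          # decrease / keep / increase offset (dB)
STATES = range(0, 11)          # discretized cell load: 0%..100% in 10% steps
q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, gamma, eps = 0.1, 0.9, 0.1

def reward(load_bucket):
    # Toy reward: best when load sits near 50% (bucket 5).
    return -abs(load_bucket - 5)

def step(load_bucket, action):
    # Hypothetical environment: raising the offset sheds some load.
    return max(0, min(10, load_bucket - action + random.choice([-1, 0, 1])))

state = 8                      # start in a heavily loaded cell
for _ in range(5000):
    a = random.choice(ACTIONS) if random.random() < eps else \
        max(ACTIONS, key=lambda x: q[(state, x)])
    nxt = step(state, a)
    best_next = max(q[(nxt, x)] for x in ACTIONS)
    q[(state, a)] += alpha * (reward(nxt) + gamma * best_next - q[(state, a)])
    state = nxt

# Greedy policy per load bucket after training.
print({s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES})
```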

    AI-native Interconnect Framework for Integration of Large Language Model Technologies in 6G Systems

    The evolution towards a 6G architecture promises a transformative shift in communication networks, with artificial intelligence (AI) playing a pivotal role. This paper delves into the seamless integration of Large Language Models (LLMs) and Generative Pre-trained Transformers (GPTs) within 6G systems. Their ability to grasp intent, strategize, and execute intricate commands will be pivotal in redefining network functionalities and interactions. Central to this is the AI Interconnect framework, woven through the network to facilitate AI-centric operations. Building on the continuously evolving state of the art, we present a new architectural perspective for the upcoming generation of mobile networks, in which LLMs and GPTs collaboratively take center stage alongside traditional pre-generative AI and machine learning (ML) algorithms. This union promises a novel confluence of the old and the new, melding tried-and-tested methods with transformative AI technologies. Along with a conceptual overview of this evolution, we delve into the nuances of the practical applications arising from such an integration. Through this paper, we envisage a symbiotic integration where AI becomes the cornerstone of the next-generation communication paradigm, offering insights into the structural and functional facets of an AI-native 6G network.
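    To make the "grasp intent, strategize, execute" loop concrete, here is a minimal sketch of an intent-handling pipeline of the kind the AI Interconnect is meant to carry. call_llm(), scale_slice, and the JSON plan format are all hypothetical stand-ins, not interfaces defined by the paper.

```python
# Hedged sketch: an operator intent is translated by an LLM into a structured
# plan, then dispatched to a conventional network function. Everything below
# is an illustrative assumption, not a 6G interface definition.
import json

def call_llm(prompt: str) -> str:
    """Stand-in for an LLM backend; a real system would query a model here."""
    # Canned response so the sketch runs end-to-end.
    return json.dumps({"function": "scale_slice", "slice_id": "embb-1", "replicas": 3})

# Registry of traditional network functions the LLM plan can dispatch to.
NETWORK_FUNCTIONS = {
    "scale_slice": lambda slice_id, replicas: f"slice {slice_id} scaled to {replicas} replicas",
}

def handle_intent(intent: str) -> str:
    """Intent -> plan (via LLM) -> execution by a conventional network function."""
    plan = json.loads(call_llm(f"Translate this operator intent into a network call: {intent}"))
    fn = NETWORK_FUNCTIONS[plan.pop("function")]
    return fn(**plan)

print(handle_intent("Add capacity for the stadium event tonight"))
```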

    A Survey and Future Directions on Clustering: From WSNs to IoT and Modern Networking Paradigms

    Many Internet of Things (IoT) networks are created as overlays on traditional ad-hoc networks such as Zigbee. Moreover, IoT networks can resemble ad-hoc networks when built over networks that support device-to-device (D2D) communication, e.g., D2D-enabled cellular networks and WiFi-Direct. In these ad-hoc types of IoT networks, efficient topology management is a crucial requirement, particularly in massive-scale deployments. Traditionally, clustering has been recognized as a common approach to topology management in ad-hoc networks, e.g., in Wireless Sensor Networks (WSNs). Topology management in WSNs and in ad-hoc IoT networks has many design commonalities, as both need to transfer data to the destination hop by hop. Thus, WSN clustering techniques can presumably be applied to topology management in ad-hoc IoT networks. This requires a comprehensive study of WSN clustering techniques and an investigation of their applicability to ad-hoc IoT networks. In this article, we conduct a survey of this field based on the objectives of clustering, such as reducing energy consumption and load balancing, as well as the network properties relevant for efficient clustering in IoT, such as network heterogeneity and mobility. Beyond that, we investigate the advantages and challenges of clustering when IoT is integrated with modern computing and communication technologies such as Blockchain, Fog/Edge computing, and 5G. This survey provides useful insights into research on IoT clustering, allows a broader understanding of its design challenges for IoT networks, and sheds light on its future applications in modern technologies integrated with IoT.
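    For readers unfamiliar with WSN clustering, the sketch below shows the classic LEACH-style probabilistic cluster-head election that much of the surveyed work builds on. The population size and target fraction P are toy values, not parameters from the article.

```python
# LEACH-style cluster-head (CH) election: each round, every node that has not
# yet served in the current epoch volunteers with a rotating probability, so
# the CH role (and its energy cost) is spread evenly across the network.
import random

P = 0.1                               # desired fraction of cluster heads per round
nodes = [{"id": i, "was_ch": False} for i in range(100)]

def elect_cluster_heads(nodes, round_no):
    """One round of LEACH-style probabilistic CH election."""
    epoch = round(1 / P)              # after 1/P rounds, every node has served once
    if round_no % epoch == 0:
        for n in nodes:
            n["was_ch"] = False       # new epoch: everyone is eligible again
    # Threshold grows through the epoch so the expected CH count stays ~P*N.
    threshold = P / (1 - P * (round_no % epoch))
    heads = []
    for n in nodes:
        if not n["was_ch"] and random.random() < threshold:
            n["was_ch"] = True
            heads.append(n["id"])
    return heads

for r in range(5):
    print(f"round {r}: cluster heads = {elect_cluster_heads(nodes, r)}")
```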

    Recent Trends in Communication Networks

    In recent years there have been many developments in communication technology. These have greatly enhanced the computing power of small, handheld, resource-constrained mobile devices. Different generations of communication technology have evolved, leading to new research on the communication of large volumes of data over different transmission media and on the design of different communication protocols. Another direction of research concerns secure and error-free communication between sender and receiver despite the risk of an eavesdropper being present. To meet the communication requirements of huge amounts of multimedia streaming data, much research has been carried out on the design of suitable overlay networks. The book addresses the new research techniques that have evolved to handle these challenges.

    Lifetime centric load balancing mechanism in wireless sensor network based IoT environment

    Wireless sensor networks (WSNs) are a vital underlying technology of the Internet of Things (IoT); a WSN comprises several energy-constrained sensor nodes that monitor various physical parameters. Because sensor nodes run on battery power, load balancing plays a vital role in WSNs. Although several clustering algorithms have been proposed to provide energy efficiency, uneven load balancing can still occur, and the resulting inequality within the network reduces its lifetime. These scenarios arise from the short lifetime of the cluster head (CH), which is primarily responsible for all activity, including intra-cluster and inter-cluster communication. In this research work, a mechanism named the lifetime centric load balancing mechanism (LCLBM) is developed that focuses on CH selection, network design, and optimal CH distribution. Furthermore, under LCLBM, an assistant cluster head (ACH) is introduced to balance the load. LCLBM is evaluated on important metrics such as energy consumption, communication overhead, number of failed nodes, and one-way delay. The evaluation compares LCLBM with the ES-LEACH method, and the comparative analysis shows that the proposed model outperforms the existing one.
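    The abstract does not spell out LCLBM's exact selection rules, but the general pattern it describes (an energy-ranked CH with an assistant CH on standby) can be sketched as follows; the node IDs, energy values, and takeover rule are illustrative assumptions.

```python
# Hedged sketch of the pattern in the abstract: rank cluster members by
# residual energy, make the richest node the CH and the runner-up the ACH.
# The actual LCLBM rules are not given in the abstract; this is the general idea.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    energy: float                     # residual battery energy (J)

def select_ch_and_ach(cluster: list[Node]) -> tuple[Node, Node]:
    """Richest node becomes CH; the runner-up stands by as ACH."""
    ranked = sorted(cluster, key=lambda n: n.energy, reverse=True)
    return ranked[0], ranked[1]

cluster = [Node(1, 0.42), Node(2, 0.77), Node(3, 0.63), Node(4, 0.55)]
ch, ach = select_ch_and_ach(cluster)
print(f"CH: node {ch.node_id}, ACH: node {ach.node_id}")
# If the CH's energy later falls below a threshold, the ACH takes over the
# intra-/inter-cluster forwarding duties, extending the cluster's lifetime.
```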

    A Review of Cellular Networks: Applications, Benefits and Limitations

    Over the decades since 1970, the world has witnessed a stepwise evolution in cellular network technology and the mobile network industry, which has transformed national economies and created job opportunities. The stepwise evolution of cellular networks from the first generation (1G) to the fifth generation (5G) has shown a tremendous increase in technology, benefits, user demand, and applications. As each new generation of cellular network unfolds, the challenges and limitations of preceding generations are tackled, as reflected in the design architecture of each new generation. The first generation (1G) was analogue and catered for mobile voice transmission, but posed challenges in terms of quality of service and network security. The second generation (2G) introduced digitally encrypted technology and greater security for sender and receiver, with services such as text messages and MMS. The third generation (3G) was developed to offer high-speed data and multimedia connections to subscribers. The fourth generation (4G) evolved from 3G with higher data rates, lower latency, greater spectral efficiency, and a simpler protocol architecture with more efficient multicast than its predecessors. Fifth-generation (5G) networks are being deployed to meet growing demands for data from consumer and industrial users and to enable advanced technologies such as smart city applications, autonomous vehicles, and navigation. The envisioned sixth generation (6G) is expected to bring an unparalleled revolution that significantly distinguishes it from the existing generations and drastically re-shapes the wireless evolution from "connected things" to "connected intelligence". This paper provides a comprehensive review of cellular network applications and challenges from 1G to 6G.
    Keywords: 1G, 2G, 3G, 4G, 5G, 6G, Applications, Benefits, and Limitations
    DOI: 10.7176/NCS/11-04
    Publication date: December 31st 202

    Edge Learning for 6G-enabled Internet of Things: A Comprehensive Survey of Vulnerabilities, Datasets, and Defenses

    The ongoing deployment of fifth generation (5G) wireless networks constantly reveals limitations of its original concept as a key driver of Internet of Everything (IoE) applications. These 5G challenges are behind worldwide efforts to enable future networks, such as sixth generation (6G) networks, to efficiently support sophisticated applications ranging from autonomous driving to the Metaverse. Edge learning is a new and powerful approach to training models across distributed clients while protecting the privacy of their data. This approach is expected to be embedded within future network infrastructures, including 6G, to solve challenging problems such as resource management and behavior prediction. This survey article provides a holistic review of the most recent research on edge learning vulnerabilities and defenses for 6G-enabled IoT. We summarize the existing surveys on machine learning for 6G IoT security and on machine-learning-associated threats in three different learning modes: centralized, federated, and distributed. Then, we provide an overview of enabling emerging technologies for 6G IoT intelligence. Moreover, we provide a holistic survey of existing research on attacks against machine learning and classify threat models into eight categories: backdoor attacks, adversarial examples, combined attacks, poisoning attacks, Sybil attacks, Byzantine attacks, inference attacks, and dropping attacks. In addition, we provide a comprehensive and detailed taxonomy and a side-by-side comparison of state-of-the-art defense methods against edge learning vulnerabilities. Finally, as new attacks and defense technologies are realized, new research directions and the overall future prospects for 6G-enabled IoT are discussed.
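    A small synthetic example of the attack/defense tension the survey catalogs: plain federated averaging is dragged off course by a single poisoned client update, while a coordinate-wise median (one simple robust-aggregation defense among those surveyed) stays near the honest consensus. The update vectors below are fabricated toy data.

```python
# Toy demo: one poisoned client update skews plain FedAvg, while a
# coordinate-wise median aggregation resists it. Data is synthetic.
import numpy as np

rng = np.random.default_rng(0)
# Nine honest clients report updates near [1, 1]; one client poisons its update.
honest = [np.array([1.0, 1.0]) + rng.normal(0, 0.05, 2) for _ in range(9)]
updates = honest + [np.array([50.0, -50.0])]     # Byzantine/poisoned update

fedavg = np.mean(updates, axis=0)    # plain FedAvg: vulnerable to the outlier
median = np.median(updates, axis=0)  # coordinate-wise median: simple robust defense

print("FedAvg :", np.round(fedavg, 2))   # dragged far from [1, 1]
print("Median :", np.round(median, 2))   # stays near the honest consensus
```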