
    Large Language Models for Telecom: The Next Big Thing?

    The evolution of generative artificial intelligence (GenAI) constitutes a turning point in reshaping the future of technology in different aspects. Wireless networks in particular, with the blooming of self-evolving networks, represent a rich field for exploiting GenAI and reaping several benefits that can fundamentally change the way wireless networks are designed and operated today. Specifically, large language models (LLMs), a subfield of GenAI, are envisioned to open up a new era of autonomous wireless networks, in which a multimodal large model trained on various Telecom data can be fine-tuned to perform several downstream tasks, eliminating the need for dedicated AI models for each task and paving the way for the realization of artificial general intelligence (AGI)-empowered wireless networks. In this article, we aim to unfold the opportunities that can be reaped from integrating LLMs into the Telecom domain. In particular, we put forward a vision of a new realm of possibilities and applications of LLMs in future wireless networks, define directions for designing, training, testing, and deploying Telecom LLMs, and reveal insights into the associated theoretical and practical challenges.

    Experimental Analysis of A-RoF Based Optical Communication System for 6G O-RAN Downlink

    This paper explores recent advancements in optical communication for sixth generation (6G) networks, focusing on the proposed architecture, Open Radio Access Network (O-RAN) specifications, and Radio over Fiber (RoF) systems. An experimental evaluation of 6G Analog RoF (A-RoF), utilizing 60 GHz and 28 GHz carriers over 10 km of single-mode fiber, demonstrates the efficacy of Digital Pre-Distortion (DPD) linearization in reducing Error Vector Magnitude (EVM). Although EVM rises with increased bandwidth, DPD still yields modest performance improvements. This underscores the significance of ongoing advancements in mitigating challenges and harnessing the full potential of A-RoF technology for the upcoming O-RAN. These developments are poised to transform communication networks, ensuring enhanced speed, reliability, and efficiency to meet the dynamic demands of the digital landscape in the upcoming 6G era and beyond.
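The EVM figure of merit used in this evaluation is straightforward to reproduce. The sketch below is a minimal illustration only (not the paper's measurement code): it computes RMS EVM for a toy QPSK constellation, using normalisation by the reference constellation's RMS magnitude, which is one common convention among several.

```python
import numpy as np

def evm_percent(received, reference):
    """RMS EVM (%): RMS error-vector magnitude, normalised by the
    RMS magnitude of the ideal reference constellation."""
    received = np.asarray(received, dtype=complex)
    reference = np.asarray(reference, dtype=complex)
    error_rms = np.sqrt(np.mean(np.abs(received - reference) ** 2))
    ref_rms = np.sqrt(np.mean(np.abs(reference) ** 2))
    return 100.0 * error_rms / ref_rms

# Toy QPSK example: ideal symbols plus mild additive noise.
rng = np.random.default_rng(0)
n = 1000
ideal = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
noisy = ideal + 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
print(f"EVM = {evm_percent(noisy, ideal):.2f}%")
```

A distortion that scales every symbol by 1.1, for instance, yields exactly 10% EVM under this normalisation, which makes the function easy to sanity-check.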

    Mapping Cloud-Edge-IoT opportunities and challenges in Europe

    While current data processing predominantly occurs in centralized facilities, with a minor portion handled by smart objects, a shift is anticipated, with a surge in data originating from smart devices. This evolution necessitates reconfiguring the infrastructure, emphasising computing capabilities at the cloud's "edge", closer to data sources. This change symbolises the merging of cloud, edge, and IoT technologies into a unified network infrastructure - a Computing Continuum - poised to redefine tech interactions, offering novel prospects across diverse sectors. The computing continuum is emerging as a cornerstone of tech advancement in the contemporary digital era. This paper provides an in-depth exploration of the computing continuum, highlighting its potential, practical implications, and the adjustments required to tackle existing challenges. It emphasises the continuum's real-world applications, market trends, and its significance in shaping Europe's tech future.

    Generative AI-driven Semantic Communication Networks: Architecture, Technologies and Applications

    Generative artificial intelligence (GAI) has emerged as a rapidly burgeoning field with significant potential for creating diverse content intelligently and automatically. To support such artificial intelligence-generated content (AIGC) services, future communication systems must fulfill much more stringent requirements (including data rate, throughput, latency, etc.) with limited yet precious spectrum resources. To tackle this challenge, semantic communication (SemCom), which dramatically reduces resource consumption by extracting and transmitting semantics, has been deemed a revolutionary communication scheme. Advanced GAI algorithms equip SemCom with sophisticated intelligence for model training, knowledge base construction, and channel adaptation. Furthermore, GAI algorithms also play an important role in the management of SemCom networks. In this survey, we first overview the basics of GAI and SemCom as well as the synergies between the two technologies. In particular, the GAI-driven SemCom framework is presented, where GAI models for information creation, SemCom-enabled information transmission, and information effectiveness for AIGC are discussed separately. We then delve into GAI-driven SemCom network management, involving novel management layers, knowledge management, and resource allocation. Finally, we envision several promising use cases, i.e., autonomous driving, smart cities, and the Metaverse, for a more comprehensive exploration.
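The core SemCom idea - transmit extracted semantics rather than the raw message - can be shown with a deliberately simple toy. The keyword filter below is a hypothetical stand-in for the learned GAI semantic encoders the survey discusses; real systems use neural models, not stopword lists.

```python
# Toy SemCom pipeline: transmit only task-relevant "semantics"
# instead of the full message, then reconstruct at the receiver.
STOPWORDS = {"the", "a", "an", "is", "are", "of", "to", "in", "and", "at"}

def semantic_encode(message: str) -> list:
    """Keep only content-bearing words (the 'semantics')."""
    return [w for w in message.lower().split() if w not in STOPWORDS]

def semantic_decode(semantics: list) -> str:
    """Receiver-side reconstruction from the semantics alone."""
    return " ".join(semantics)

msg = "the drone is hovering at the north gate of the campus"
sem = semantic_encode(msg)
saving = 1 - len(" ".join(sem)) / len(msg)
print(sem)                              # the transmitted payload
print(f"payload reduced by {saving:.0%}")
```

Even this crude extractor shrinks the payload noticeably; the survey's point is that learned semantic encoders achieve far larger savings while preserving task-relevant meaning.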

    AI-native Interconnect Framework for Integration of Large Language Model Technologies in 6G Systems

    The evolution towards the 6G architecture promises a transformative shift in communication networks, with artificial intelligence (AI) playing a pivotal role. This paper delves into the seamless integration of Large Language Models (LLMs) and Generative Pre-trained Transformers (GPT) within 6G systems. Their ability to grasp intent, strategize, and execute intricate commands will be pivotal in redefining network functionalities and interactions. Central to this is the AI Interconnect framework, designed to facilitate AI-centric operations within the network. Building on the continuously evolving state of the art, we present a new architectural perspective for the upcoming generation of mobile networks, in which LLMs and GPTs take center stage alongside traditional pre-generative AI and machine learning (ML) algorithms. This union promises a novel confluence of the old and the new, melding tried-and-tested methods with transformative AI technologies. Along with providing a conceptual overview of this evolution, we delve into the nuances of practical applications arising from such an integration. Through this paper, we envisage a symbiotic integration in which AI becomes the cornerstone of the next-generation communication paradigm, offering insights into the structural and functional facets of an AI-native 6G network.

    The 6G Architecture Landscape: European Perspective


    The Role of Artificial Intelligence in Next-Generation Wireless Networks - an Overview of Technological and Law Implications

    This thesis explores how artificial intelligence (AI) will benefit next-generation wireless networks, focusing on both the technological aspects and the legal implications. First, a summary of AI history is provided, together with an overview of the different AI methodologies. Among them, the thesis focuses on machine learning-based approaches, and in particular neural network algorithms, for wireless networks. The second chapter examines how AI can support wireless network management and explores the advantages of adopting this new paradigm as a substitute for current non-data-driven approaches. Next, we discuss the technical challenges that must be addressed for the practical integration of AI within wireless networks, starting from the huge amount of data needed to properly configure AI methodologies and the high computational demand. Emerging approaches that help overcome these challenges, such as the placement of computational servers near base stations and the adoption of federated learning techniques, are also discussed. In the third chapter, we examine the cybersecurity risks that arise with the adoption of AI in wireless networks, and the regulations that will help address these risks. A vision of the future of AI in wireless networks, and a discussion of the open research challenges from technological and legal points of view, conclude the thesis.
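The federated learning technique mentioned above can be sketched in a few lines. The example below is a minimal FedAvg-style aggregation step, not the thesis's implementation: each base station trains locally on its own data, and only model parameters are shared, weighted by local dataset size.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """One FedAvg round: average client model parameters,
    weighted by each client's local dataset size."""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    return sum(c * np.asarray(w) for c, w in zip(coeffs, client_weights))

# Three base stations holding different amounts of local data.
local_models = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
data_sizes = [100, 300, 600]
global_model = fedavg(local_models, data_sizes)
print(global_model)  # prints [4. 5.] - dominated by the largest client
```

Because raw data never leaves the base stations, this directly addresses the data-volume and privacy concerns the thesis raises.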

    Integrated Sensing and Communications for 3D Object Imaging via Bilinear Inference

    We consider an uplink integrated sensing and communications (ISAC) scenario where the detection of data symbols from multiple user equipment (UEs) occurs simultaneously with a three-dimensional (3D) estimation of the environment, extracted from the scattering features present in the channel state information (CSI) and utilizing the same physical layer communications air interface, as opposed to radar technologies. By exploiting a discrete (voxelated) representation of the environment, two novel ISAC schemes are derived with purpose-built message passing (MP) rules for the joint estimation of data symbols and status (filled/empty) of the discretized environment. The first relies on a modular feedback structure in which the data symbols and the environment are estimated alternately, whereas the second leverages a bilinear inference framework to estimate both variables concurrently. Both contributed methods are shown via simulations to outperform the state-of-the-art (SotA) in accurately recovering the transmitted data as well as the 3D image of the environment. An analysis of the computational complexities of the proposed methods reveals distinct advantages of each scheme, namely, that the bilinear solution exhibits superior robustness to short pilots and channel blockages, while the alternating solution offers lower complexity with a large number of UEs and superior performance under ideal conditions.
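The discrete (voxelated) environment representation underlying both schemes amounts to a binary occupancy grid over the scene. The sketch below illustrates only that discretisation step under assumed scene parameters; the paper's message-passing estimators of the filled/empty status are well beyond an abstract-level sketch.

```python
import numpy as np

def voxelate(points, origin, voxel_size, grid_shape):
    """Discretise 3D scatterer positions into a binary occupancy
    (filled/empty) grid over the scene."""
    grid = np.zeros(grid_shape, dtype=bool)
    idx = np.floor((np.asarray(points) - origin) / voxel_size).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
    grid[tuple(idx[inside].T)] = True
    return grid

# Two scatterers inside a 4 x 4 x 4 metre scene with 1 m voxels.
scatterers = np.array([[0.5, 0.5, 0.5], [3.2, 1.1, 2.9]])
occupancy = voxelate(scatterers, origin=np.zeros(3),
                     voxel_size=1.0, grid_shape=(4, 4, 4))
print(occupancy.sum())  # 2 filled voxels
```

Casting the environment as such a grid is what turns 3D imaging into the estimation of a discrete filled/empty variable per voxel, which the paper's MP rules then solve jointly with symbol detection.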

    Artificial Intelligence Future Capability Paper
