
    A New Cell Association Scheme In Heterogeneous Networks

    The cell association scheme determines which base station (BS) a mobile user (MU) should be associated with, and it plays a significant role in determining the average data rate a MU can achieve in heterogeneous networks. However, the explosion of digital devices and the scarcity of spectrum together force a careful redesign of the cell association scheme, which was previously taken for granted. To address this, we develop a new cell association scheme for heterogeneous networks based on joint consideration of the signal-to-interference-plus-noise ratio (SINR) a MU experiences and the traffic load of candidate BSs. MUs and BSs in each tier are modeled as independent Poisson point processes (PPPs), and all channels experience independent and identically distributed (i.i.d.) Rayleigh fading. Data rate ratio and traffic load ratio distributions are derived to obtain the tier association probability and the average ergodic MU data rate. Through numerical results, we find that the proposed cell association scheme outperforms the cell range expansion (CRE) association scheme. Moreover, the results indicate that deploying small-sized, high-density BSs improves spectral efficiency under the proposed scheme. Comment: Accepted by IEEE ICC 2015 - Next Generation Networking Symposium
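
    As a toy illustration of the joint SINR/load idea, the following Python sketch drops two PPP tiers of BSs on a unit square, draws Rayleigh-faded received powers for one MU, and associates the MU with the BS maximizing an assumed per-user-rate score. The densities, powers, and scoring rule are illustrative assumptions, not the paper's exact scheme.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical two-tier deployment on a unit square: tier 0 = macro
        # BSs, tier 1 = small-cell BSs, each drawn from an independent PPP.
        AREA = 1.0
        DENSITY = (5, 25)            # mean BSs per unit area, per tier
        P_TX = (40.0, 1.0)           # transmit powers (macro >> small cell)
        ALPHA, NOISE = 4.0, 1e-9     # path-loss exponent, noise power

        def draw_ppp(density):
            n = rng.poisson(density * AREA)
            return rng.uniform(0.0, np.sqrt(AREA), size=(n, 2))

        bs_xy = [draw_ppp(d) for d in DENSITY]
        loads = [rng.integers(1, 10, size=len(xy)) for xy in bs_xy]  # users per BS
        mu = rng.uniform(0.0, np.sqrt(AREA), size=2)                 # one MU

        # Rayleigh fading: power gain |h|^2 ~ Exp(1); Rx power = P * |h|^2 * d^-alpha
        rx = [p * rng.exponential(1.0, size=len(xy))
              * np.linalg.norm(xy - mu, axis=1) ** (-ALPHA)
              for p, xy in zip(P_TX, bs_xy)]

        total = sum(r.sum() for r in rx) + NOISE
        best = (-np.inf, None, None)
        for tier, r in enumerate(rx):
            if len(r) == 0:
                continue
            sinr = r / (total - r)                           # all other BSs interfere
            score = np.log2(1.0 + sinr) / (loads[tier] + 1)  # assumed rate-per-user metric
            i = int(np.argmax(score))
            if score[i] > best[0]:
                best = (score[i], tier, i)

        print(f"associate with tier {best[1]}, BS {best[2]} (score {best[0]:.3f})")

    Under this score, a lightly loaded small cell can win the association even when a macro BS offers a higher SINR, which is the qualitative behavior the joint scheme targets.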

    Spatial spectrum and energy efficiency of random cellular networks

    It is a great challenge to evaluate the network performance of cellular mobile communication systems. In this paper, we propose new spatial spectrum and energy efficiency models for Poisson-Voronoi tessellation (PVT) random cellular networks. To evaluate user access to the network, a Markov chain based wireless channel access model is first proposed for PVT random cellular networks. On that basis, the outage probability and blocking probability of PVT random cellular networks are derived, which can be computed numerically. Furthermore, taking into account the call arrival rate, the path loss exponent, and the base station (BS) density, spatial spectrum and energy efficiency models are proposed and analyzed for PVT random cellular networks. Numerical simulations are conducted to evaluate the network spectrum and energy efficiency. Comment: appears in IEEE Transactions on Communications, April, 201
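
    In the simplest instance of such a Markov chain channel access model, a cell with Poisson call arrivals and a fixed number of channels is a birth-death chain whose stationary distribution gives the blocking probability in closed form (Erlang B). A minimal sketch, with the arrival rate, holding time, and channel count as assumed figures rather than the paper's:

        def erlang_b(offered_load, channels):
            """Blocking probability of an M/M/c/c loss system (Erlang B),
            computed with the standard numerically stable recursion."""
            b = 1.0
            for c in range(1, channels + 1):
                b = offered_load * b / (c + offered_load * b)
            return b

        # Hypothetical per-cell figures: call arrival rate (calls/s), mean
        # call holding time (s), and channels available in one PVT cell.
        lam, hold, channels = 0.5, 120.0, 80
        print(f"blocking probability: {erlang_b(lam * hold, channels):.4e}")

    The paper's model is richer (it couples outage with SINR and BS density), but the same stationary-distribution computation underlies its blocking probability.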

    Deep Learning-Based Modeling of 5G Core Control Plane for 5G Network Digital Twin

    Digital twin is a key enabler for the development and implementation of new technologies in 5G and beyond networks. However, the complex structure and diverse functions of the current 5G core network, especially the control plane, make it difficult to build the core network part of the digital twin. In this paper, we propose two novel data-driven architectures for modeling the 5G control plane and implement the corresponding deep learning models, namely 5GC-Seq2Seq and 5GC-former, based on the Vanilla Seq2Seq model and the Transformer decoder, respectively. To train and test the models, we also present a solution for converting signaling messages to and from vectors, which is used in dataset construction. The experiments are based on 5G core network signaling data collected by the Spirent C50 network tester, covering procedures related to registration, handover, PDU sessions, etc. Our results show that 5GC-Seq2Seq achieves an F1-score (the harmonic mean of precision and recall) above 99.98% with a relatively simple structure, while 5GC-former attains an F1-score above 99.998% with a more complex and highly parallel model, indicating that the proposed method reproduces the major functions of the core network control plane in the 5G digital twin with high accuracy.
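
    To make the modeling idea concrete, here is a minimal PyTorch sketch of a decoder-only Transformer that predicts the core's next control-plane message from the dialogue so far, in the spirit of 5GC-former. The one-token-per-message vocabulary and all model sizes are simplifying assumptions, not the paper's message-to-vector encoding.

        import torch
        import torch.nn as nn

        # Toy vocabulary: one token per signaling message type (the paper maps
        # full messages to vectors; this is a simplifying assumption).
        VOCAB = ["<pad>", "<bos>", "RegistrationRequest", "AuthenticationRequest",
                 "AuthenticationResponse", "SecurityModeCommand",
                 "SecurityModeComplete", "RegistrationAccept"]
        TOK = {m: i for i, m in enumerate(VOCAB)}

        class TinyCoreTwin(nn.Module):
            """Decoder-only Transformer (encoder layers + causal mask) that
            predicts the next control-plane message; sizes are illustrative."""
            def __init__(self, d=64, heads=4, layers=2):
                super().__init__()
                self.emb = nn.Embedding(len(VOCAB), d)
                layer = nn.TransformerEncoderLayer(d, heads, 4 * d, batch_first=True)
                self.body = nn.TransformerEncoder(layer, layers)
                self.out = nn.Linear(d, len(VOCAB))

            def forward(self, ids):
                # Causal mask so each position only attends to earlier messages.
                mask = nn.Transformer.generate_square_subsequent_mask(ids.size(1))
                return self.out(self.body(self.emb(ids), mask=mask))

        seq = torch.tensor([[TOK["<bos>"], TOK["RegistrationRequest"],
                             TOK["AuthenticationRequest"]]])
        logits = TinyCoreTwin()(seq)      # untrained here, so the output is random
        print("predicted next message:", VOCAB[int(logits[0, -1].argmax())])

    Trained on recorded signaling sequences with a cross-entropy objective, such a model emulates the control plane by generating the response the real core would send next.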

    Wireless Network Digital Twin for 6G: Generative AI as A Key Enabler

    Digital twin, which enables emulation, evaluation, and optimization of physical entities through synchronized digital replicas, has gained increasing attention as a promising technology for intricate wireless networks. For 6G, numerous innovative wireless technologies and network architectures pose new challenges in establishing wireless network digital twins. To tackle these challenges, artificial intelligence (AI), particularly the flourishing generative AI, emerges as a potential solution. In this article, we discuss emerging prerequisites for wireless network digital twins considering the complicated network architecture, tremendous network scale, extensive coverage, and diversified application scenarios of the 6G era. We further explore the applications of generative AI, such as Transformers and diffusion models, to empower the 6G digital twin from multiple perspectives, including implementation, physical-digital synchronization, and slicing capability. Subsequently, we propose a hierarchical generative AI-enabled wireless network digital twin operating at both the message level and the policy level, and provide a typical use case with numerical results to validate its effectiveness and efficiency. Finally, open research issues for wireless network digital twins in the 6G era are discussed.
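
    A skeletal sketch of the hierarchical structure: a message-level twin (standing in for a generative model of signaling traffic, as in the previous abstract) is consumed by a policy-level twin that scores candidate slicing policies against it. All class names and the toy latency curve are our own assumptions, not the authors' implementation.

        import random
        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class MessageLevelTwin:
            """Stand-in for a generative model that emulates signaling traffic;
            a toy congestion curve replaces the learned model here."""
            def simulate_latency(self, slice_share: float) -> float:
                base = 10.0 / max(slice_share, 1e-3)    # ms, assumed curve
                return base * random.uniform(0.9, 1.1)  # synthetic jitter

        @dataclass
        class PolicyLevelTwin:
            """Scores a candidate policy by Monte Carlo over the message twin."""
            twin: MessageLevelTwin
            def evaluate(self, policy: Callable[[], float], runs: int = 1000) -> float:
                return sum(self.twin.simulate_latency(policy())
                           for _ in range(runs)) / runs

        ptwin = PolicyLevelTwin(MessageLevelTwin())
        for share in (0.2, 0.5, 0.8):                   # candidate slice shares
            print(f"slice share {share:.1f}: "
                  f"avg latency {ptwin.evaluate(lambda s=share: s):.1f} ms")

    The point of the split is that policies can be evaluated cheaply and safely on the emulated message level before anything touches the physical network.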