
    Enabling RAN Slicing Through Carrier Aggregation in mmWave Cellular Networks

    The ever-increasing number of connected devices and of new and heterogeneous mobile use cases implies that 5G cellular systems will face demanding technical challenges. For example, Ultra-Reliable Low-Latency Communication (URLLC) and enhanced Mobile Broadband (eMBB) scenarios present orthogonal Quality of Service (QoS) requirements that 5G aims to satisfy with a unified Radio Access Network (RAN) design. Network slicing and mmWave communications have been identified as possible enablers for 5G. They provide, respectively, the scalability and flexibility needed to adapt the network to each specific use case, and low-latency, multi-gigabit-per-second wireless links that tap into a vast, currently unused portion of the spectrum. The optimization and integration of these technologies is still an open research challenge, which requires innovations at different layers of the protocol stack. This paper proposes to combine them in a RAN slicing framework for mmWaves, based on carrier aggregation. Notably, we introduce MilliSlice, a cross-carrier scheduling policy that exploits the diversity of the carriers and maximizes their utilization, thus simultaneously guaranteeing high throughput for the eMBB slices and low latency and high reliability for the URLLC flows. Comment: 8 pages, 8 figures. Proc. of the 18th Mediterranean Communication and Computer Networking Conference (MedComNet 2020), Arona, Italy, 202
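    The abstract does not spell out the MilliSlice policy, but the cross-carrier idea (serve URLLC where reliability is best, then fill remaining capacity with eMBB) can be sketched as a toy priority rule. Everything below, including the rule itself and all carrier parameters, is an illustrative assumption, not the paper's algorithm:

```python
# Illustrative sketch only: MilliSlice's actual policy is defined in the paper.
# Assumed rule: URLLC packets go first to the carrier with the best reliability
# estimate; eMBB traffic then fills whatever capacity remains on each carrier.

def schedule(carriers, urllc_queue, embb_queue):
    """carriers: list of dicts with 'capacity' (packets/slot) and 'reliability' (0-1).
    Returns a per-carrier allocation keyed by the carrier object's id."""
    allocation = {id(c): [] for c in carriers}
    # Serve URLLC first, most reliable carrier first (reliability requirement).
    for c in sorted(carriers, key=lambda c: -c['reliability']):
        while urllc_queue and len(allocation[id(c)]) < c['capacity']:
            allocation[id(c)].append(urllc_queue.pop(0))
    # Fill leftover capacity with eMBB to maximize carrier utilization.
    for c in carriers:
        while embb_queue and len(allocation[id(c)]) < c['capacity']:
            allocation[id(c)].append(embb_queue.pop(0))
    return allocation
```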

    Modeling, Analysis and Design for Carrier Aggregation in Heterogeneous Cellular Networks

    Carrier aggregation (CA) and small cells are two distinct features of next-generation cellular networks. Cellular networks with small cells take on a very heterogeneous character, and are often referred to as HetNets. In this paper, we introduce a load-aware model for CA-enabled multi-band HetNets. Under this model, the impact of biasing can be more appropriately characterized; for example, we observe that with large enough biasing, the spectral efficiency of small cells may increase, while its counterpart in a fully loaded model always decreases. Further, our analysis reveals that the peak data rate does not depend on the base station density and transmit powers; this strongly motivates other approaches, e.g. CA, to increase the peak data rate. Last but not least, different band deployment configurations are studied and compared. We find that with a large enough small cell density, spatial reuse with small cells outperforms adding more spectrum for increasing user rate. More generally, universal co-channel deployment typically yields the largest rate, and thus a capacity loss exists in orthogonal deployment. This performance gap can be reduced by appropriately tuning the HetNet coverage distribution (e.g. by optimizing biasing factors). Comment: submitted to IEEE Transactions on Communications, Nov. 201
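    The biasing the abstract analyzes builds on the standard biased received-power association rule: a user attaches to the base station maximizing B * P * d^(-alpha), so raising a small cell's bias B offloads users onto it. A minimal sketch of that rule (the path-loss exponent and all numbers in the usage are assumptions for illustration):

```python
import math

# Biased cell association in a two-tier HetNet: pick the base station that
# maximizes the biased received power B * P * d**(-ALPHA). The paper's
# load-aware analysis is stochastic-geometric; this is only the association
# rule it builds on.

ALPHA = 4.0  # path-loss exponent (assumed value)

def associate(user_xy, stations):
    """stations: list of (x, y, tx_power_watts, bias). Returns index of chosen BS."""
    def biased_rx_power(s):
        x, y, p, bias = s
        d = math.hypot(user_xy[0] - x, user_xy[1] - y)
        return bias * p * d ** (-ALPHA)
    return max(range(len(stations)), key=lambda i: biased_rx_power(stations[i]))
```

    For a user closer to the macro cell, an unbiased small cell loses the comparison, but a large enough bias flips the decision and offloads the user, which is exactly the regime where load-awareness matters.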

    Resource and power management in next generation networks

    The limits of today’s cellular communication systems are constantly being tested by the exponential increase in mobile data traffic, a trend which is poised to continue well into the next decade. Densification of cellular networks, by overlaying smaller cells, i.e., micro, pico and femtocells, over the traditional macrocell, is seen as an inevitable step in enabling future networks to support the expected increases in data rate demand. Next generation networks will most certainly be more heterogeneous as services will be offered via various types of points of access (PoAs). Indeed, besides the traditional macro base station, it is expected that users will also be able to access the network through a wide range of other PoAs: WiFi access points, remote radio-heads (RRHs), small cell (i.e., micro, pico and femto) base stations or even other users, when device-to-device (D2D) communications are supported, creating thus a multi-tiered network architecture. This approach is expected to enhance the capacity of current cellular networks, while patching up potential coverage gaps. However, since available radio resources will be fully shared, the inter-cell interference as well as the interference between the different tiers will pose a significant challenge. To avoid severe degradation of network performance, properly managing the interference is essential. In particular, techniques that mitigate interference, such as Inter-Cell Interference Coordination (ICIC) and enhanced ICIC (eICIC), have been proposed in the literature to address the issue. In this thesis, we argue that interference may also be addressed during radio resource scheduling tasks, by enabling the network to make interference-aware resource allocation decisions.
Carrier aggregation technology, which allows the simultaneous use of several component carriers, on the other hand, targets the lack of sufficiently large portions of frequency spectrum; a problem that severely limits the capacity of wireless networks. The aggregated carriers may, in general, belong to different frequency bands and have different bandwidths, and thus may have very different signal propagation characteristics. Integration of carrier aggregation in the network introduces additional tasks and further complicates interference management, but also opens up a range of possibilities for improving spectrum efficiency in addition to enhancing capacity, which we aim to exploit. In this thesis, we first look at the resource allocation problem in dense multi-tiered networks with support for advanced features such as carrier aggregation and device-to-device communications. For two-tiered networks with D2D support, we propose a centralised, near-optimal algorithm, based on dynamic programming principles, that allows a central scheduler to make interference- and traffic-aware scheduling decisions, while taking into consideration the short-lived nature of D2D links. As the complexity of the central scheduler increases exponentially with the number of component carriers, we further propose a distributed heuristic algorithm to tackle the resource allocation problem in carrier-aggregation-enabled dense networks. We show that the solutions we propose perform significantly better than standard solutions adopted in cellular networks, such as eICIC coupled with Proportional Fair scheduling, in several key metrics such as user throughput, timely delivery of content, and spectrum and energy efficiency, while ensuring fairness for backward-compatible devices.
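The Proportional Fair baseline mentioned above follows a standard rule: in each slot, serve the user with the largest ratio of instantaneous rate to smoothed average throughput, then update the averages exponentially. A minimal single-carrier sketch (the smoothing factor is an assumed value):

```python
# Standard Proportional Fair scheduling rule, sketched for one carrier.
BETA = 0.1  # exponential smoothing factor (assumed)

def pf_pick(inst_rates, avg_tputs):
    """Pick the user index maximizing instantaneous rate / average throughput."""
    return max(range(len(inst_rates)),
               key=lambda i: inst_rates[i] / max(avg_tputs[i], 1e-9))

def pf_update(avg_tputs, chosen, inst_rates):
    """Exponentially smooth average throughputs after serving `chosen`;
    unserved users contribute a rate of zero this slot."""
    return [(1 - BETA) * t + (BETA * inst_rates[i] if i == chosen else 0.0)
            for i, t in enumerate(avg_tputs)]
```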
Next, we investigate the potential for enhancing network performance by enabling the different nodes of the network to reduce and dynamically adjust the transmit power of the different carriers to mitigate interference. Considering that the different carriers may have different coverage areas, we propose to leverage this diversity to obtain high-performing network configurations. Thus, we model the problem of carrier downlink transmit power setting as a competitive game between teams of PoAs, which enables us to derive distributed dynamic power setting algorithms. Using these algorithms we reach stable configurations in the network, known as Nash equilibria, which we show perform significantly better than fixed power strategies coupled with eICIC.
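Best-response dynamics is the usual way such power-setting games reach a Nash equilibrium: players take turns choosing the power that maximizes their own utility, and the process stops when nobody wants to deviate. A toy two-transmitter sketch with an assumed log-rate-minus-cost utility (this payoff, the channel gains, and the strategy grid are illustrative assumptions, not the thesis's model):

```python
import math

# Toy best-response dynamics for downlink power setting. Utility: log(1+SINR)
# minus a linear power cost. All constants below are made up for illustration.
NOISE = 1e-3
GAIN = [[1.0, 0.2], [0.2, 1.0]]          # g[i][j]: gain from tx j to rx i
COST = 0.5                                # per-watt power cost
POWERS = [p / 10 for p in range(0, 21)]   # discrete strategies: 0.0 .. 2.0 W

def utility(i, p):
    interference = NOISE + sum(GAIN[i][j] * p[j] for j in range(len(p)) if j != i)
    sinr = GAIN[i][i] * p[i] / interference
    return math.log(1 + sinr) - COST * p[i]

def best_response_dynamics(p, rounds=50):
    for _ in range(rounds):
        old = list(p)
        for i in range(len(p)):
            # Player i picks its best power given the others' current powers.
            p[i] = max(POWERS, key=lambda x: utility(i, p[:i] + [x] + p[i + 1:]))
        if p == old:  # no player wants to deviate: a Nash equilibrium
            break
    return p
```

    The fixed point the loop reaches is, by construction, a (pure-strategy) Nash equilibrium on the discrete power grid; the thesis's team-game structure generalizes this single-player best response.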

    Dual connectivity for LTE-advanced heterogeneous networks


    Optimizations in Heterogeneous Mobile Networks


    Review on Radio Resource Allocation Optimization in LTE/LTE-Advanced using Game Theory

    Recently, there has been a growing trend toward applying game theory (GT) to various engineering fields in order to solve optimization problems with different competing entities/contributors/players. Research in the fourth generation (4G) wireless network field has also exploited this theory to overcome long term evolution (LTE) challenges such as resource allocation, which is one of the most important research topics. In fact, an efficient design of resource allocation schemes is the key to higher performance. However, the standard does not specify the optimization approach for radio resource management, and it was therefore left open for study. This paper presents a survey of the existing game-theory-based solutions for the 4G-LTE radio resource allocation problem and its optimization.