
    Resource Allocation in Uplink Long Term Evolution

    One of the most crucial goals of future cellular systems is to minimize transmission power while increasing system performance. This master thesis presents two channel- and queue-aware scheduling schemes for allocating channels among active users in uplink LTE. Transmission power, packet delay and data rate are three of the criteria that most critically affect resource allocation designs. Each of the two scheduling algorithms therefore proposes a practical method that assigns resources so as to maximize data rate and minimize transmission power and packet delay while ensuring the QoS requirements. After converting the resource allocation problem into an optimization problem, the objective function and associated constraints are derived. Due to the contiguity constraint imposed by SC-FDMA in uplink LTE, binary integer programming is employed to solve the optimization problem. Heuristic algorithms that approximate the optimal schemes are also presented to reduce the computational complexity.
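
    As a rough illustration of the contiguity constraint described above, the following sketch enumerates every contiguous chunk of resource blocks per user and exhaustively searches for the best exclusive assignment. Exhaustive search stands in here for the binary integer programming solver, and the toy rates and problem sizes are assumptions, not values from the thesis.

```python
# Contiguity-constrained uplink allocation on a toy instance; exhaustive
# search stands in for a BIP solver. All numbers are illustrative.
from itertools import product

N_RB, N_UE = 4, 2                      # resource blocks, users (toy sizes)
# rate[u][rb]: achievable rate of user u on resource block rb (assumed)
rate = [[3, 1, 2, 4],
        [2, 3, 3, 1]]

# Enumerate every contiguous chunk of RBs as (start, end) pairs.
chunks = [(s, e) for s in range(N_RB) for e in range(s, N_RB)]

def utility(u, chunk):
    s, e = chunk
    return sum(rate[u][rb] for rb in range(s, e + 1))

best_val, best_alloc = 0, None
# Each user picks one chunk or nothing; chunks must not overlap (exclusivity).
for alloc in product([None] + chunks, repeat=N_UE):
    used, ok, val = set(), True, 0
    for u, ch in enumerate(alloc):
        if ch is None:
            continue
        rbs = set(range(ch[0], ch[1] + 1))
        if rbs & used:                 # exclusivity constraint violated
            ok = False
            break
        used |= rbs
        val += utility(u, ch)
    if ok and val > best_val:
        best_val, best_alloc = val, alloc

print("optimal allocation:", best_alloc, "sum rate:", best_val)
```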

    Cross-layer Optimization for Video Delivery over Wireless Networks

    As video streaming is becoming the most popular mobile Internet application, the design and optimization of video communications over wireless networks is attracting increasing attention from both academia and industry. The main challenges are to enhance the quality-of-service support and to dynamically adapt the transmitted video streams to the network conditions. Cross-layer methods, i.e., the exchange of information among different layers of the system, are one of the key concepts to be exploited to achieve these goals. In this thesis we propose novel cross-layer optimization frameworks for scalable video coding (SVC) delivery and for HTTP adaptive streaming (HAS) applications over the downlink and the uplink of Long Term Evolution (LTE) wireless networks. They jointly address optimized content-aware rate adaptation and radio resource allocation (RRA) with the aim of maximizing the sum of the achievable rates while minimizing the quality difference among multiple videos. For multi-user SVC delivery over downlink wireless systems, where IP/TV is the most representative application, we decompose the optimization problem and propose a novel iterative local approximation algorithm to derive the optimal solution, also presenting optimal algorithms to solve the resulting two sub-problems. For multiple SVC deliveries over uplink wireless systems, where health-care services are the most attractive and challenging application, we propose joint video adaptation and aggregation performed directly at the application layer of the transmitting equipment, which exploits the guaranteed bit rate (GBR) provided by the proposed low-complexity sub-optimal RRA solutions. Finally, we propose a quality-fair adaptive streaming solution that delivers fair video quality to HAS clients in an LTE cell by adaptively selecting the prescribed GBR of each user according to the video content in addition to the channel condition. Extensive numerical evaluations show significant enhancements of the proposed strategies with respect to other state-of-the-art frameworks.
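
    To make the quality-fairness idea concrete, the sketch below greedily hands each rate upgrade to the client with the lowest current quality, under a shared capacity budget. The log-shaped rate-quality model, the bitrate ladders and the capacity figure are illustrative assumptions, not the thesis's actual adaptation algorithm.

```python
# Quality-fair rate selection: the next upgrade always goes to the client
# with the lowest current quality, within a shared capacity budget.
import math

ladders = {                      # representation rates per client (kbps), assumed
    "ue1": [250, 500, 1000, 2000],
    "ue2": [250, 500, 1000, 2000],
    "ue3": [250, 500, 1000, 2000],
}
complexity = {"ue1": 1.0, "ue2": 2.0, "ue3": 4.0}   # content complexity (assumed)
capacity = 3000                                      # cell budget for HAS traffic (kbps)

def quality(ue, rate):
    # Concave rate-quality model: complex content needs more rate for the
    # same quality. Purely illustrative.
    return math.log(1 + rate / complexity[ue])

level = {ue: 0 for ue in ladders}                    # start everyone at lowest rate
used = sum(ladders[ue][0] for ue in ladders)

while True:
    # Clients that can still upgrade one step within the remaining budget.
    cands = [ue for ue in ladders
             if level[ue] + 1 < len(ladders[ue])
             and used - ladders[ue][level[ue]] + ladders[ue][level[ue] + 1] <= capacity]
    if not cands:
        break
    worst = min(cands, key=lambda ue: quality(ue, ladders[ue][level[ue]]))
    used += ladders[worst][level[worst] + 1] - ladders[worst][level[worst]]
    level[worst] += 1

for ue in ladders:
    r = ladders[ue][level[ue]]
    print(ue, r, "kbps, quality %.2f" % quality(ue, r))
```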

    Recent advances in radio resource management for heterogeneous LTE/LTE-A networks

    As heterogeneous networks (HetNets) emerge as one of the most promising developments toward realizing the target specifications of Long Term Evolution (LTE) and LTE-Advanced (LTE-A) networks, radio resource management (RRM) research for such networks has recently been intensively pursued. Recent research mainly concentrates on interference mitigation; other RRM aspects, such as radio resource utilization, fairness, complexity, and QoS, have not been given as much attention. In this paper, we aim to provide an overview of the key challenges arising from HetNets and highlight their importance. Subsequently, we present a comprehensive survey of the RRM schemes that have been studied in recent years for LTE/LTE-A HetNets, with a particular focus on those for femtocells and relay nodes. Furthermore, we classify these RRM schemes according to their underlying approaches. In addition, these RRM schemes are qualitatively analyzed and compared to each other. We also identify a number of potential research directions for future RRM development. Finally, we discuss the gaps in current RRM research and the importance of multi-objective RRM studies.

    Efficient Scheduling Algorithms for Wireless Resource Allocation and Virtualization in Wireless Networks

    The continuing growth in demand for better mobile broadband experiences has motivated rapid development of radio-access technologies to support high data rates and improve quality of service (QoS) and quality of experience (QoE) for mobile users. However, modern radio-access technologies pose new challenges to mobile network operators (MNOs) and wireless device designers, such as reducing the total cost of ownership while supporting high data throughput per user, and extending the battery life-per-charge of mobile devices. In this thesis, a variety of optimization techniques aimed at providing innovative solutions for such challenges are explored. The thesis is divided into two parts. In the first part, the challenge of extending battery life-per-charge is addressed, and optimal and suboptimal power-efficient schedulers that minimize the total transmit power while meeting the QoS requirements of the users are presented. The second part outlines the benefits and challenges of deploying the wireless resource virtualization (WRV) concept as a promising solution for satisfying the growing demand for mobile data and reducing capital and operational costs. First, a WRV framework is proposed for a single-cell zone that is able to centralize and share the spectrum resources between multiple MNOs. Subsequently, several WRV frameworks are proposed that virtualize the spectrum resources of the entire network for the cloud radio access network (C-RAN), one of the front runners for the next-generation network architecture. The main contributions of this thesis are in designing optimal and suboptimal solutions for the aforementioned challenges. In most cases, the optimal solutions suffer from high complexity, and therefore low-complexity suboptimal solutions are provided for practical systems; the optimal solutions are used as benchmarks for evaluating them. The results show that the proposed solutions effectively contribute to addressing the challenges caused by the demands for high data rates and transmission power in mobile networks.
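
    A minimal sketch of the power-minimization step described above: for a single user on one resource block, the smallest transmit power meeting a rate target follows from inverting the Shannon capacity formula. The channel gains, noise density and rate targets below are assumed values for illustration only.

```python
# Minimum transmit power per user from the inverted Shannon formula:
# rate = B * log2(1 + gain * P / (N0 * B))  =>  solve for P.
import math

B = 180e3            # bandwidth of one LTE resource block (Hz)
N0 = 4e-21           # noise power spectral density (W/Hz), assumed
users = {
    "ue1": {"gain": 1e-9, "rate_req": 200e3},   # channel power gain, target bps
    "ue2": {"gain": 5e-10, "rate_req": 100e3},
}

for ue, p in users.items():
    snr_needed = 2 ** (p["rate_req"] / B) - 1   # SNR required for the rate target
    power = snr_needed * N0 * B / p["gain"]     # smallest power achieving that SNR
    print(f"{ue}: minimum transmit power {power * 1e3:.6f} mW")
```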

    Scheduling and Link Adaptation for Uplink SC-FDMA Systems - A LTE Case Study


    Efficient and Virtualized Scheduling for OFDM-Based High Mobility Wireless Communications Objects

    Service providers (SPs) using the long term evolution (LTE) radio platform are facing many challenges in accommodating the rapid expansion of mobile data usage. Modern technologies present new challenges to SPs, for example, reducing capital and operating expenditures while supporting high data throughput per customer, extending the battery life-per-charge of cell phone devices, and supporting high-mobility communications with a fast and seamless handover (HO) networking architecture. In this thesis, a variety of optimized techniques aimed at providing innovative solutions for such challenges are explored. The thesis is divided into three parts. The first part outlines the benefits and challenges of deploying a virtualized resource-sharing concept, wherein SPs applying different scheduling policies share the evolved Node B, allowing each SP to customize its service offering and meet its service requirements; this is a promising solution for reducing operational and capital expenditures, leading to potential energy savings, and supporting higher peak rates. The second part formulates the optimized power allocation problem in a virtualized scheme in LTE uplink systems, aiming to extend the mobile devices' battery utilization time per charge. The third part presents a proposed hybrid-HO (HY-HO) technique that can enhance system performance in terms of latency and HO reliability at the cell boundary for high-mobility objects (up to 350 km/h, where HOs occur more frequently). The main contributions of this thesis are in designing optimal binary integer programming-based and suboptimal heuristic (complexity-reducing) scheduling algorithms subject to exclusive and contiguous allocation, maximum transmission power, and rate constraints. Moreover, the HY-HO design, based on the combination of soft and hard HO, is able to enhance system performance in terms of latency, interruption time and reliability during HO. The results show that the proposed solutions effectively contribute to addressing the challenges caused by the demand for high data rates and power transmission in mobile networks, especially in virtualized resource-sharing scenarios that can support high data rates while improving quality of service (QoS).
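
    The HY-HO scheme itself is not specified in the abstract, but the kind of trigger it builds on can be sketched in the style of LTE's standard A3 event: hand over once the neighbour's RSRP exceeds the serving cell's by a hysteresis margin for a full time-to-trigger window. All measurement values below are assumptions.

```python
# A3-event-style handover trigger: neighbour RSRP must exceed serving RSRP
# by a hysteresis margin for TTT consecutive measurement periods.
HYSTERESIS_DB = 3.0     # hysteresis margin (dB)
TTT_SAMPLES = 3         # time-to-trigger, in measurement periods

def handover_decision(serving_rsrp, neighbour_rsrp):
    """Return the first sample index at which HO is triggered, else None."""
    streak = 0
    for i, (s, n) in enumerate(zip(serving_rsrp, neighbour_rsrp)):
        streak = streak + 1 if n > s + HYSTERESIS_DB else 0
        if streak >= TTT_SAMPLES:
            return i
    return None

# A fast-moving UE crossing the cell edge: serving fades, neighbour rises (dBm).
serving   = [-95, -97, -99, -101, -103, -105]
neighbour = [-100, -98, -97, -96, -95, -94]
print("HO triggered at sample:", handover_decision(serving, neighbour))
```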

    Opportunistic packet scheduling algorithms for beyond 3G wireless networks

    The new millennium has been labeled the century of the personal communications revolution or, more specifically, the digital wireless communications revolution. The introduction of new multimedia services has created higher loads on the available radio resources, and the task of the radio resource manager is to deliver different levels of quality for these services. Radio resources are scarce and need to be shared by many users; this sharing has to be carried out efficiently, avoiding, as much as possible, any waste of resources. A heuristic scheduler for SC-FDMA systems is proposed whose main objective is to organize scheduling in a way that maximizes a collective utility function. The heuristic is later extended to a multi-cell system where scheduling is coordinated between neighboring cells to limit interference. Inter-cell interference coordination is also examined with game theory to find the optimal resource allocation among cells in terms of the frequency bands allocated to cell-edge users, who suffer the most from interference. Activity control of users is examined in scheduling and admission control: in the admission part, the controller gradually integrates a new user into the system by probing to find the effect of the new user on existing connections, while in the scheduling part, the activity of users is adjusted according to their proximity to a requested quality-of-service level. Finally, feedback information in multi-carrier systems is studied, owing to its importance in maximizing the performance of opportunistic networks.
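
    The abstract does not spell out the collective utility function, so the sketch below uses the classic proportional-fair metric (instantaneous rate over smoothed throughput), which maximizes a sum-of-logarithms utility in the long run; the SC-FDMA contiguity aspect is not modeled here, and the fading distribution is assumed.

```python
# Opportunistic proportional-fair scheduling: each slot goes to the user
# maximizing instantaneous rate / average throughput.
import random

random.seed(1)
N_UE, N_SLOTS = 3, 1000
avg_thr = [1e-6] * N_UE          # smoothed throughput per user (avoid div by 0)
ALPHA = 0.05                     # smoothing factor for the throughput average

for _ in range(N_SLOTS):
    # Instantaneous achievable rates on a fading channel (assumed distribution).
    inst = [random.expovariate(1.0 / (u + 1)) for u in range(N_UE)]
    # PF metric favours good channels and starved users alike.
    chosen = max(range(N_UE), key=lambda u: inst[u] / avg_thr[u])
    for u in range(N_UE):
        served = inst[u] if u == chosen else 0.0
        avg_thr[u] = (1 - ALPHA) * avg_thr[u] + ALPHA * served

print("long-run throughputs:", [round(t, 2) for t in avg_thr])
```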

    Statistical priority-based uplink scheduling for M2M communications

    Currently, the worldwide network is witnessing major efforts to transform it from being the Internet of humans only to becoming the Internet of Things (IoT). It is expected that Machine Type Communication Devices (MTCDs) will overwhelm cellular networks with huge amounts of data collected from their environments to be sent to other remote MTCDs for processing, thus forming what is known as Machine-to-Machine (M2M) communications. Long Term Evolution (LTE) and LTE-Advanced (LTE-A) appear to be the best technologies to support M2M communications due to their native IP support. LTE can provide high capacity, flexible radio resource allocation and scalability, which are the required pillars for supporting the expected large numbers of deployed MTCDs. Supporting M2M communications over LTE faces many challenges, including medium access control and the allocation of radio resources among MTCDs. The problem of radio resource allocation, or scheduling, originates from the nature of M2M traffic: a large number of small data packets, with specific deadlines, generated by a potentially massive number of MTCDs. M2M traffic is therefore mostly in the uplink direction, i.e. from the MTCDs to the base station (known as the eNB in LTE terminology). These characteristics impose design requirements on M2M scheduling techniques, such as the need to carry a huge amount of traffic within certain deadlines using limited radio resources. This is the main motivation behind this thesis work. In this thesis, we introduce a novel M2M scheduling scheme that utilizes what we term the “statistical priority” in determining the importance of the information carried by data packets. Statistical priority is calculated based on statistical features of the data, such as value similarity, trend similarity and auto-correlation. These calculations are made and then reported by the MTCDs to the serving eNBs along with other reports, such as channel state. Statistical priority is then used to assign priorities to data packets so that the scarce radio resources are allocated to the MTCDs that are sending statistically important information. This helps avoid expending limited radio resources on redundant or repetitive data, which is a common situation in M2M communications. To validate our technique, we perform a simulation-based comparison between the main scheduling techniques and our proposed statistical priority-based scheduling technique, conducted in a network that includes different types of MTCDs, such as environmental monitoring sensors, surveillance cameras and alarms. The results show that the proposed statistical priority-based scheduler outperforms the other schedulers in having the lowest loss of alarm data packets and the highest rate of sending critical data packets that carry non-redundant information, for both environmental monitoring and video traffic. This indicates that the proposed technique makes the most efficient use of the limited radio resources compared to the other techniques.
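
    As a concrete reading of the statistical features named above, the sketch below scores a window of sensor readings by lag-1 autocorrelation (redundancy) and by the change relative to the last reported value (novelty). The exact weighting is an assumption; the thesis's actual formula may differ.

```python
# A toy "statistical priority": self-similar, unchanged data is likely
# redundant and gets low priority; sudden changes get high priority.
def lag1_autocorr(x):
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    if var == 0:
        return 1.0                      # constant signal: fully redundant
    cov = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    return cov / var

def statistical_priority(window, last_reported):
    redundancy = max(0.0, lag1_autocorr(window))          # clamp to [0, 1]
    value_change = abs(window[-1] - last_reported) / (abs(last_reported) + 1e-9)
    # High novelty raises priority; high redundancy lowers it.
    return min(1.0, value_change) * (1.0 - redundancy)

steady = [20.1, 20.0, 20.2, 20.1, 20.0, 20.1]     # e.g. a stable temperature
alarm  = [20.1, 20.0, 20.2, 20.1, 45.0, 80.3]     # sudden fire-alarm spike
print("steady sensor priority: %.3f" % statistical_priority(steady, 20.1))
print("alarm sensor priority : %.3f" % statistical_priority(alarm, 20.1))
```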

    On the Feasibility of Utilizing Commercial 4G LTE Systems for Mission-Critical IoT Applications

    Emerging Internet of Things (IoT) applications and services, including e-healthcare, intelligent transportation systems, the smart grid, and smart homes, smart cities, and smart workplaces, are poised to become part of every aspect of our daily lives. The IoT will enable billions of sensors, actuators, and smart devices to be interconnected and managed remotely via the Internet. Cellular-based Machine-to-Machine (M2M) communications is one of the key IoT enabling technologies, with huge market potential for cellular service providers deploying Long Term Evolution (LTE) networks. There is an emerging consensus that Fourth Generation (4G) and 5G cellular technologies will enable and support these applications, as they will provide the global mobile connectivity to the anticipated tens of billions of things/devices that will be attached to the Internet. Many vital utilities and service industries are considering the use of commercially available LTE cellular networks to provide critical connections to users, sensors, and smart M2M devices on their networks, due to their low cost and availability. Many of these emerging IoT applications are mission-critical, with stringent requirements in terms of reliability and end-to-end (E2E) delay bound. The delay bound specified for each application refers to the device-to-device latency, defined as the combined delay resulting from both application-level processing time and communication latency. Each IoT application has its own distinct performance requirements in terms of latency, availability, and reliability. Typically, uplink (UL) traffic of most of these IoT applications is the dominant network traffic (much higher than the total downlink (DL) traffic). Thus, efficient LTE UL scheduling algorithms at the base station ("Evolved NodeB (eNB)" per 3GPP standards) are more critical for M2M applications. LTE, however, was not originally intended for IoT applications, where traffic generated by M2M devices (running IoT applications) has totally different characteristics than traditional Human-to-Human (H2H)-based voice/video and data communications. In addition, due to the anticipated massive deployment of M2M devices and the limited available radio spectrum, the problem of efficient radio resource management (RRM) and UL scheduling poses a serious challenge in adopting LTE for M2M communications. Existing LTE quality of service (QoS) standards and UL scheduling algorithms were mainly optimized for H2H services and cannot accommodate the wide range of diverging performance requirements of these M2M-based IoT applications. Though 4G LTE networks can support a very low Packet Loss Ratio (PLR) at the physical layer, such reliability comes at the expense of latency increased from tens to hundreds of ms due to the aggressive use of retransmission mechanisms. Current 4G LTE technologies may satisfy a single performance metric of these mission-critical applications, but not the simultaneous support of ultra-high reliability and low latency as well as high data rates. Numerous QoS-aware LTE UL scheduling algorithms for supporting M2M applications as well as H2H services have been reported in the literature. Most of these algorithms, however, were not intended for the support of mission-critical IoT applications, as they are not latency-aware. In addition, these algorithms are simplified and do not fully conform to LTE's signaling and QoS standards.
For instance, a common practice is the assumption that the time-domain UL scheduler located at the eNB prioritizes user equipment (UE)/M2M device connection requests based on the head-of-line (HOL) packet waiting time at the UE/device transmission buffer. However, as will be detailed below, the LTE standard does not support a mechanism that enables the UEs/devices to inform the eNB uplink scheduler about the waiting time of uplink packets residing in their transmission buffers. The Ultra-Reliable Low-Latency Communication (URLLC) paradigm has recently emerged to enable a new range of mission-critical applications and services, including industrial automation, real-time operation and control of the smart grid, and inter-vehicular communications for improved safety and self-driving vehicles. URLLC is one of the most innovative 5G New Radio (NR) features. URLLC and its supporting 5G NR technologies might become a commercial reality, but perhaps only in the rather distant future; deploying viable mission-critical IoT applications would thus have to be postponed until URLLC and 5G NR technologies are commercially feasible. Because IoT applications, especially mission-critical ones, will have a significant impact on the welfare of all humanity, their immediate or near-term deployment is of utmost importance. It is the purpose of this thesis to explore whether current commercial 4G LTE cellular networks have the potential to support some of the emerging mission-critical IoT applications. The smart grid is selected in this work as an illustrative IoT example because it is one of the most demanding IoT applications, including diverse use cases ranging from mission-critical applications that have stringent requirements in terms of E2E latency and reliability to those that require support of a massive number of connected M2M devices with relaxed latency and reliability requirements. The purpose of this thesis is twofold. First, a user-friendly, MATLAB-based open-source software package for modeling commercial 4G LTE systems is developed. In contrast to mainstream commercial LTE software packages, the developed package is specifically tailored to accurately model mission-critical IoT applications and, above all, fully conforms to commercial 4G LTE signaling and QoS standards. Second, utilizing the developed software package, we present a detailed, realistic LTE UL performance analysis to assess the feasibility of commercial 4G LTE cellular networks when used to support such a diverse set of emerging IoT applications as well as typical H2H services.
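
    To illustrate what a latency-aware uplink metric could look like under the signaling constraint just described, the sketch below uses an M-LWDF-style rule in which the eNB estimates head-of-line delay as the age of the oldest unserved Buffer Status Report, since LTE provides no direct HOL-delay report. The estimation rule and all numbers are assumptions, not the thesis's algorithm.

```python
# M-LWDF-style latency-aware UL prioritization with an eNB-side HOL
# delay *estimate* derived from Buffer Status Report (BSR) ages.
import math

now_ms = 1000.0
devices = {
    # oldest pending BSR arrival (ms), delay budget (ms), target PLR,
    # instantaneous rate and average throughput (arbitrary units)
    "alarm": {"bsr_t": 940.0, "budget": 100.0,  "plr": 1e-3, "inst": 2.0, "avg": 1.0},
    "meter": {"bsr_t": 400.0, "budget": 5000.0, "plr": 1e-1, "inst": 3.0, "avg": 1.0},
}

def mlwdf_metric(d):
    hol_est = now_ms - d["bsr_t"]                 # eNB-side HOL delay estimate
    weight = -math.log(d["plr"]) / d["budget"]    # stricter QoS => larger weight
    return weight * hol_est * d["inst"] / d["avg"]

ranked = sorted(devices, key=lambda k: mlwdf_metric(devices[k]), reverse=True)
for name in ranked:
    print(name, "metric %.3f" % mlwdf_metric(devices[name]))
```

    Note how the alarm device wins despite its smaller buffer age: its tight delay budget and low target loss ratio dominate the metric, which is the behaviour a latency-aware scheduler needs.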

    Optimized resource allocation techniques for critical machine-type communications in mixed LTE networks

    Get PDF
    To implement the revolutionary Internet of Things (IoT) paradigm, the evolution of communication networks to incorporate machine-type communications (MTC), in addition to conventional human-type communications (HTC), has become inevitable. Critical MTC, in contrast to massive MTC, represents the type of communications that requires high network availability, ultra-high reliability, very low latency, and high security, to enable what is known as mission-critical IoT. Because cellular networks are considered one of the most promising wireless technologies for serving critical MTC, the International Telecommunication Union (ITU) targets critical MTC as a major use case, along with enhanced mobile broadband (eMBB) and massive MTC, in the design of the upcoming generation of cellular networks. Therefore, the Third Generation Partnership Project (3GPP) is evolving the current Long-Term Evolution (LTE) standard to efficiently serve critical MTC and fulfill the fifth-generation (5G) requirements, using the evolved LTE (eLTE) in addition to the new radio (NR). In this regard, 3GPP has introduced several enhancements in the latest releases to support critical MTC in LTE, which was designed mainly for HTC. However, guaranteeing stringent quality of service (QoS) for critical MTC without sacrificing that of conventional HTC is a challenging task from the radio resource management perspective. In this dissertation, we optimize the resource allocation and scheduling process for critical MTC in mixed LTE networks under different operational and implementation cases. We target maximizing the overall system utility while providing accurate guarantees for the QoS requirements of critical MTC, through a cross-layer design, as well as those of HTC. For this purpose, we utilize advanced techniques from queueing theory and mathematical optimization. In addition, we adopt heuristic approaches and matching-based techniques to design computationally efficient resource allocation schemes for use in practice, and we analyze the proposed methods from a practical perspective. Furthermore, we run extensive simulations to evaluate the performance of the proposed techniques, validate the theoretical analysis, and compare the performance with other schemes. The simulation results reveal close-to-optimal performance for the proposed algorithms, which outperform other techniques from the literature.
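
    One textbook instance of the queueing-theoretic QoS guarantees mentioned above: modeling a critical-MTC flow as an M/M/1 queue, the response time is exponential with rate (mu - lambda), so a delay bound with a violation-probability target translates directly into a minimum service rate. Treating the flow as M/M/1, and all numbers below, are illustrative assumptions.

```python
# M/M/1 delay guarantee: P(delay > D) = exp(-(mu - lam) * D) <= eps
# gives the minimum service rate mu >= lam + ln(1/eps) / D.
import math

lam = 200.0          # packet arrival rate (packets/s), assumed
D_max = 0.010        # delay bound (s), e.g. a critical-MTC budget
eps = 1e-5           # allowed delay-violation probability

mu_min = lam + math.log(1.0 / eps) / D_max
print("minimum service rate: %.0f packets/s" % mu_min)
print("check P(delay > D): %.2e" % math.exp(-(mu_min - lam) * D_max))
```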