12,947 research outputs found

    2D Proactive Uplink Resource Allocation Algorithm for Event Based MTC Applications

    We propose a two-dimensional (2D) proactive uplink resource allocation (2D-PURA) algorithm that aims to reduce the delay in event-based machine-type communications (MTC) applications. Specifically, when an event of interest occurs at a device, it tends to spread to the neighboring devices, so once one device has data to send to the base station (BS), its neighbors are highly likely to transmit shortly afterwards. We therefore propose to cluster the devices in the neighborhood of the event, also referred to as the disturbance region, into rings based on their distance from the original event, and to proactively allocate uplink resources to these rings to reduce latency. To evaluate the proposed algorithm, we analytically derive the mean uplink delay, the proportion of resources conserved by successful allocations, and the proportion of uplink resources wasted by unsuccessful allocations under 2D-PURA. Numerical results demonstrate that the proposed method reduces the mean uplink delay by over 16.5 and 27 percent compared with the 1D algorithm and the standard method, respectively.
    Comment: 6 pages, 6 figures. Published in the 2018 IEEE Wireless Communications and Networking Conference (WCNC).
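
    As a minimal sketch of the ring idea (the coordinates, ring width, and one-slot-per-device grant model below are illustrative assumptions, not the paper's parameters), devices can be binned into concentric rings around the event and granted uplink slots nearest ring first:

    import math

    def cluster_into_rings(devices, event_xy, ring_width):
        # Bin devices into concentric rings keyed by ring index;
        # index 0 is the ring closest to the event epicenter.
        rings = {}
        for dev_id, (x, y) in devices.items():
            dist = math.hypot(x - event_xy[0], y - event_xy[1])
            rings.setdefault(int(dist // ring_width), []).append(dev_id)
        return rings

    def proactive_schedule(rings):
        # Grant uplink slots ring by ring, nearest ring first, so the
        # devices most likely to transmit soonest wait the least.
        schedule, slot = {}, 0
        for ring_idx in sorted(rings):
            for dev_id in rings[ring_idx]:
                schedule[dev_id] = slot
                slot += 1
        return schedule

    # Example: event at the origin, four devices, 50 m wide rings.
    devices = {"a": (10, 0), "b": (60, 10), "c": (5, 40), "d": (120, 0)}
    print(proactive_schedule(cluster_into_rings(devices, (0, 0), 50.0)))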

    NOMA based resource allocation and mobility enhancement framework for IoT in next generation cellular networks

    With the unprecedented technological advances of the last two decades, ever more devices are connected to the internet, forming what is called the Internet of Things (IoT). IoT devices with heterogeneous characteristics and quality of experience (QoE) requirements may engage in a dynamic spectrum market because radio resources are scarce. We propose a framework that efficiently quantifies and supplies radio resources to IoT devices by developing intelligent systems. The primary goal of the paper is to study the characteristics of next-generation cellular networks with non-orthogonal multiple access (NOMA) to enable connectivity for clustered IoT devices. First, we demonstrate how the distribution and QoE requirements of IoT devices affect the number of radio resources required in real time. Second, we show that an extended auction algorithm, implemented as a series of complementary functions, enhances radio resource utilization efficiency. The results show a substantial reduction in the number of sub-carriers required compared with conventional orthogonal multiple access (OMA), and the intelligent clustering is scalable and adaptable to the cellular environment. The ability to move borrowed spectrum from one cluster to others when a cluster has fewer users, or when its users move out of the boundary, is another feature that contributes to the reported radio resource utilization efficiency. Moreover, the proposed framework gives IoT service providers a cost estimate with which to control their spectrum acquisition while achieving the required quality of service (QoS) for both guaranteed bit rate (GBR) and non-guaranteed bit rate (non-GBR) traffic.
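
    As a toy illustration of why power-domain NOMA can reduce the sub-carrier count relative to OMA (the pairing rule and channel gains below are invented for illustration and are not the paper's auction mechanism), pairing a strong and a weak user on each shared sub-carrier roughly halves the sub-carriers needed:

    def noma_pairing(channel_gains):
        # Sort users by channel gain, then pair the strongest remaining
        # user with the weakest so successive interference cancellation
        # (SIC) can separate them at the receiver.
        order = sorted(channel_gains, key=channel_gains.get)
        pairs = []
        while len(order) >= 2:
            weak, strong = order.pop(0), order.pop(-1)
            pairs.append((strong, weak))
        if order:  # odd user count: last user gets its own sub-carrier
            pairs.append((order.pop(),))
        return pairs

    gains = {"u1": 0.9, "u2": 0.1, "u3": 0.5, "u4": 0.05, "u5": 0.7}
    pairs = noma_pairing(gains)
    print(f"NOMA uses {len(pairs)} sub-carriers vs {len(gains)} for OMA")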

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, because of the complex, heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML, elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling wireless-network applications, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to help readers clarify the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks.
    Comment: 46 pages, 22 figures.
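
    As a concrete taste of the reinforcement-learning branch in a wireless setting (a stateless, bandit-style sketch; the channel idle probabilities are invented for illustration), a Q-learning agent can learn which channel is most often idle:

    import random

    # Toy Q-learning over three channels; reward is 1 when the chosen
    # channel happens to be idle. Q-values converge toward each
    # channel's idle probability.
    idle_prob = [0.2, 0.8, 0.5]
    q = [0.0] * len(idle_prob)
    alpha, epsilon = 0.1, 0.1

    for step in range(5000):
        if random.random() < epsilon:          # explore
            a = random.randrange(len(q))
        else:                                  # exploit best estimate
            a = max(range(len(q)), key=q.__getitem__)
        reward = 1.0 if random.random() < idle_prob[a] else 0.0
        q[a] += alpha * (reward - q[a])        # stateless TD update

    print([round(v, 2) for v in q])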

    Fronthaul-Constrained Cloud Radio Access Networks: Insights and Challenges

    As a promising paradigm for fifth-generation (5G) wireless communication systems, cloud radio access networks (C-RANs) have been shown to reduce both capital and operating expenditures and to provide high spectral efficiency (SE) and energy efficiency (EE). The fronthaul in such networks, defined as the transmission link between a baseband unit (BBU) and a remote radio head (RRH), requires high capacity but is often constrained. This article comprehensively surveys recent advances in fronthaul-constrained C-RANs, including system architectures and key techniques. In particular, it discusses key techniques for alleviating the impact of the constrained fronthaul on SE, EE, and quality of service for users, including compression and quantization, large-scale coordinated processing and clustering, and resource allocation optimization. Open issues in software-defined networking, network function virtualization, and partial centralization are also identified.
    Comment: 5 figures, accepted by IEEE Wireless Communications. arXiv admin note: text overlap with arXiv:1407.3855 by other authors.
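
    As a toy sketch of the fronthaul compression-and-quantization tradeoff (the uniform quantizer and Gaussian test samples are illustrative assumptions, not a surveyed scheme), fewer bits per baseband sample lowers the fronthaul rate at the cost of quantization noise:

    import numpy as np

    def quantize(samples, bits):
        # Uniform quantizer with step size scaled to the signal peak.
        levels = 2 ** bits
        step = 2 * np.max(np.abs(samples)) / levels
        return np.round(samples / step) * step

    rng = np.random.default_rng(0)
    i_samples = rng.normal(size=1000)  # in-phase baseband samples
    for bits in (4, 8, 12):
        err = quantize(i_samples, bits) - i_samples
        sqnr = 10 * np.log10(np.var(i_samples) / np.var(err))
        print(f"{bits} bits/sample -> SQNR ~ {sqnr:.1f} dB")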