2,868 research outputs found

    Enhancing Radio Access Network Performance over LTE-A for Machine-to-Machine Communications under Massive Access

    Get PDF
    The expected tremendous growth of machine-to-machine (M2M) devices will require solutions to improve random access channel (RACH) performance. Recent studies have shown that radio access network (RAN) performance degrades under a high density of devices. In this paper, we propose three methods to enhance RAN performance for M2M communications over the LTE-A standard. The first method employs a different value for the physical RACH configuration index to increase random access opportunities. The second method addresses a heterogeneous network by using a number of picocells to increase resources and offload control traffic from the macro base station. The third method involves aggregation points and addresses their effect on RAN performance. Based on evaluation results, our methods improved RACH performance in terms of access success probability and average access delay.
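
    As a rough illustration of the first method, the Python sketch below estimates how the access success probability of a single RACH attempt grows with the number of PRACH opportunities per frame. The 54 contention preambles and the mapping from configuration index to opportunities per frame are common LTE-A values used here as assumptions, not parameters taken from the paper.

# Monte Carlo sketch: more PRACH opportunities per frame (selected via the
# PRACH configuration index) mean more (opportunity, preamble) slots per
# frame, so fewer first-attempt collisions under massive access.
import random
from collections import Counter

def rach_success_prob(n_devices, opportunities, n_preambles=54, trials=500):
    """Fraction of devices whose first attempt collides with nobody."""
    ok = 0
    for _ in range(trials):
        picks = Counter(
            (random.randrange(opportunities), random.randrange(n_preambles))
            for _ in range(n_devices)
        )
        ok += sum(1 for count in picks.values() if count == 1)
    return ok / (trials * n_devices)

for opportunities in (1, 2, 5):  # e.g. PRACH config indices 3, 6, 12 (FDD)
    print(f"{opportunities} opportunity(ies)/frame -> "
          f"P(success) ~ {rach_success_prob(500, opportunities):.3f}")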

    Next Generation M2M Cellular Networks: Challenges and Practical Considerations

    Get PDF
    In this article, we present the major challenges of future machine-to-machine (M2M) cellular networks, such as the spectrum scarcity problem and support for a massive number of low-power, low-cost devices. As an integral part of the future Internet of Things (IoT), the true vision of M2M communications cannot be reached with conventional solutions, which are typically cost-inefficient. The cognitive radio concept has emerged to tackle the spectrum under-utilization and scarcity problems. The heterogeneous network model is another alternative for increasing the number of covered users. To this end, we present a fundamental understanding and engineering knowledge of cognitive radios, the heterogeneous network model, and the power and cost challenges in the context of future M2M cellular networks.
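
    As a minimal illustration of the cognitive radio ingredient, the Python sketch below implements a toy energy detector, the simplest spectrum-sensing scheme a secondary M2M device could use to check whether a licensed channel is idle. The noise floor, decision threshold, and primary-user signal model are invented for illustration and do not come from the article.

# Toy energy-detection spectrum sensing: declare the channel busy when
# the measured energy exceeds the noise floor by a margin. All values
# below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def channel_busy(samples, noise_power=1.0, threshold_db=3.0):
    """True if the sample energy exceeds noise_power by threshold_db."""
    energy = np.mean(np.abs(samples) ** 2)
    return 10 * np.log10(energy / noise_power) > threshold_db

n = 1024
noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
primary = 2.0 * np.exp(2j * np.pi * 0.1 * np.arange(n))  # ~6 dB SNR tone

print("idle channel busy?    ", channel_busy(noise))            # expect False
print("occupied channel busy?", channel_busy(noise + primary))  # expect True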

    Data Aggregation and Packet Bundling of Uplink Small Packets for Monitoring Applications in LTE

    Full text link
    In cellular massive Machine-Type Communications (MTC), a device can transmit directly to the base station (BS) or through an aggregator (intermediate node). While direct device-BS communication has recently been the focus of 5G/3GPP research and standardization efforts, the use of aggregators remains a less explored topic. In this paper, we analyze the deployment scenarios in which aggregators can perform cellular access on behalf of multiple MTC devices. We study the effect of packet bundling at the aggregator, which alleviates overhead and resource waste when sending small packets. The aggregators give rise to a tradeoff between access congestion and resource starvation, and we show that packet bundling can minimize resource starvation, especially for smaller numbers of aggregators. Under the limitations of the considered model, we investigate the optimal settings of the network parameters in terms of the number of aggregators and packet-bundle size. Our results show that, in general, data aggregation can benefit uplink massive MTC in LTE by reducing the signalling overhead. Comment: to appear in IEEE Network
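
    A back-of-envelope sketch may help make the bundling trade concrete: each transmission pays a fixed protocol overhead, so bundling several small payloads behind one aggregator shrinks both the total bytes on the air and the number of random access contenders. The payload and header sizes below are illustrative assumptions, not the values used in the paper.

# Back-of-envelope model of uplink packet bundling: fewer, larger
# transmissions amortize the per-transmission overhead and thin out
# the random access load. All sizes are illustrative assumptions.
PAYLOAD = 20   # bytes per MTC report (assumed)
HEADER = 40    # per-transmission protocol overhead in bytes (assumed)

def uplink_cost(n_devices, bundle_size):
    """Return (total bytes on the air, RACH contenders per period)
    when reports are bundled in groups of bundle_size; a bundle size
    of 1 corresponds to direct device-BS access."""
    transmitters = -(-n_devices // bundle_size)  # ceiling division
    total_bytes = transmitters * HEADER + n_devices * PAYLOAD
    return total_bytes, transmitters

n = 10_000
for bundle in (1, 10, 50):
    size, contenders = uplink_cost(n, bundle)
    print(f"bundle size {bundle:3d}: {size:,} bytes, {contenders:,} contenders")

    In the paper's terms, pushing the bundle size up cuts signalling but concentrates traffic on fewer aggregators, which is where the congestion-versus-starvation tradeoff appears.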

    Goodbye, ALOHA!

    Get PDF
    The vision of the Internet of Things (IoT) to interconnect and Internet-connect everyday people, objects, and machines poses new challenges in the design of wireless communication networks. The design of medium access control (MAC) protocols has traditionally been an intense area of research due to its high impact on the overall performance of wireless communications. The majority of research activities in this field deal with variations of protocols based on ALOHA, either with or without listen-before-talk, i.e., carrier sense multiple access. These protocols operate well under low traffic loads and small numbers of simultaneous devices, but they suffer from congestion as the traffic load and the number of devices increase. For this reason, unless revisited, the MAC layer can become a bottleneck for the success of the IoT. In this paper, we provide an overview of the existing MAC solutions for the IoT, describing current limitations and envisioned challenges for the near future. Motivated by those, we identify a family of simple algorithms based on distributed queueing (DQ), which can operate for an infinite number of devices generating any traffic load and pattern. We describe the DQ mechanism and the most relevant existing studies of DQ in different scenarios. In addition, we provide a novel performance evaluation of DQ when applied to the IoT. Finally, we also include a description of the very first demo of DQ for use in the IoT.
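
    To make the DQ idea concrete, below is a minimal Python sketch of its tree-splitting contention rule: devices that collide in an access minislot form a group at the tail of a contention resolution queue (CRQ) and contend again when their group reaches the head, so every device eventually wins a collision-free slot. The three minislots and the simplified frame structure are assumptions for illustration; the real protocol also schedules collision-free data slots through a separate data transmission queue (DTQ).

# Minimal sketch of the distributed queueing (DQ) tree-splitting rule:
# colliding devices split into groups that re-contend in FIFO order,
# so the load per contention frame stays bounded regardless of the
# total number of devices. Frame details are simplified assumptions.
import random
from collections import deque

def dq_resolution_frames(n_devices, minislots=3, seed=1):
    """Return how many contention frames it takes to resolve n_devices."""
    random.seed(seed)
    crq = deque([list(range(n_devices))])  # one initial colliding group
    frames = 0
    while crq:
        frames += 1
        group = crq.popleft()              # only the head group contends
        slots = [[] for _ in range(minislots)]
        for dev in group:
            slots[random.randrange(minislots)].append(dev)
        for slot in slots:
            if len(slot) > 1:
                crq.append(slot)           # collision -> back of the CRQ
            # a lone device in a slot succeeds and would join the DTQ
    return frames

for n in (10, 100, 1000):
    print(n, "devices resolved in", dq_resolution_frames(n), "frames")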

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Full text link
    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services as well as scenarios of future wireless networks. Comment: 46 pages, 22 figures
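
    As a concrete taste of the reinforcement-learning branch the survey covers, the toy Python sketch below applies a tabular, bandit-style Q-update to a wireless-flavoured decision task: picking the least-congested of three channels. The channel success probabilities and hyperparameters are invented for illustration and are not from the article.

# Toy reinforcement learning for channel selection: a single-state
# (bandit-style) tabular Q-update learns which channel delivers most
# often. Probabilities and hyperparameters are assumptions.
import random

random.seed(0)
SUCCESS_P = [0.2, 0.5, 0.9]   # hidden per-channel delivery probability
q = [0.0, 0.0, 0.0]           # one state, three actions (channels)
alpha, epsilon = 0.1, 0.1     # learning rate and exploration rate

for step in range(5000):
    # epsilon-greedy: mostly exploit the best estimate, sometimes explore
    a = random.randrange(3) if random.random() < epsilon else q.index(max(q))
    reward = 1.0 if random.random() < SUCCESS_P[a] else 0.0
    q[a] += alpha * (reward - q[a])   # Q-update with no next state

print("learned Q-values:", [round(v, 2) for v in q])
print("preferred channel:", q.index(max(q)))  # expect channel 2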