
    Optimal Compression and Transmission Rate Control for Node-Lifetime Maximization

    We consider a system composed of an energy-constrained sensor node and a sink node, and devise optimal data compression and transmission policies with the objective of prolonging the lifetime of the sensor node. While applying compression before transmission reduces the energy cost of transmitting the sensed data, compressing too aggressively may consume more energy than transmitting the raw data, thereby defeating its purpose. Hence, it is important to investigate the trade-off between data compression and transmission energy costs. In this paper, we study the joint optimal compression-transmission design in three scenarios which differ in the channel information available at the sensor node and cover a wide range of practical situations. We formulate and solve joint optimization problems aiming to maximize the lifetime of the sensor node whilst satisfying specific delay and bit error rate (BER) constraints. Our results show that a jointly optimized compression-transmission policy achieves a significantly longer lifetime (90% to 2000%) compared to optimizing transmission alone without compression. Importantly, this performance advantage is most pronounced when the delay constraint is stringent, which demonstrates its suitability for low-latency communication in future wireless networks. Comment: accepted for publication in IEEE Transactions on Wireless Communications
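
    The compression-transmission trade-off described in this abstract can be sketched numerically. The Python snippet below uses a hypothetical compression-energy model, in which the cost grows steeply as the output size shrinks, to show that a moderate compression ratio can beat raw transmission while an overly aggressive one cannot; the model form and every parameter value are illustrative assumptions, not the paper's.

```python
import numpy as np

# Hypothetical energy model (not the paper's): transmitting b bits costs
# e_tx per bit, while compressing the payload down to a fraction r of its
# original size costs energy that grows steeply as r shrinks.
def total_energy(bits, r, e_tx=1.0e-6, e_comp0=0.01e-6, alpha=3.0):
    e_compress = e_comp0 * bits * ((1.0 / r) ** alpha - 1.0)  # assumed cost curve
    e_transmit = e_tx * bits * r                              # fewer bits on air
    return e_compress + e_transmit

bits = 1_000_000
ratios = np.linspace(0.05, 1.0, 200)
energies = [total_energy(bits, r) for r in ratios]
best = ratios[int(np.argmin(energies))]
print(f"raw transmission energy : {total_energy(bits, 1.0):.3f} J")
print(f"best compression ratio  : {best:.2f}")
print(f"energy at best ratio    : {min(energies):.3f} J")
```

    With these assumed numbers the minimum lies at an intermediate ratio, mirroring the paper's observation that neither raw transmission nor maximal compression is optimal.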

    Analysis and Design of Communication Policies for Energy-Constrained Machine-Type Devices

    This thesis focuses on the modelling, analysis and design of novel communication strategies for wireless machine-type communication (MTC) systems to realize the notion of the Internet of Things (IoT). We consider sensor-based MTC devices which acquire physical information from the environment and transmit it to a base station (BS) while satisfying application-specific quality-of-service (QoS) requirements. Due to their wireless and unattended operation, these MTC devices are mostly battery-operated and severely energy-constrained. In addition, MTC systems require low latency, perpetual operation, massive access, and so on. Motivated by these critical requirements, this thesis proposes optimal data communication policies for four different network scenarios. In the first two scenarios, each MTC device transmits data on a dedicated orthogonal channel and either (i) possesses an initially fully charged battery of finite capacity, or (ii) possesses the ability to harvest energy and store it in a battery of finite capacity. In the other two scenarios, all MTC devices share a single channel and either (iii) are allocated individual non-overlapping transmission times, or (iv) randomly transmit data in predefined time slots. The proposed techniques and the insights gained from this thesis aim to better utilize the limited energy resources of machine-type devices in order to effectively serve future wireless networks.
    Firstly, we consider a sensor-based MTC device that communicates with a BS, and devise optimal data compression and transmission policies with the objective of prolonging the device lifetime. We formulate joint optimization problems aiming to maximize the device lifetime whilst satisfying delay and bit-error-rate constraints. Our results show a significant improvement in device lifetime; importantly, the gain is most pronounced in the low-latency regime.
    Secondly, we consider a sensor-based MTC device that is served by a hybrid BS which wirelessly transfers power to the device and receives data transmissions from the device. The MTC device employs data compression in order to reduce the energy cost of data transmission. We therefore propose to jointly optimize the harvesting time, compression and transmission design to minimize the energy cost of the system under a given delay constraint. The proposed scheme reduces energy consumption by up to 19% when data compression is employed.
    Thirdly, we consider multiple MTC devices that transmit data to a BS using time division multiple access (TDMA). Conventionally, the energy-efficiency performance in TDMA is optimized through multi-user scheduling, i.e., changing the transmission time allocated to different devices; in such a system, the sequence of devices for transmission, i.e., who transmits first, who transmits second, and so on, has no impact on the energy efficiency. We consider that data compression is performed before transmission, and we jointly optimize multi-user sequencing and scheduling along with the compression and transmission rates. Our results show that multi-user sequence optimization achieves up to a 45% improvement in the energy efficiency of MTC devices.
    Lastly, we consider contention resolution diversity slotted ALOHA (CRDSA) with transmit power diversity, where each packet copy from a device is transmitted at a randomly selected power level. This results in inter-slot received power diversity, which is exploited by employing a signal-to-interference-plus-noise ratio (SINR) based successive interference cancellation (SIC) receiver. We propose a message passing algorithm to model the SIC decoding and formulate an optimization problem to determine the optimal transmit power distribution subject to energy constraints. We show that the proposed strategy improves the supported system load by up to 88% for massive MTC systems.
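
    The final contribution above lends itself to a quick Monte Carlo illustration. The sketch below simulates one CRDSA frame in which each device sends two copies of its packet in randomly chosen slots at randomly chosen power levels, and an SINR-threshold SIC receiver iteratively decodes and cancels packets across slots. The decoding model, the two power levels and all other parameters are assumptions made for illustration; the thesis's message-passing analysis is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def crdsa_throughput(n_devices=60, n_slots=100, copies=2,
                     power_levels=(1.0, 4.0), sinr_th=2.0, noise=0.1,
                     max_iters=20):
    """Monte Carlo sketch of CRDSA with per-copy random transmit power and
    SINR-based SIC at the receiver (illustrative parameters only)."""
    # Each device picks `copies` distinct slots and a random power per copy.
    slots = [rng.choice(n_slots, size=copies, replace=False) for _ in range(n_devices)]
    powers = [rng.choice(power_levels, size=copies) for _ in range(n_devices)]
    decoded = np.zeros(n_devices, dtype=bool)

    for _ in range(max_iters):
        progress = False
        # Received power per slot from all still-undecoded devices.
        slot_power = np.full(n_slots, noise)
        for d in range(n_devices):
            if not decoded[d]:
                for s, p in zip(slots[d], powers[d]):
                    slot_power[s] += p
        # A copy is decodable when its SINR in that slot exceeds the threshold;
        # decoded packets are cancelled (in all their slots) on the next pass.
        for d in range(n_devices):
            if decoded[d]:
                continue
            for s, p in zip(slots[d], powers[d]):
                if p / (slot_power[s] - p) >= sinr_th:
                    decoded[d] = True
                    progress = True
                    break
        if not progress:
            break
    return decoded.sum() / n_slots

print(f"decoded packets per slot: {crdsa_throughput():.2f}")
```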

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Among several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions. Comment: 37 pages, 8 figures, 7 tables, submitted for possible publication in IEEE Communications Surveys and Tutorials
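
    To make the Q-learning direction mentioned above concrete, the toy sketch below lets a handful of devices learn, via stateless Q-learning, which random-access slot to pick so that collisions are avoided. The number of devices and slots, the reward of +1 for a collision-free transmission and -1 otherwise, and the learning parameters are all assumptions for illustration, not taken from any standard or from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def q_learning_access(n_devices=8, n_slots=8, episodes=2000, alpha=0.1, eps=0.1):
    """Stateless Q-learning sketch: each device keeps one Q-value per RA slot
    and is rewarded when it transmits alone in its chosen slot."""
    q = np.zeros((n_devices, n_slots))
    for _ in range(episodes):
        # Epsilon-greedy slot choice per device.
        choices = np.where(rng.random(n_devices) < eps,
                           rng.integers(0, n_slots, n_devices),
                           q.argmax(axis=1))
        counts = np.bincount(choices, minlength=n_slots)
        rewards = np.where(counts[choices] == 1, 1.0, -1.0)  # success iff alone in slot
        idx = np.arange(n_devices)
        q[idx, choices] += alpha * (rewards - q[idx, choices])
    final = q.argmax(axis=1)
    return len(set(final.tolist())), final

unique_slots, assignment = q_learning_access()
print(f"distinct slots learned by 8 devices: {unique_slots}")
print("slot chosen per device:", assignment)
```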

    Approximations of the aggregated interference statistics for outage analysis in massive MTC

    This paper presents several analytic closed-form approximations of the aggregated interference statistics within the framework of uplink massive machine-type communications (mMTC), taking into account the random activity of the sensors. Given its discrete nature and the large number of devices involved, a continuous approximation based on the Gram–Charlier series expansion of a truncated Gaussian kernel is proposed. We use this approximation to derive an analytic closed-form expression for the outage probability, corresponding to the event of the signal-to-interference-and-noise ratio being below a detection threshold. This metric is useful for evaluating the performance of mMTC systems. As an illustrative application of the approximation, we analyze a scenario with several multi-antenna collector nodes, each equipped with a set of predefined spatial beams. We consider two setups, namely single- and multiple-resource, in reference to the number of resources allocated to each beam. A graph-based approach that minimizes the average outage probability, based on the statistics approximation, is used as the allocation strategy. Finally, we describe an access protocol where the resource identifiers are broadcast (distributed) through the beams. Numerical simulations confirm the accuracy of the approximations and the benefits of the allocation strategy.
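
    The idea of replacing the discrete aggregate-interference distribution with a continuous surrogate can be illustrated with a much cruder stand-in than the paper's Gram–Charlier expansion: a plain Gaussian clipped at zero. The sketch below compares the Monte Carlo outage probability with the outage computed from this surrogate under assumed Bernoulli sensor activity and unit-mean exponential (Rayleigh-power) fading; every parameter value is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative setup (all values assumed): N potential interferers, each
# active with probability p_act, and an exponentially distributed desired signal.
N, p_act, noise, gamma_th, trials = 50, 0.05, 0.01, 0.5, 200_000

# Monte Carlo outage: P[ S / (I + noise) < gamma_th ]
s = rng.exponential(size=trials)                          # desired signal power
active = rng.random((trials, N)) < p_act                  # random sensor activity
gains = rng.exponential(size=(trials, N))                 # interferer fading
interference = (active * gains).sum(axis=1)
outage_mc = np.mean(s / (interference + noise) < gamma_th)

# Surrogate: Gaussian aggregate interference clipped at zero (a crude
# stand-in for the truncated-kernel Gram-Charlier expansion), averaged
# against the conditional outage of the exponential desired signal.
mu_i = N * p_act                                          # E[Bernoulli(p) * Exp(1)]
var_i = N * p_act * (2.0 - p_act)                         # Var[Bernoulli(p) * Exp(1)]
i_approx = np.maximum(rng.normal(mu_i, np.sqrt(var_i), trials), 0.0)
outage_approx = np.mean(1.0 - np.exp(-gamma_th * (i_approx + noise)))

print(f"Monte Carlo outage : {outage_mc:.3f}")
print(f"Approximate outage : {outage_approx:.3f}")
```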

    Optimizing Resource Allocation in EH-Enabled Internet of Things

    The Internet of Things (IoT) aims to bridge everyday physical objects via the Internet. Traditional energy-constrained wireless devices are powered by fixed energy sources such as batteries, which may require frequent replacement or recharging. Wireless Energy Harvesting (EH), as a promising solution, can potentially eliminate the need for recharging or replacing batteries. Unlike other types of green energy sources, wireless EH does not depend on nature and is thus a reliable source of energy for charging devices. Meanwhile, the rapid growth of IoT devices and wireless applications is likely to demand more operating frequency bands. Although the frequency spectrum is scarce, a considerable amount of it is greatly underutilized owing to inefficient conventional regulatory policies. Cognitive radio (CR) can mitigate the spectrum scarcity problem of IoT applications by leveraging these spectrum holes. Therefore, transforming the IoT network into a cognitive-radio-based IoT network is essential for utilizing the available spectrum opportunistically.
    To address the two aforementioned issues, a novel model is proposed to leverage wireless EH and CR for IoT. In particular, the sum rate of users is maximized for a CR-based IoT network enabled with wireless EH. Users operate in a time-switching fashion, and each time slot is partitioned into three non-overlapping parts devoted to EH, spectrum sensing and data transmission. There is a trade-off among the lengths of these three operations, and thus the time-slot structure must be optimized. The general problem of joint resource allocation and EH optimization is formulated as a mixed-integer nonlinear programming task, which is NP-hard and intractable. Therefore, a sub-channel allocation scheme is first proposed to approximately satisfy the users' rate requirements and remove the integer constraints. In the second step, the general optimization problem is reduced to a convex optimization task.
    Another optimization framework is also designed to capture a fundamental trade-off between energy efficiency (EE) and spectral efficiency for an EH-enabled IoT network. In particular, an EE maximization problem is formulated taking into consideration user buffer occupancy, data-rate fairness, energy-causality constraints and interference constraints. Then, a low-complexity heuristic algorithm is proposed to solve the resource allocation and EE optimization problem. The proposed algorithm is shown to achieve a near-optimal solution with polynomial complexity.
    To support Machine Type Communications (MTC) in next-generation mobile networks, NarrowBand-IoT (NB-IoT) has emerged as a promising solution providing extended coverage and low energy consumption for low-cost MTC devices. However, the existing orthogonal multiple access scheme in NB-IoT cannot provide connectivity for a massive number of MTC devices. In parallel with the development of NB-IoT, Non-Orthogonal Multiple Access (NOMA), introduced for fifth-generation wireless networks, is deemed to significantly improve the network capacity by providing massive connectivity through sharing the same spectral resources. To leverage NOMA in the context of NB-IoT, a power-domain NOMA scheme with user clustering is proposed for an NB-IoT system. In particular, the MTC devices are assigned to different ranks within the NOMA clusters, where they transmit over the same frequency resources. Then, an optimization problem is formulated to maximize the total throughput of the network by optimizing the resource allocation of MTC devices and the NOMA clustering while satisfying the transmission power and quality-of-service requirements. Furthermore, an efficient heuristic algorithm is designed to solve the proposed optimization problem by jointly optimizing NOMA clustering and the resource allocation of MTC devices.
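
    A minimal sketch of the uplink power-domain NOMA principle used in the last part of the abstract is given below: devices in one cluster share a subcarrier, and the BS decodes them in descending order of received power, cancelling each decoded signal before decoding the next. The three-device cluster, the channel gains and the transmit powers are hypothetical values chosen only to illustrate the rate computation, not the thesis's optimized clustering.

```python
import numpy as np

def noma_cluster_rates(powers, gains, noise=1.0):
    """Uplink power-domain NOMA sketch: the BS decodes devices in descending
    order of received power and cancels each decoded signal (SIC).
    Returns per-device rates in bit/s/Hz (illustrative model only)."""
    rx = np.asarray(powers) * np.asarray(gains)   # received powers at the BS
    order = np.argsort(rx)[::-1]                  # strongest decoded first
    rates = np.zeros(len(rx))
    remaining = rx.sum()
    for d in order:
        remaining -= rx[d]                        # residual interference after SIC
        rates[d] = np.log2(1.0 + rx[d] / (remaining + noise))
    return rates

# Hypothetical 3-device cluster sharing one subcarrier (assumed values).
powers = [0.2, 0.5, 1.0]      # transmit powers (W)
gains  = [1.5, 0.4, 0.1]      # channel power gains
rates = noma_cluster_rates(powers, gains)
print("per-device rates:", np.round(rates, 3))
print("cluster sum rate:", round(float(rates.sum()), 3))
```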

    A Comprehensive Overview on 5G-and-Beyond Networks with UAVs: From Communications to Sensing and Intelligence

    Due to the advancements in cellular technologies and the dense deployment of cellular infrastructure, integrating unmanned aerial vehicles (UAVs) into the fifth-generation (5G) and beyond cellular networks is a promising solution to achieve safe UAV operation as well as to enable diversified applications with mission-specific payload data delivery. In particular, 5G networks need to support three typical usage scenarios, namely, enhanced mobile broadband (eMBB), ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC). On the one hand, UAVs can be leveraged as cost-effective aerial platforms to provide ground users with enhanced communication services by exploiting their high cruising altitude and controllable maneuverability in three-dimensional (3D) space. On the other hand, providing such communication services simultaneously for both UAV and ground users poses new challenges due to the need for ubiquitous 3D signal coverage as well as the strong air-ground network interference. Besides the requirement of high-performance wireless communications, the ability to support effective and efficient sensing as well as network intelligence is also essential for 5G-and-beyond 3D heterogeneous wireless networks with coexisting aerial and ground users. In this paper, we provide a comprehensive overview of the latest research efforts on integrating UAVs into cellular networks, with an emphasis on how to exploit advanced techniques (e.g., intelligent reflecting surface, short packet transmission, energy harvesting, joint communication and radar sensing, and edge intelligence) to meet the diversified service requirements of next-generation wireless systems. Moreover, we highlight important directions for further investigation in future work. Comment: Accepted by IEEE JSAC

    Connectivity analysis in clustered wireless sensor networks powered by solar energy

    Emerging 5G communication paradigms, such as machine-type communication, have triggered an explosion in ad-hoc applications that require connectivity among the nodes of wireless networks. Ensuring reliable network operation under fading conditions is not straightforward, as the transmission schemes and the network topology, i.e., uniform or clustered deployments, affect the performance and should be taken into account. Moreover, as the number of nodes increases, exploiting natural energy sources and wireless energy harvesting (WEH) could be the key to eliminating maintenance costs while also immensely boosting the network lifetime. In this way, zero-energy wireless-powered sensor networks (WPSNs) could be achieved if all components are powered by green sources. Hence, designing accurate mathematical models that capture the network behavior under these circumstances is necessary to provide a deeper comprehension of such networks. In this paper, we provide an analytical model for the connectivity in a large-scale zero-energy clustered WPSN under two common transmission schemes, namely, unicast and broadcast. The sensors are WEH-enabled, while the network components are solar-powered and employ a novel energy allocation algorithm. In our results, we evaluate the trade-offs among the various scenarios via extensive simulations and identify the conditions that yield a fully connected zero-energy WPSN.
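
    A rough Monte Carlo counterpart of such a connectivity analysis is sketched below: nodes are scattered around random cluster centres, each node is assumed to have harvested enough energy with a fixed probability, and the network counts as connected when all energy-available nodes form a single component under a disc connectivity model. This simplistic model and all parameter values are assumptions for illustration and do not reproduce the paper's analytical solar/WEH model.

```python
import numpy as np

rng = np.random.default_rng(3)

def connectivity_probability(n_clusters=4, nodes_per_cluster=10, area=100.0,
                             cluster_sigma=5.0, tx_range=15.0, p_energy=0.9,
                             trials=500):
    """Monte Carlo sketch of full-connectivity probability in a clustered WSN
    with probabilistic energy availability (illustrative model only)."""
    n = n_clusters * nodes_per_cluster
    connected = 0
    for _ in range(trials):
        centres = rng.uniform(0, area, size=(n_clusters, 2))
        pts = (np.repeat(centres, nodes_per_cluster, axis=0)
               + rng.normal(0, cluster_sigma, size=(n, 2)))
        alive = rng.random(n) < p_energy          # node has harvested enough energy
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        adj = (d <= tx_range) & alive[:, None] & alive[None, :]
        alive_idx = np.flatnonzero(alive)
        if alive_idx.size == 0:
            continue
        # Depth-first search to test whether alive nodes form one component.
        visited = {alive_idx[0]}
        stack = [alive_idx[0]]
        while stack:
            u = stack.pop()
            for v in np.flatnonzero(adj[u]):
                if v not in visited:
                    visited.add(v)
                    stack.append(v)
        connected += len(visited) == alive_idx.size
    return connected / trials

print(f"estimated P(full connectivity): {connectivity_probability():.2f}")
```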

    Wireless Energy Harvesting with Amplify-and-Forward Relaying and Link Adaptation under Imperfect Feedback Channel

    Energy harvesting (EH) is an alternative approach to extend the lifetime of wireless communication systems and decrease their energy consumption, which results in fewer carbon emissions from wireless networks. In this study, adaptive modulation with an EH relay is proposed. A power-splitting mechanism for the EH relay is used: the relay harvests energy from the source signal and forwards the information to the destination. A genetic algorithm (GA) is applied for the optimisation of the power-splitting ratio at the relays. Two scenarios are considered, namely perfect and imperfect feedback channels. Results show that the spectral efficiency (SE) degradation due to an imperfect feedback channel is approximately 14% for conventional relays. The use of energy harvesting degrades the SE by approximately 19% in the case of a perfect feedback channel. Finally, increasing the number of energy-harvesting relays enhances the SE by 22%.
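
    The power-splitting trade-off at the heart of this scheme can be illustrated with a short sketch: the relay diverts a fraction of the received power to energy harvesting and uses the harvested power to forward the signal, so the end-to-end SNR first rises and then falls with the splitting ratio. The snippet below finds the best ratio by a simple grid search, used here only as a stand-in for the paper's genetic algorithm; the channel gains, conversion efficiency and noise power are assumed values.

```python
import numpy as np

def e2e_snr(rho, p_s=1.0, g1=0.8, g2=0.5, eta=0.7, noise=1e-2):
    """End-to-end SNR of a power-splitting amplify-and-forward relay
    (illustrative model); rho is the fraction of received power harvested."""
    snr1 = (1.0 - rho) * p_s * g1 / noise      # source -> relay (information branch)
    p_relay = eta * rho * p_s * g1             # power harvested and reused by the relay
    snr2 = p_relay * g2 / noise                # relay -> destination
    return snr1 * snr2 / (snr1 + snr2 + 1.0)   # standard two-hop AF approximation

rhos = np.linspace(0.01, 0.99, 99)
snrs = [e2e_snr(r) for r in rhos]
best = rhos[int(np.argmax(snrs))]
print(f"best power-splitting ratio  : {best:.2f}")
print(f"spectral efficiency at best : {0.5 * np.log2(1 + max(snrs)):.2f} bit/s/Hz")
```

    The 0.5 factor in the spectral-efficiency estimate accounts for the two-hop (half-duplex) transmission; in practice the GA would search the same ratio jointly with the adaptive-modulation mode.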