
    Provisioning quality-of-service to energy harvesting wireless communications

    Energy harvesting (EH) is an innovative way to build long-term, self-sustaining wireless networks. However, an inconstant EH rate can degrade the quality of service (QoS) of wireless traffic, for example through packet delay and packet errors. In this article we discuss techniques that provide QoS guarantees to EH-powered wireless communications. A new "dynamic string tautening" method is presented that produces the most energy-efficient schedule with substantially lower complexity than convex optimization techniques. The method adapts to bursty arrivals of wireless traffic and harvested energy, and ensures that delay-sensitive data are delivered by their deadlines. Comprehensive designs of EH-powered transmitters are also discussed, in which the EH rate, battery capacity, and deadline requirement can be jointly adjusted to balance QoS against cost. Peer reviewed
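    To make the string-tautening idea concrete: the schedule can be pictured as a string pulled taut between two cumulative curves, a lower curve fixed by deadlines (bits that must have departed by each slot) and an upper curve fixed by data arrivals and harvested energy (bits that can have departed by each slot). The sketch below is a generic taut-string construction in Python, not the article's dynamic string tautening algorithm; the curves L and U are assumed inputs that would be derived from traffic arrivals, deadlines, battery capacity, and the EH rate.

```python
def taut_string_schedule(L, U):
    """Minimal taut-string ("string tautening") scheduling sketch.

    L[t]: minimum cumulative bits that must have departed by slot t (deadlines).
    U[t]: maximum cumulative bits that can have departed by slot t
          (limited by data arrivals and harvested energy).
    Assumes len(L) == len(U), L[0] == U[0] == 0, L[t] <= U[t] for all t,
    and L[-1] == U[-1] (all data is eventually delivered).
    Returns the vertices (slot, cumulative bits) of the piecewise-linear
    schedule; each segment's slope is the constant transmission rate to use.
    """
    T = len(L) - 1
    vertices = [(0, 0.0)]
    t0, d0 = 0, 0.0
    while t0 < T:
        lo, hi = float("-inf"), float("inf")   # feasible slope window so far
        lo_t, hi_t = t0, t0                    # slots where lo / hi were last tightened
        nxt = None
        for t in range(t0 + 1, T + 1):
            s_low = (L[t] - d0) / (t - t0)     # slope needed to just meet the deadline at t
            s_up = (U[t] - d0) / (t - t0)      # slope allowed by arrivals/energy at t
            if s_low > hi:                     # a deadline binds: bend on the upper curve
                nxt = (hi_t, U[hi_t])
                break
            if s_up < lo:                      # availability binds: bend on the lower curve
                nxt = (lo_t, L[lo_t])
                break
            if s_low > lo:
                lo, lo_t = s_low, t
            if s_up < hi:
                hi, hi_t = s_up, t
        if nxt is None:                        # no bend: a straight segment reaches the horizon
            nxt = (T, float(L[T]))
        vertices.append(nxt)
        t0, d0 = nxt
    return vertices


if __name__ == "__main__":
    # Toy example: 3 slots, deadline curve L and arrival/energy curve U.
    L = [0, 0, 7, 9]
    U = [0, 3.4, 7, 9]
    print(taut_string_schedule(L, U))   # [(0, 0.0), (1, 3.4), (2, 7), (3, 9.0)]
```

    Keeping each segment's rate constant is what saves energy: for a transmit power that is convex in the rate, sending a fixed amount of data at a steady rate consumes no more energy than any fluctuating rate over the same interval.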

    Energy-Efficient Optimization for Wireless Information and Power Transfer in Large-Scale MIMO Systems Employing Energy Beamforming

    In this letter, we consider a large-scale multiple-input multiple-output (MIMO) system in which the receiver must harvest energy from the transmitter via wireless power transfer to support its own wireless information transmission. Energy beamforming in the large-scale MIMO system is used to address the challenging problem of long-distance wireless power transfer. Furthermore, given the limited power in such a system, this letter focuses on maximizing the energy efficiency of information transmission (bits per Joule) while satisfying the quality-of-service (QoS) requirement, i.e., a delay constraint, by jointly optimizing the transfer duration and the transmit power. By solving the optimization problem, we derive an energy-efficient resource allocation scheme. Numerical results validate the effectiveness of the proposed scheme.
    Comment: 4 pages, 3 figures. IEEE Wireless Communications Letters 201
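    The trade-off being optimized can be illustrated with a crude numerical sketch: under an assumed harvest-then-transmit frame (the transmitter beams power for a fraction tau of the frame, then the receiver transmits information at power p using only the harvested energy), bits per Joule can be swept over (tau, p) by grid search. The parameter values, channel/energy model, and function name max_ee below are illustrative assumptions, not the letter's system model or its derived allocation scheme.

```python
import math

def max_ee(P_beam=1.0, eta=0.5, g_dl=1e-3, g_ul=1e-3, B=1e6, N0B=1e-12,
           T=1e-2, D_min=1e4, P_c=1e-4, pa_eff=0.35, p_max=0.1, grid=200):
    """Coarse grid search over (tau, p) for the energy-efficiency trade-off.

    Assumed harvest-then-transmit model (a sketch only):
      - for tau*T seconds the transmitter beams P_beam; the receiver harvests
        eta * P_beam * g_dl watts,
      - for (1 - tau)*T seconds the receiver transmits at RF power p, drawing
        p / pa_eff + P_c watts from its (harvested) energy store,
      - at least D_min bits must be delivered within the frame, a crude
        stand-in for the delay (QoS) constraint.
    Returns (bits per Joule, tau, p) for the best feasible grid point,
    or None if no grid point is feasible.
    """
    best = None
    for i in range(1, grid):
        tau = i / grid
        e_harv = eta * P_beam * g_dl * tau * T        # Joules harvested
        t_tx = (1 - tau) * T                          # seconds left for information
        for j in range(1, grid + 1):
            p = p_max * j / grid
            e_used = (p / pa_eff + P_c) * t_tx
            if e_used > e_harv:                       # energy causality
                continue
            bits = t_tx * B * math.log2(1 + p * g_ul / N0B)
            if bits < D_min:                          # delay/QoS constraint
                continue
            ee = bits / e_used                        # bits per Joule
            if best is None or ee > best[0]:
                best = (ee, tau, p)
    return best

print(max_ee())
```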

    Online Learning for Offloading and Autoscaling in Energy Harvesting Mobile Edge Computing

    Mobile edge computing (a.k.a. fog computing) has recently emerged to enable in-situ processing of delay-sensitive applications at the edge of mobile networks. Providing grid power to mobile edge computing, however, is costly and sometimes infeasible (in certain rugged or under-developed areas), making on-site renewable energy a major or even sole power supply in a growing number of scenarios. Nonetheless, the high intermittency and unpredictability of renewable energy make it very challenging to deliver a high quality of service to users in energy harvesting mobile edge computing systems. In this paper, we address the challenge of incorporating renewables into mobile edge computing and propose an efficient reinforcement-learning-based resource management algorithm, which learns on the fly the optimal policy of dynamic workload offloading (to the centralized cloud) and edge server provisioning, so as to minimize the long-term system cost (including both service delay and operational cost). Our online learning algorithm uses a decomposition of (offline) value iteration and (online) reinforcement learning, thus achieving a significant improvement in learning rate and run-time performance over standard reinforcement learning algorithms such as Q-learning. We prove the convergence of the proposed algorithm and analytically show that the learned policy has a simple monotone structure amenable to practical implementation. Our simulation results validate the efficacy of our algorithm, which significantly improves edge computing performance compared to fixed or myopic optimization schemes and conventional reinforcement learning algorithms.
    Comment: arXiv admin note: text overlap with arXiv:1701.01090 by other authors
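    The underlying decision problem (how much workload to offload to the cloud and how many edge servers to keep powered on, given the current workload and battery) can be sketched with a toy tabular Q-learning loop. This is only the conventional baseline such papers compare against, not the proposed algorithm with its value-iteration/reinforcement-learning decomposition, and the state space, dynamics, and cost weights below are invented purely for illustration.

```python
import random

# Toy MDP: state = (workload level, battery level), action = (offload share,
# number of active edge servers); cost = delay proxy + backup-power penalty.
WORKLOADS = range(4)          # 0..3 units of arriving workload per slot
BATTERY = range(5)            # 0..4 units of stored energy
OFFLOAD = [0.0, 0.5, 1.0]     # share of workload sent to the remote cloud
SERVERS = [0, 1, 2]           # edge servers kept powered on
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1

def step(w, b, off, m):
    """One-slot toy dynamics: returns (cost, next workload, next battery)."""
    local = w * (1 - off)
    proc = local / m if m else 10.0 * local        # big penalty if nothing is on
    delay = proc + 2.0 * w * off                   # local processing + WAN delay proxy
    backup = max(0, m - b)                         # backup power if the battery is short
    harvest = random.choice([0, 1, 2])             # intermittent renewable arrival
    b_next = min(max(b - m, 0) + harvest, max(BATTERY))
    w_next = random.choice(WORKLOADS)              # i.i.d. workload for simplicity
    return delay + 3.0 * backup, w_next, b_next

Q = {(w, b, a, m): 0.0 for w in WORKLOADS for b in BATTERY
     for a in range(len(OFFLOAD)) for m in range(len(SERVERS))}

def greedy(w, b):
    return min(((a, m) for a in range(len(OFFLOAD)) for m in range(len(SERVERS))),
               key=lambda am: Q[(w, b, am[0], am[1])])

w, b = 0, 2
for _ in range(50_000):
    if random.random() < EPS:                      # epsilon-greedy exploration
        a, m = random.randrange(len(OFFLOAD)), random.randrange(len(SERVERS))
    else:
        a, m = greedy(w, b)
    cost, w2, b2 = step(w, b, OFFLOAD[a], SERVERS[m])
    a2, m2 = greedy(w2, b2)
    target = cost + GAMMA * Q[(w2, b2, a2, m2)]    # minimize discounted long-term cost
    Q[(w, b, a, m)] += ALPHA * (target - Q[(w, b, a, m)])
    w, b = w2, b2

print({(w, b): greedy(w, b) for w in WORKLOADS for b in BATTERY})
```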

    GreenDelivery: Proactive Content Caching and Push with Energy-Harvesting-based Small Cells

    The explosive growth of mobile multimedia traffic calls for scalable wireless access with high quality of service and low energy cost. Motivated by emerging energy harvesting communications and the trend of caching multimedia contents at the access edge and user terminals, we propose a paradigm-shifting framework, GreenDelivery, that enables efficient content delivery with energy-harvesting-based small cells. To resolve the two-dimensional randomness of energy harvesting and content request arrivals, proactive caching and push are jointly optimized with respect to the content popularity distribution and battery states. We thus develop a novel way of understanding the interplay between content and energy over time and space. Case studies show a substantial reduction in macro BS activity, and thus in the related energy consumption drawn from the power grid. Research issues of the proposed GreenDelivery framework are also discussed.
    Comment: 15 pages, 5 figures, accepted by IEEE Communications Magazine
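    A toy simulation can illustrate the mechanism: an energy-harvesting small cell prefetches and serves popular contents while its battery allows, and requests fall back to the macro BS otherwise. The Zipf popularity model, the energy figures, and the simple threshold policy below are assumptions for illustration only, not the jointly optimized push/caching policy of the GreenDelivery framework.

```python
import random

# Toy EH small cell: harvest energy, prefetch popular contents when energy is
# plentiful, serve cached requests locally, and fall back to the macro BS
# when the content is missing or the battery is empty.
N_CONTENTS, CACHE_SIZE = 200, 20
ZIPF_S = 0.8
B_MAX, E_SERVE, E_FETCH, PREFETCH_THRESHOLD = 20, 1, 2, 10

# Contents are indexed by popularity rank (0 = most popular), Zipf-weighted.
weights = [1 / (k + 1) ** ZIPF_S for k in range(N_CONTENTS)]

battery = B_MAX // 2
cache = set()
local_hits = macro_hits = 0

for _ in range(20_000):
    battery = min(B_MAX, battery + random.choice([0, 0, 1, 2]))   # bursty harvesting
    # Proactive push/prefetch: when energy is plentiful, pull the most popular
    # content not yet cached (popularity is assumed known).
    if battery >= PREFETCH_THRESHOLD and len(cache) < CACHE_SIZE:
        missing = next(c for c in range(N_CONTENTS) if c not in cache)
        cache.add(missing)
        battery -= E_FETCH
    # Reactive serving: the small cell serves the request only if the content
    # is cached and energy remains; otherwise the macro BS must serve it.
    req = random.choices(range(N_CONTENTS), weights=weights)[0]
    if req in cache and battery >= E_SERVE:
        battery -= E_SERVE
        local_hits += 1
    else:
        macro_hits += 1

print(f"small-cell hits: {local_hits}, macro BS activations: {macro_hits}")
```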

    Machine Learning in Wireless Sensor Networks: Algorithms, Strategies, and Applications

    Wireless sensor networks monitor dynamic environments that change rapidly over time. This dynamic behavior is caused either by external factors or by the system designers themselves. To adapt to such conditions, sensor networks often adopt machine learning techniques, eliminating the need for unnecessary redesign. Machine learning also inspires many practical solutions that maximize resource utilization and prolong the lifespan of the network. In this paper, we present an extensive literature review, covering the period 2002-2013, of machine learning methods that have been used to address common issues in wireless sensor networks (WSNs). The advantages and disadvantages of each proposed algorithm are evaluated against the corresponding problem. We also provide a comparative guide to help WSN designers develop suitable machine learning solutions for their specific application challenges.
    Comment: Accepted for publication in IEEE Communications Surveys and Tutorials