
    Distributed stochastic optimization via matrix exponential learning

    In this paper, we investigate a distributed learning scheme for a broad class of stochastic optimization problems and games that arise in signal processing and wireless communications. The proposed algorithm relies on the method of matrix exponential learning (MXL) and only requires locally computable gradient observations that are possibly imperfect and/or obsolete. To analyze it, we introduce the notion of a stable Nash equilibrium and we show that the algorithm is globally convergent to such equilibria - or locally convergent when an equilibrium is only locally stable. We also derive an explicit linear bound for the algorithm's convergence speed, which remains valid under measurement errors and uncertainty of arbitrarily high variance. To validate our theoretical analysis, we test the algorithm in realistic multi-carrier/multiple-antenna wireless scenarios where several users seek to maximize their energy efficiency. Our results show that learning allows users to attain a net increase between 100% and 500% in energy efficiency, even under very high uncertainty. Comment: 31 pages, 3 figures
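    The following is a minimal Python sketch of the kind of update the abstract describes: a score matrix accumulates (possibly noisy) gradient observations and is mapped back to the feasible set through a matrix exponential. The trace-normalization mapping, the step-size handling, and all names are illustrative assumptions, not details taken from the paper.

    # Illustrative sketch of one matrix exponential learning (MXL) step.
    # Assumed setup: a user controls a positive-semidefinite matrix X with a
    # trace budget P and observes a (possibly noisy) gradient matrix V of its
    # utility at the current X.  All names are hypothetical.
    import numpy as np
    from scipy.linalg import expm

    def mxl_step(Y, V, step_size, P):
        """Accumulate the gradient in the score matrix Y, then map back to
        the trace-constrained feasible set via the matrix exponential."""
        Y = Y + step_size * 0.5 * (V + V.conj().T)   # keep the score Hermitian
        X = expm(Y)
        X = P * X / np.trace(X).real                 # renormalize so tr(X) = P
        return Y, X

    With a vanishing step size (e.g. proportional to 1/n), updates of this form are typically what tolerate the imperfect and obsolete gradient observations mentioned above.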

    Dynamic Packet Scheduling in Wireless Networks

    We consider protocols that serve communication requests arising over time in a wireless network that is subject to interference. Unlike previous approaches, we take the geometry of the network and power control into account, both of which allow the network's performance to be increased significantly. We introduce a stochastic and an adversarial model to bound the packet injection. Although models based on the signal-to-interference-plus-noise ratio (SINR) are our primary motivation, the approach is not limited to them. It also covers virtually all other common interference models, for example the multiple-access channel, the radio-network model, the protocol model, and distance-2 matching. Packet-routing networks that allow each edge or each node to transmit or receive one packet at a time can be modeled as well. Starting from algorithms for the respective scheduling problem with static transmission requests, we build distributed stable protocols. This is more involved than in previous, similar approaches because the algorithms we consider do not necessarily scale linearly when the input instance is scaled. We can guarantee a throughput as large as that of the original static algorithm. In particular, for SINR models the competitive ratios of the protocol, compared to optimal protocols in the respective model, range from constant to O(log^2 m) for a network of size m. Comment: 23 pages
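    For concreteness, the sketch below shows the feasibility test that underlies the physical (SINR) interference model referred to above: a set of simultaneous transmissions is admissible only if every link's signal-to-interference-plus-noise ratio clears a threshold. The gain-matrix convention and all names are assumptions for illustration, not the paper's notation.

    # Hedged sketch of an SINR feasibility check for a set of simultaneous links.
    # gains[i][j] is the (assumed) channel gain from transmitter j to receiver i,
    # powers[j] the transmit power of link j, beta the SINR threshold, noise the
    # ambient noise power.
    import numpy as np

    def sinr_feasible(gains, powers, beta, noise):
        gains = np.asarray(gains, dtype=float)
        powers = np.asarray(powers, dtype=float)
        signal = np.diag(gains) * powers          # desired signal at each receiver
        interference = gains @ powers - signal    # everything else that is received
        sinr = signal / (noise + interference)
        return bool(np.all(sinr >= beta)), sinr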

    Feedback Allocation For OFDMA Systems With Slow Frequency-domain Scheduling

    We study the problem of allocating limited feedback resources across multiple users in an orthogonal-frequency-division-multiple-access (OFDMA) downlink system with slow frequency-domain scheduling. Many flavors of slow frequency-domain scheduling (e.g., persistent scheduling, semi-persistent scheduling), which adapt user-sub-band assignments on a slower time-scale, are being considered in standards such as 3GPP Long-Term Evolution. In this paper, we develop a feedback allocation algorithm that operates in conjunction with any arbitrary slow frequency-domain scheduler with the goal of improving the throughput of the system. Given a user-sub-band assignment chosen by the scheduler, the feedback allocation algorithm involves solving a weighted sum-rate maximization at each (slow) scheduling instant. We first develop an optimal dynamic-programming-based algorithm to solve the feedback allocation problem with pseudo-polynomial complexity in the number of users and in the total feedback bit budget. We then propose two approximation algorithms with further reduced complexity for scenarios where the problem exhibits additional structure. Comment: Accepted to IEEE Transactions on Signal Processing
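    As a rough illustration of such a pseudo-polynomial dynamic program, the sketch below splits a total feedback bit budget among users so as to maximize a weighted sum of per-user rate estimates. The table rate[u][b] and all names are hypothetical stand-ins, not the paper's formulation.

    # Hedged sketch: allocate a feedback budget of B bits among K users.
    # rate[u][b] is the (assumed given) weighted rate user u contributes when
    # granted b bits, defined for b = 0..B.  Runs in O(K * B^2).
    def allocate_feedback(rate, B):
        K = len(rate)
        NEG = float("-inf")
        dp = [NEG] * (B + 1)          # dp[t]: best value using exactly t bits
        dp[0] = 0.0
        choice = [[0] * (B + 1) for _ in range(K)]
        for u in range(K):
            new_dp = [NEG] * (B + 1)
            for used in range(B + 1):
                for b in range(used + 1):
                    if dp[used - b] > NEG:
                        val = dp[used - b] + rate[u][b]
                        if val > new_dp[used]:
                            new_dp[used] = val
                            choice[u][used] = b
            dp = new_dp
        # Recover the per-user allocation from the best total budget used.
        best = max(range(B + 1), key=lambda t: dp[t])
        alloc, remaining = [0] * K, best
        for u in reversed(range(K)):
            alloc[u] = choice[u][remaining]
            remaining -= alloc[u]
        return alloc, dp[best]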

    Sparse Signal Processing Concepts for Efficient 5G System Design

    As it becomes increasingly apparent that 4G will not be able to meet the emerging demands of future mobile communication systems, the questions of what could make up a 5G system, what the crucial challenges are, and what the key drivers will be are the subject of intensive, ongoing discussion. Partly due to the advent of compressive sensing, methods that can optimally exploit sparsity in signals have received tremendous attention in recent years. In this paper we will describe a variety of scenarios in which signal sparsity arises naturally in 5G wireless systems. Signal sparsity and the associated rich collection of tools and algorithms will thus be a viable source of innovation in 5G wireless system design. We will describe applications of this sparse signal processing paradigm in MIMO random access, cloud radio access networks, compressive channel-source network coding, and embedded security. We will also emphasize important open problems that may arise in 5G system design, in whose solution sparsity will potentially play a key role. Comment: 18 pages, 5 figures, accepted for publication in IEEE Access
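    As one concrete instance of this sparsity paradigm (e.g., detecting the few active users in MIMO random access), the sketch below recovers a sparse vector from compressed measurements with orthogonal matching pursuit. The real-valued model y ≈ A x and all names are illustrative assumptions rather than material from the paper.

    # Hedged sketch: greedy sparse recovery via orthogonal matching pursuit.
    # A is a (real-valued, assumed known) measurement matrix, y the observed
    # measurements, k the assumed sparsity level of the unknown x.
    import numpy as np

    def omp(A, y, k):
        """Recover a k-sparse x from y ≈ A x by greedily growing the support."""
        residual, support = y.copy(), []
        x = np.zeros(A.shape[1])
        for _ in range(k):
            j = int(np.argmax(np.abs(A.T @ residual)))   # best-matching column
            if j not in support:
                support.append(j)
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef          # re-fit on the support
        x[support] = coef
        return x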