Scheduling and Codeword Length Optimization in Time Varying Wireless Networks
In this paper, a downlink scenario in which a single-antenna base station
communicates with K single-antenna users over a time-correlated fading
channel, is considered. It is assumed that channel state information is
perfectly known at each receiver, while the statistical characteristics of the
fading process and the fading gain at the beginning of each frame are known to
the transmitter. By evaluating the random coding error exponent of the
time-correlated fading channel, it is shown that there is an optimal codeword
length that maximizes the throughput. The throughput of the conventional
scheduling, which transmits to the user with the maximum signal-to-noise
ratio, is examined using both fixed-length and variable-length codewords.
Although optimizing the codeword length improves performance, it is shown
that under the conventional scheduling the gap between the achievable
throughput and the maximum possible throughput of the system tends to
infinity as K goes to infinity. A simple scheduling policy that considers
both the signal-to-noise ratio and the channel time variation is proposed.
It is shown that with this scheduling, the gap between the achievable
throughput and the maximum throughput of the system approaches zero.
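The optimal-blocklength trade-off described above can be illustrated with a toy calculation: the random-coding error exponent drives the error probability down as the codeword grows, while decorrelation of the fading over the frame pushes it back up. Everything in the sketch below (the exponent, the correlation coefficient, the rate, and the additive error model) is an illustrative assumption, not the paper's analysis.

```python
import math

def throughput(L, E_r=0.05, rho=0.995, rate=1.0):
    """Toy throughput at codeword length L (channel uses)."""
    p_coding = math.exp(-E_r * L)   # random-coding error bound, shrinks with L
    p_ageing = 1.0 - rho ** L       # channel drift over the frame, grows with L
    return rate * (1.0 - min(1.0, p_coding + p_ageing))

# Scan lengths and pick the interior optimum the two effects create.
best = max(range(1, 2001), key=throughput)
print(f"optimal length: {best}, throughput: {throughput(best):.3f}")
```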
Wireless Network Control with Privacy Using Hybrid ARQ
We consider the problem of resource allocation in a wireless cellular
network, in which nodes have both open and private information to be
transmitted to the base station over block fading uplink channels. We develop a
cross-layer solution, based on hybrid ARQ transmission with incremental
redundancy. We provide a scheme that combines power control, flow control, and
scheduling in order to maximize a global utility function, subject to the
stability of the data queues, an average power constraint, and a constraint on
the privacy outage probability. Our scheme is based on the assumption that each
node has an estimate of its uplink channel gain at each block, while only the
distribution of the cross-channel gains is available. We prove that our
scheme achieves a utility arbitrarily close to the maximum achievable
utility given the available channel state information.
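Problems of this shape (maximize a utility subject to queue stability and an average-power constraint) are commonly attacked with Lyapunov drift-plus-penalty control, so a generic sketch of that machinery is given below. It is not the authors' scheme: the log rate model, linear utility, discrete power set, and all constants are assumptions, and the privacy-outage constraint is omitted.

```python
import math
import random

V = 50.0                        # utility weight: larger V favors utility over backlog
P_AVG = 1.0                     # average-power budget
POWERS = [0.0, 0.5, 1.0, 2.0]   # assumed feasible per-block transmit powers

def rate(power, gain):
    """Assumed per-block rate (log model)."""
    return math.log2(1.0 + gain * power)

q = 0.0    # data queue backlog
z = 0.0    # virtual queue tracking average-power debt

for _ in range(10_000):
    gain = random.expovariate(1.0)    # assumed block-fading uplink gain
    # Flow control: admit one unit when the backlog is below V
    # (maximizes V*a - q*a over a in {0, 1} for a linear utility).
    a = 1.0 if q < V else 0.0
    # Power control / scheduling: maximize q*rate - z*power.
    p = max(POWERS, key=lambda pw: q * rate(pw, gain) - z * pw)
    q = max(q + a - rate(p, gain), 0.0)
    z = max(z + p - P_AVG, 0.0)

print(f"final backlog {q:.2f}, power-queue backlog {z:.2f}")
```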
On Capacity and Optimal Scheduling for the Half-Duplex Multiple-Relay Channel
We study the half-duplex multiple-relay channel (HD-MRC) where every node can
either transmit or listen but cannot do both at the same time. We obtain a
capacity upper bound based on a max-flow min-cut argument and achievable
transmission rates based on the decode-forward (DF) coding strategy, for both
the discrete memoryless HD-MRC and the phase-fading HD-MRC. We discover that
both the upper bound and the achievable rates are functions of the
transmit/listen state (a description of which nodes transmit and which
receive). More precisely, they are functions of the time fraction of the
different states, which we term a schedule. We formulate the optimal scheduling
problem to find an optimal schedule that maximizes the DF rate. The optimal
scheduling problem turns out to be a maximin optimization, for which we propose
an algorithmic solution. We demonstrate our approach on a four-node
multiple-relay channel, obtaining closed-form solutions in certain scenarios.
Furthermore, we show that for the received signal-to-noise ratio degraded
phase-fading HD-MRC, the optimal scheduling problem can be simplified to a max
optimization.
Comment: Author's final version (to appear in IEEE Transactions on
Information Theory).
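Because the rate under a given schedule is linear in the state time fractions and the bound is a minimum over cuts, the maximin schedule can be found as a linear program. The sketch below shows that reduction; the 3-cut, 4-state rate matrix is made up for illustration, standing in for the DF rate expressions of the paper.

```python
import numpy as np
from scipy.optimize import linprog

# r[cut, state]: assumed rate contribution of each transmit/listen state
# to each cut (placeholder numbers, not the paper's DF expressions).
r = np.array([
    [0.0, 1.2, 0.8, 1.5],
    [1.0, 0.0, 1.1, 0.6],
    [0.9, 1.3, 0.0, 0.7],
])
n_cuts, n_states = r.shape

# Variables x = (t_1, ..., t_S, R); maximize R, i.e. minimize -R.
c = np.zeros(n_states + 1)
c[-1] = -1.0
# Epigraph constraints: R <= sum_s t_s * r[cut, s] for every cut.
A_ub = np.hstack([-r, np.ones((n_cuts, 1))])
b_ub = np.zeros(n_cuts)
# Time fractions form a probability vector.
A_eq = np.hstack([np.ones((1, n_states)), np.zeros((1, 1))])
b_eq = np.array([1.0])
bounds = [(0.0, 1.0)] * n_states + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
t, R = res.x[:-1], res.x[-1]
print("schedule:", np.round(t, 3), "maximin DF rate:", round(R, 3))
```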
Complexity-Aware Scheduling for an LDPC Encoded C-RAN Uplink
Centralized Radio Access Network (C-RAN) is a new paradigm for wireless
networks that centralizes the signal processing in a computing cloud, allowing
commodity computational resources to be pooled. While C-RAN improves
utilization and efficiency, the computational load occasionally exceeds the
available resources, creating a computational outage. This paper provides a
mathematical characterization of the computational outage probability for
low-density parity-check (LDPC) codes, a common class of error-correcting
codes. For tractability, a binary erasure channel is assumed. Using the
concept of density evolution, the computational demand is determined for a
given ensemble of codes as a function of the erasure probability. The analysis
reveals a trade-off: aggressively signaling at a high rate stresses the
computing pool, while conservatively backing off the rate can avoid
computational outages. Motivated by this trade-off, an effective
computationally aware scheduling algorithm is developed that balances the
demands for high throughput and low outage rates.
Comment: Conference on Information Sciences and Systems (CISS) 2017, to
appear.
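The density-evolution calculation underlying the complexity characterization is simple to sketch on the BEC: iterate the one-dimensional erasure recursion and treat the number of iterations needed to hit a target residual erasure rate as a proxy for decoding work. The (3,6)-regular ensemble, target, and iteration cap below are illustrative assumptions rather than the paper's exact code ensemble.

```python
def iterations_to_decode(eps, dv=3, dc=6, target=1e-6, max_iter=10_000):
    """Iterations of BEC density evolution needed at channel erasure rate eps,
    or None when the recursion stalls above the target (decoding failure)."""
    x = eps   # erasure probability of variable-to-check messages
    for it in range(1, max_iter + 1):
        # (dv, dc)-regular update: x <- eps * (1 - (1 - x)^(dc-1))^(dv-1)
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < target:
            return it
    return None

# Iteration count grows sharply near the (3,6) threshold (about 0.4294).
for eps in (0.30, 0.40, 0.42, 0.43):
    print(eps, iterations_to_decode(eps))
```

The blow-up of the iteration count near the threshold is the mechanism behind the rate/outage trade-off the abstract describes: signaling close to capacity makes every decode expensive.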
Energy Harvesting Wireless Communications: A Review of Recent Advances
This article summarizes recent contributions in the broad area of energy
harvesting wireless communications. In particular, we survey the state of
the art for wireless networks composed of energy harvesting nodes, ranging
from information-theoretic performance limits to transmission scheduling
policies, resource allocation, medium access, and networking issues. The
emerging related area of energy transfer for self-sustaining energy
harvesting wireless networks is considered in detail, covering both energy
cooperation aspects and simultaneous energy and information transfer.
Various potential models with energy harvesting nodes at different network
scales are reviewed, as well as models for energy consumption at the nodes.
Comment: To appear in the IEEE Journal on Selected Areas in Communications
(Special Issue: Wireless Communications Powered by Energy Harvesting and
Wireless Energy Transfer).
Complexity-aware C-RAN scheduling for LDPC codes over BEC
Effective transmission of data over a noisy wireless channel is a vital part of today's high-speed, technology-driven society. In a wireless cellular network, information is sent from mobile users to base stations. The information being transmitted is protected by error-control codes. In a conventional architecture, the signal processing, including error-control decoding, is performed locally at each base station. Recently, a new architecture has emerged, called Centralized Radio Access Network (C-RAN), which involves the centralized processing of the signals in a computing cloud. Using a computing cloud allows computational resources to be pooled, which improves utilization and efficiency. When the computational resources are finite and the computational load varies over time, there is a chance that the load exceeds the available resources. This situation creates a so-called computational outage, which has characteristics similar to outages caused by channel fading or interference. In this report, the computational complexity is quantified for a common class of error-correcting codes known as low-density parity-check (LDPC) codes. To make the analysis tractable, a binary erasure channel is assumed. The concept of density evolution is used to obtain the complexity as a function of the code design parameters and the signal-to-interference-plus-noise ratio (SINR) of the channel. The analysis shows that there is a trade-off: aggressively signaling at a high data rate causes high computational demands, while conservatively backing off on the rate can dramatically reduce the computational demand. Motivated by this trade-off, a scheduling algorithm is developed that balances the demands for high throughput and low computational outage rates.
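One way to picture the scheduling algorithm the report motivates: estimate the decoding effort each user's requested rate would impose, then back off the most expensive users until the pooled cloud budget is met. The complexity model and greedy back-off rule below are stand-ins for the report's density-evolution-based estimates, not its exact procedure.

```python
def complexity(rate, capacity):
    """Assumed decoding-effort model: work blows up as rate nears capacity."""
    return 1.0 / max(capacity - rate, 1e-9)

def schedule(requests, budget, step=0.05):
    """requests: list of (requested_rate, channel_capacity) per user.
    Greedily reduce rates until the total decoding work fits the cloud budget."""
    rates = [r for r, _ in requests]
    caps = [c for _, c in requests]
    while sum(complexity(r, c) for r, c in zip(rates, caps)) > budget:
        if all(r <= 0.0 for r in rates):
            break   # computational outage: infeasible even at zero rate
        # Back off the user currently costing the most compute.
        i = max(range(len(rates)), key=lambda k: complexity(rates[k], caps[k]))
        rates[i] = max(rates[i] - step, 0.0)
    return rates

# Three users signaling near capacity; the scheduler trims the costliest ones.
print(schedule([(0.95, 1.0), (0.80, 1.0), (0.90, 1.0)], budget=20.0))
```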