How much feedback is required in MIMO Broadcast Channels?
In this paper, a downlink communication system in which a Base Station (BS) equipped with M antennas communicates with N users, each equipped with K receive antennas, is considered. It is assumed that the receivers have perfect Channel State Information (CSI), while the BS only knows the partial CSI provided by the receivers via feedback. The minimum amount of feedback required at the BS to achieve the maximum sum-rate capacity is studied in the asymptotic regime where the number of users N grows large and for different ranges of SNR. In the fixed and
low SNR regimes, it is demonstrated that to achieve the maximum sum-rate, an
infinite amount of feedback is required. Moreover, in order to reduce the gap
to the optimum sum-rate to zero, in the fixed SNR regime, the minimum amount of
feedback scales as , which is achievable by the Random
Beam-Forming scheme proposed in [14]. In the high SNR regime, two cases are
considered; in the case of , it is proved that the minimum number of feedback bits needed to reduce the gap between the achievable sum-rate and the maximum sum-rate to zero grows logarithmically with the SNR, which is achievable by the "Generalized Random Beam-Forming" scheme proposed in [18]. In the case of , it is shown that by using the Random Beam-Forming scheme, with a total amount of feedback that does not grow with the SNR, the maximum sum-rate capacity is achieved. Comment: Submitted to IEEE Trans. on Inform. Theory
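As a rough illustration of the kind of partial CSI the Random Beam-Forming scheme of [14] relies on, the following Python sketch simulates one scheduling slot in which every user feeds back only its best beam index and the corresponding SINR. The single-antenna receivers, the channel model, and all parameter values are illustrative assumptions, not the setting analyzed in the paper.

import numpy as np

def random_beamforming_slot(M=4, N=100, snr_db=10.0, seed=0):
    # Illustrative sketch only: K = 1 receive antenna per user, i.i.d.
    # Rayleigh fading, equal power snr/M on each of M random orthonormal beams.
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)

    # BS draws M random orthonormal beams (columns of a unitary matrix).
    beams, _ = np.linalg.qr(rng.standard_normal((M, M)) +
                            1j * rng.standard_normal((M, M)))

    # Each user's 1 x M channel vector.
    h = (rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))) / np.sqrt(2)

    # Per-user, per-beam SINR when all M beams are transmitted simultaneously.
    gains = np.abs(h @ beams) ** 2
    signal = (snr / M) * gains
    interference = (snr / M) * (gains.sum(axis=1, keepdims=True) - gains)
    sinr = signal / (1.0 + interference)

    # Feedback: each user reports only (best beam index, best SINR).
    best_beam = sinr.argmax(axis=1)
    best_sinr = sinr.max(axis=1)

    # BS assigns each beam to the strongest user that requested it.
    sum_rate = 0.0
    for m in range(M):
        candidates = best_sinr[best_beam == m]
        if candidates.size:
            sum_rate += np.log2(1.0 + candidates.max())
    return sum_rate

print(random_beamforming_slot())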
Minimal redefinition of the OSV ensemble
In the interesting conjecture, Z_{BH} = |Z_{top}|^2, proposed by Ooguri,
Strominger and Vafa (OSV), the black hole ensemble is a mixed ensemble and the
resulting degeneracy of states, as obtained from the ensemble inverse-Laplace
integration, suffers from prefactors which do not respect the electric-magnetic
duality. One idea to overcome this deficiency, as claimed recently, is to impose non-trivial measures on the ensemble sum. We address this problem and, upon a redefinition of the OSV ensemble whose variables are as numerous as the electric potentials, show that no non-Euclidean measure is needed to restore the symmetry. In detail, we rewrite the OSV free energy as a function of new variables which are combinations of the electric potentials and the black hole charges. Subsequently, the Legendre transformation which bridges between the entropy and the black hole free energy in terms of these variables points to a generalized ensemble. In this context, we will consider all the cases of
relevance: small and large black holes, with or without D_6-brane charge. For
the case of vanishing D_6-brane charge, the new ensemble is pure canonical and
the electric-magnetic duality is restored exactly, leading to proper results
for the black hole degeneracy of states. For more general cases, the construction still works well as long as the violation of the duality by the corresponding OSV result is restricted to a prefactor. In a concrete example, we shall show that for black holes with non-vanishing D_6-brane charge, there are cases where the duality violation goes beyond this restriction; thus, imposing non-trivial measures is incapable of restoring the duality. This observation signals the need for a deeper modification of the OSV proposal. Comment: 23 pages, v2: minor change
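For orientation, the OSV relation and the inverse-Laplace step that generate the duality-violating prefactor can be written schematically as follows; this is the standard schematic form, with numerical factors and normalizations suppressed, not the redefined ensemble constructed in the paper.

\[
  Z_{BH}(p,\phi) \;=\; \sum_{q}\Omega(p,q)\,e^{-q\cdot\phi} \;=\; \big|Z_{top}(p,\phi)\big|^{2},
  \qquad
  \mathcal{F}(p,\phi) \;=\; \ln Z_{BH}(p,\phi),
\]
\[
  \Omega(p,q) \;\sim\; \int d\phi\; e^{\,\mathcal{F}(p,\phi)+q\cdot\phi},
  \qquad
  S_{BH}(p,q) \;=\; \Big[\mathcal{F}(p,\phi)+q\cdot\phi\Big]_{\partial\mathcal{F}/\partial\phi\,=\,-q}.
\]

Here the mixed ensemble keeps the magnetic charges p fixed while summing over the electric charges q at fixed potentials \phi; the saddle-point (Legendre) evaluation of the integral is the bridge between the free energy and the entropy mentioned above, and, roughly speaking, the fluctuations around the saddle are what produce the prefactor whose duality properties are at issue.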
On the Delay-Throughput Tradeoff in Distributed Wireless Networks
This paper deals with the delay-throughput analysis of a single-hop wireless network of transmitter/receiver pairs. All channels are assumed to be block Rayleigh fading with shadowing, described by two parameters: the probability of shadowing and the average cross-link gain. The analysis relies on the
distributed on-off power allocation strategy (i.e., links with a direct channel
gain above a certain threshold transmit at full power and the rest remain
silent) for the deterministic and stochastic packet arrival processes. It is
also assumed that each transmitter has a buffer size of one packet and dropping
occurs once a packet arrives in the buffer while the previous packet has not
been served. In the first part of the paper, we define a new notion of performance in the network, called effective throughput, which captures the effect of the arrival process on the network throughput, and maximize it for different packet arrival processes. It is proved that the effective throughput of the network asymptotically scales as , with , regardless of
the packet arrival process. In the second part of the paper, we present the
delay characteristics of the underlying network in terms of the packet dropping
probability. We derive sufficient conditions, in the asymptotic regime where the number of links grows large, such that the packet dropping probability tends to zero, while
achieving the maximum effective throughput of the network. Finally, we study
the trade-off between the effective throughput, delay, and packet dropping
probability of the network for different packet arrival processes. Comment: Submitted to IEEE Transactions on Information Theory (34 pages)
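To make the distributed on-off strategy concrete, here is a minimal Python simulation sketch of threshold-based activation with one-packet buffers and Bernoulli arrivals. The channel model, the threshold, and all parameter values below are illustrative assumptions rather than the paper's.

import numpy as np

def on_off_simulation(n=50, frames=10_000, threshold=2.0, arrival_p=0.3,
                      shadow_p=0.4, cross_gain=0.1, seed=1):
    # Assumed model (for illustration): unit-mean exponential direct gains,
    # Bernoulli(shadow_p) shadowing on cross links with mean gain cross_gain,
    # Bernoulli(arrival_p) packet arrivals, full power 1, unit noise power.
    rng = np.random.default_rng(seed)
    served = dropped = arrivals = 0
    buffered = np.zeros(n, dtype=bool)
    sum_rate = 0.0

    for _ in range(frames):
        new = rng.random(n) < arrival_p
        arrivals += new.sum()
        dropped += np.count_nonzero(new & buffered)   # buffer already full
        buffered |= new

        direct = rng.exponential(1.0, n)              # direct channel gains
        active = buffered & (direct > threshold)      # on-off rule

        # Interference at each receiver from the other active links.
        shad = rng.random((n, n)) < shadow_p
        cross = shad * rng.exponential(cross_gain, (n, n))
        np.fill_diagonal(cross, 0.0)
        interference = cross @ active.astype(float)

        sinr = np.where(active, direct / (1.0 + interference), 0.0)
        sum_rate += np.log2(1.0 + sinr).sum()
        served += active.sum()
        buffered &= ~active                           # served packets leave

    return {"throughput_per_frame": sum_rate / frames,
            "drop_prob": dropped / max(arrivals, 1)}

print(on_off_simulation())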
Scheduling and Codeword Length Optimization in Time Varying Wireless Networks
In this paper, a downlink scenario is considered in which a single-antenna base station communicates with K single-antenna users over a time-correlated fading channel. It is assumed that channel state information is
perfectly known at each receiver, while the statistical characteristics of the
fading process and the fading gain at the beginning of each frame are known to
the transmitter. By evaluating the random coding error exponent of the
time-correlated fading channel, it is shown that there is an optimal codeword
length which maximizes the throughput. The throughput of the conventional scheduling, which transmits to the user with the maximum signal-to-noise ratio, is examined using both fixed-length and variable-length codewords. Although optimizing the codeword length improves the performance, it is shown that under the conventional scheduling, the gap between the achievable throughput and the maximum possible throughput of the system tends to infinity
as K goes to infinity. A simple scheduling scheme that considers both the signal-to-noise ratio and the channel time variation is proposed. It is shown that by using this scheduling scheme, the gap between the achievable throughput and the maximum throughput of the system approaches zero.
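As a toy numerical illustration of the codeword-length trade-off described above (not the paper's error-exponent analysis), the sketch below models the decoding error probability as decaying exponentially in the codeword length, while the time-correlated channel drifts away from the state the scheduler observed at the start of the frame; both the exponent and the correlation penalty are assumed values.

import numpy as np

def throughput_vs_length(rate=1.0, exponent=0.05, corr=0.995,
                         lengths=np.arange(10, 2001, 10)):
    # Toy model: random-coding-style bound P_err(L) ~ exp(-L * E) with a fixed
    # exponent E, and a crude channel-aging penalty corr**L that discounts the
    # rate committed to at the start of the frame as the channel decorrelates.
    p_err = np.exp(-exponent * lengths)        # decoding error bound
    reliability = corr ** lengths              # channel-aging penalty
    throughput = rate * (1.0 - p_err) * reliability
    best = lengths[np.argmax(throughput)]
    return best, throughput.max()

best_len, best_tp = throughput_vs_length()
print(f"optimal codeword length ~ {best_len}, throughput ~ {best_tp:.3f}")

With these assumed numbers the optimum is interior (around a few tens of symbols): very short codewords lose throughput to decoding errors, very long ones to channel time variation, which is the qualitative effect the abstract describes.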