Ruin Theory for Dynamic Spectrum Allocation in LTE-U Networks
LTE in the unlicensed band (LTE-U) is a promising solution to overcome the
scarcity of the wireless spectrum. However, to reap the benefits of LTE-U, it
is essential to maintain its effective coexistence with WiFi systems. Such a
coexistence, hence, constitutes a major challenge for LTE-U deployment. In this
paper, the problem of unlicensed spectrum sharing between WiFi and LTE-U systems
is studied. In particular, a fair time sharing model based on \emph{ruin
theory} is proposed to share redundant spectral resources from the unlicensed
band with LTE-U without jeopardizing the performance of the WiFi system.
Fairness between WiFi and LTE-U is maintained by applying the concept of the
probability of ruin. In particular, the probability of ruin is used to perform
efficient duty-cycle allocation in LTE-U, so as to provide fairness to the WiFi
system and maintain a target level of WiFi performance. Simulation results show that the
proposed ruin-based algorithm provides better fairness to the WiFi system as
compared to equal duty-cycle sharing between WiFi and LTE-U.
Comment: Accepted in IEEE Communications Letters (09-Dec-2018)
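The probability of ruin underlying this approach is a classical quantity: the chance that a surplus process, fed by steady income and drained by random claims, ever goes negative. The sketch below is a minimal Monte Carlo estimate of this probability for the textbook Cramér-Lundberg model with exponential claims; all parameter values are hypothetical, and the mapping from ruin probability to LTE-U duty-cycle allocation is the paper's contribution, not reproduced here.

```python
import random

def ruin_probability(u, c, lam, mean_claim, horizon=100.0,
                     n_paths=20000, seed=1):
    """Monte Carlo estimate of the finite-horizon probability of ruin for a
    Cramer-Lundberg surplus process U(t) = u + c*t - S(t), where S(t) is a
    compound Poisson sum (claim rate lam) of exponential claim sizes with
    mean mean_claim. Ruin is checked at claim instants only, since the
    surplus can only decrease when a claim arrives."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, float(u)
        while True:
            dt = rng.expovariate(lam)                     # time to next claim
            t += dt
            if t > horizon:
                break                                     # survived the horizon
            surplus += c * dt                             # premium income accrued
            surplus -= rng.expovariate(1.0 / mean_claim)  # claim payout
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths
```

For example, `ruin_probability(u=10.0, c=1.2, lam=1.0, mean_claim=1.0)` (a 20% safety loading) should land near the known closed-form value for exponential claims, roughly 0.15 over a long horizon.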
Feedback Allocation For OFDMA Systems With Slow Frequency-domain Scheduling
We study the problem of allocating limited feedback resources across multiple
users in an orthogonal-frequency-division-multiple-access downlink system with
slow frequency-domain scheduling. Many flavors of slow frequency-domain
scheduling (e.g., persistent scheduling, semi-persistent scheduling), which
adapt user-sub-band assignments on a slower time-scale, are being considered in
standards such as 3GPP Long-Term Evolution. In this paper, we develop a
feedback allocation algorithm that operates in conjunction with any arbitrary
slow frequency-domain scheduler with the goal of improving the throughput of
the system. Given a user-sub-band assignment chosen by the scheduler, the
feedback allocation algorithm involves solving a weighted sum-rate maximization
at each (slow) scheduling instant. We first develop an optimal
dynamic-programming-based algorithm to solve the feedback allocation problem
with pseudo-polynomial complexity in the number of users and in the total
feedback bit budget. We then propose two approximation algorithms with further
reduced complexity for scenarios where the problem exhibits additional
structure.
Comment: Accepted to IEEE Transactions on Signal Processing
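A dynamic program that is pseudo-polynomial in the number of users and the total feedback bit budget has the shape of a grouped knapsack: each user consumes some number of bits and contributes a utility that grows with those bits. The sketch below illustrates that structure only; the `rate_gain` table (utility of giving user `i` exactly `b` bits) is a hypothetical input, and the paper's exact weighted sum-rate objective is not reproduced here.

```python
def allocate_feedback_bits(rate_gain, budget):
    """Knapsack-style DP sketch for feedback bit allocation.
    rate_gain[i][b] = utility of user i when granted b feedback bits
    (assumed non-decreasing in b). Returns (best_value, bits_per_user).
    Runs in O(n_users * budget^2) time, pseudo-polynomial in the budget."""
    n = len(rate_gain)
    NEG = float("-inf")
    # best[b] = max total utility of the users processed so far using b bits
    best = [0.0] + [NEG] * budget
    choice = []                         # choice[i][b] = bits given to user i
    for i in range(n):
        new = [NEG] * (budget + 1)
        pick = [0] * (budget + 1)
        for b in range(budget + 1):
            for k in range(min(b, len(rate_gain[i]) - 1) + 1):
                if best[b - k] == NEG:
                    continue            # b - k bits not achievable so far
                v = best[b - k] + rate_gain[i][k]
                if v > new[b]:
                    new[b], pick[b] = v, k
        best = new
        choice.append(pick)
    # backtrack from the best achievable total bit count
    b = max(range(budget + 1), key=lambda x: best[x])
    bits = [0] * n
    for i in range(n - 1, -1, -1):
        bits[i] = choice[i][b]
        b -= bits[i]
    return max(best), bits
```

With two users and `rate_gain = [[0, 1.0, 1.5], [0, 2.0, 2.5]]` and a budget of 2 bits, the DP picks one bit for each user (total utility 3.0) over giving both bits to either user alone.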
Delay-Optimal Relay Selection in Device-to-Device Communications for Smart Grid
The smart grid communication network adopts a hierarchical structure consisting of three kinds of networks: Home Area Networks (HANs), Neighborhood Area Networks (NANs), and Wide Area Networks (WANs). Smart grid NANs comprise the communication infrastructure used to manage electricity distribution to end users. Cellular technology with LTE-based standards is a widely used and forward-looking technology, and is therefore promising for meeting the requirements of different applications in NANs. However, LTE has limitations in coping with the data traffic characteristics of smart grid applications and thus requires enhancements. Device-to-Device (D2D) communications enable direct data transmissions between devices by exploiting cellular resources, which can improve LTE performance. Delay is one of the most important communication requirements for real-time smart grid applications. In this paper, the application of D2D communications to smart grid NANs is investigated to improve the average end-to-end delay of the system. A relay selection algorithm that considers both the queue state and the channel state of nodes is proposed. The optimization problem is formulated as a constrained Markov decision process (CMDP), and a linear programming method is used to find the optimal policy for the CMDP problem. Simulation results are presented to demonstrate the effectiveness of the proposed scheme.
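The paper finds the optimal relay policy by solving a CMDP via linear programming; as a much simpler stand-in (explicitly not the paper's method), the sketch below scores each candidate relay by a hypothetical weighted combination of its queue backlog (waiting delay) and channel rate (transmission delay), which captures the same two state variables the CMDP policy conditions on.

```python
def select_relay(candidates, w_queue=0.5, w_channel=0.5):
    """Greedy relay selection considering both queue state and channel
    state; a heuristic stand-in for an optimal CMDP/LP policy.
    candidates: list of (relay_id, queue_len, channel_rate) tuples.
    Prefers relays with short queues and high channel rates, both of
    which reduce end-to-end delay. Weights are illustrative."""
    max_q = max(q for _, q, _ in candidates) or 1   # avoid divide-by-zero
    max_r = max(r for _, _, r in candidates) or 1
    best_id, best_score = None, float("-inf")
    for relay_id, queue_len, rate in candidates:
        # normalise both terms to [0, 1]; a shorter queue and a faster
        # channel both raise the score
        score = (w_queue * (1 - queue_len / max_q)
                 + w_channel * (rate / max_r))
        if score > best_score:
            best_id, best_score = relay_id, score
    return best_id
```

For instance, among `("r1", 10, 5.0)`, `("r2", 2, 4.0)`, and `("r3", 8, 6.0)`, the lightly loaded `r2` wins despite not having the fastest channel, illustrating why queue state matters alongside channel state.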
Partially-Distributed Resource Allocation in Small-Cell Networks
We propose a four-stage hierarchical resource allocation scheme for the
downlink of a large-scale small-cell network in the context of orthogonal
frequency-division multiple access (OFDMA). Since interference limits the
capabilities of such networks, resource allocation and interference management
are crucial. However, obtaining the globally optimum resource allocation is
exponentially complex and mathematically intractable. Here, we develop a
partially decentralized algorithm to obtain an effective solution. The three
major advantages of our work are: 1) as opposed to a fixed resource allocation,
we consider load demand at each access point (AP) when allocating spectrum; 2)
to prevent overloading APs, our scheme is dynamic in the sense that as users
move from one AP to another, so do the allocated resources, if necessary; since
such considerations generally incur huge computational complexity, this brings
us to the third advantage: 3) we tackle complexity by introducing a
hierarchical scheme comprising four phases: user association, load estimation,
interference management via graph coloring, and scheduling. We provide
mathematical analysis for the first three steps modeling the user and AP
locations as Poisson point processes. Finally, we provide results of numerical
simulations to illustrate the efficacy of our scheme.
Comment: Accepted on May 15, 2014 for publication in the IEEE Transactions on Wireless Communications
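The interference-management phase via graph coloring assigns sub-bands so that mutually interfering APs never share one. The sketch below shows the standard greedy (Welsh-Powell order) coloring of an interference graph; the paper's exact coloring procedure may differ, and the adjacency input is a hypothetical example of AP interference relations.

```python
def color_interference_graph(adjacency):
    """Greedy graph coloring for interference management: APs joined by an
    edge interfere and must not share a sub-band (color). adjacency maps
    each AP id to the set of AP ids it interferes with (symmetric).
    Returns a dict AP id -> color index. Coloring APs in order of
    decreasing degree (Welsh-Powell order) tends to use fewer colors."""
    order = sorted(adjacency, key=lambda ap: len(adjacency[ap]), reverse=True)
    colors = {}
    for ap in order:
        used = {colors[nbr] for nbr in adjacency[ap] if nbr in colors}
        c = 0
        while c in used:        # smallest color unused by any neighbor
            c += 1
        colors[ap] = c
    return colors
```

On a triangle of APs `a, b, c` with a fourth AP `d` interfering only with `a`, this yields three colors (sub-bands), with `d` free to reuse a sub-band already taken by `b` or `c`.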