Cross-layer scheduling and resource allocation for heterogeneous traffic in 3G LTE
3G Long Term Evolution (LTE) imposes stringent requirements in order to provide different kinds of traffic with their Quality of Service (QoS) characteristics. A major difficulty is that LTE does not standardize a scheduling algorithm to control the assignment of resources and thereby improve user satisfaction. This remains an open problem, and the scheduling algorithms proposed to date are challenging and complex. To address this issue, in this paper we investigate how our proposed algorithm improves user satisfaction for heterogeneous traffic, that is, best-effort traffic such as File Transfer Protocol (FTP) and real-time traffic such as Voice over Internet Protocol (VoIP). The algorithm is formulated using a cross-layer technique, and its goal is to maximize the expected total user satisfaction (total utility) under different constraints. We compare it with proportional fair (PF), exponential proportional fair (EXP-PF), and U-delay scheduling. In simulations, our algorithm improves the performance of real-time traffic in terms of throughput, VoIP delay, and VoIP packet loss ratio, while PF performs better for best-effort traffic in terms of FTP traffic received, FTP packet loss ratio, and FTP throughput.
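The cross-layer, utility-maximizing allocation described in this abstract can be illustrated with a minimal sketch: each resource block goes to the user whose total utility would grow the most. All names, utility shapes, and parameters below are illustrative assumptions, not the paper's actual formulation.

```python
import math

# Hypothetical marginal-utility scheduler sketch. Utility shapes are
# assumptions: a saturating utility for real-time (VoIP) traffic and a
# proportional-fair-style log utility for best-effort (FTP) traffic.

def marginal_utility(user, extra_rate):
    """Utility gain if `user` receives `extra_rate` more throughput."""
    if user["type"] == "voip":
        # Real-time traffic: utility rises until the rate needed to meet
        # the delay budget is reached, then saturates.
        need = user["target_rate"]
        before = min(user["rate"], need) / need
        after = min(user["rate"] + extra_rate, need) / need
        return 10.0 * (after - before)  # weight real-time traffic heavily
    # Best-effort traffic: diminishing log utility.
    return math.log(user["rate"] + extra_rate) - math.log(max(user["rate"], 1e-9))

def schedule(users, num_rbs, rb_rate=100.0):
    """Assign each resource block to the user with the largest utility gain."""
    for _ in range(num_rbs):
        best = max(users, key=lambda u: marginal_utility(u, rb_rate))
        best["rate"] += rb_rate
    return {u["name"]: u["rate"] for u in users}

users = [
    {"name": "voip1", "type": "voip", "rate": 0.0, "target_rate": 200.0},
    {"name": "ftp1", "type": "ftp", "rate": 1e-9},
]
print(schedule(users, num_rbs=4))
```

With these assumed utilities, the VoIP user is served until its target rate is met while leftover resource blocks flow to FTP, mirroring the trade-off the abstract reports between real-time and best-effort performance.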
A Delay-Optimal Packet Scheduler for M2M Uplink
In this paper, we present a delay-optimal packet scheduler for processing the
M2M uplink traffic at the M2M application server (AS). Due to the
delay-heterogeneity in uplink traffic, we classify it broadly into
delay-tolerant and delay-sensitive traffic. We then map the diverse delay
requirements of each class to sigmoidal functions of packet delay and formulate
a utility-maximization problem that results in a proportionally fair
delay-optimal scheduler. We note that solving this optimization problem is
equivalent to solving for the optimal fraction of time each class is served
with (preemptive) priority such that it maximizes the system utility. Using
Monte-Carlo simulations for the queuing process at AS, we verify the
correctness of the analytical result for optimal scheduler and show that it
outperforms other state-of-the-art packet schedulers such as weighted round
robin, max-weight, fair, and priority scheduling. We also
note that at higher traffic arrival rate, the proposed scheduler results in a
near-minimal delay variance for the delay-sensitive traffic which is highly
desirable. This comes at the expense of somewhat higher delay variance for
delay-tolerant traffic which is usually acceptable due to its delay-tolerant
nature.
Comment: Accepted for publication in IEEE MILCOM 2016 (6 pages, 7 figures).
Evaluation, Modeling and Optimization of Coverage Enhancement Methods of NB-IoT
Narrowband Internet of Things (NB-IoT) is a new Low Power Wide Area Network
(LPWAN) technology released by 3GPP. The primary goals of NB-IoT are improved
coverage, massive capacity, low cost, and long battery life. In order to
improve coverage, NB-IoT offers promising solutions, such as increasing
transmission repetitions, decreasing bandwidth, and adapting the Modulation and
Coding Scheme (MCS). In this paper, we present an implementation of coverage
enhancement features of NB-IoT in NS-3, an end-to-end network simulator. The
resource allocation and link adaptation in NS-3 are modified to comply with the
new features of NB-IoT. Using the developed simulation framework, the influence
of the new features on network reliability and latency is evaluated.
Furthermore, an optimal hybrid link adaptation strategy based on all three
features is proposed. To achieve this, we formulate an optimization problem
that has an objective function based on latency and a constraint based on the
Signal-to-Noise Ratio (SNR). Then, we propose several algorithms to minimize
latency and compare them with respect to accuracy and speed. The best hybrid
solution is chosen and implemented in the NS-3 simulator by which the latency
formulation is verified. The numerical results show that the proposed
optimization algorithm for hybrid link adaptation is eight times faster than
the exhaustive search approach and yields similar latency.
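The exhaustive-search baseline the abstract compares against can be sketched as a toy search over the three coverage-enhancement knobs (repetitions, allocated subcarriers, MCS), minimizing latency subject to an SNR constraint. The latency and SNR-gain models below are illustrative assumptions, not NB-IoT's actual link-level formulas.

```python
import math

# Toy hybrid link-adaptation search: pick (repetitions, subcarriers,
# MCS) minimizing latency subject to an SNR requirement. All models
# and parameter values are illustrative assumptions.

REPS = [1, 2, 4, 8, 16, 32, 64, 128]  # transmission repetitions
TONES = [1, 3, 6, 12]                 # allocated subcarriers
MCS = list(range(11))                 # MCS index 0..10

def snr_required(mcs):
    """SNR (dB) assumed necessary to decode a given MCS."""
    return -5.0 + 2.0 * mcs

def effective_snr(snr, rep, tones):
    """Repetition combining and narrower bandwidth both add gain (dB)."""
    return snr + 10 * math.log10(rep) + 10 * math.log10(12 / tones)

def latency(rep, tones, mcs, payload_bits=256):
    """Assumed latency: transmissions needed times repetitions (ms)."""
    bits_per_tx = 16 * tones * (mcs + 1)  # assumed spectral efficiency
    n_tx = math.ceil(payload_bits / bits_per_tx)
    return n_tx * rep * 1.0

def exhaustive(snr):
    """Enumerate all feasible configurations; return the lowest-latency one."""
    best = None
    for r in REPS:
        for t in TONES:
            for m in MCS:
                if effective_snr(snr, r, t) >= snr_required(m):
                    cand = (latency(r, t, m), r, t, m)
                    if best is None or cand < best:
                        best = cand
    return best  # (latency, repetitions, tones, mcs) or None

print(exhaustive(-10.0))
```

Even this toy version shows why the search space motivates a faster algorithm: the exhaustive loop scales with the product of the three parameter ranges, while the paper's proposed optimization reaches a similar latency roughly eight times faster.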
Scheduling for Multi-Camera Surveillance in LTE Networks
Wireless surveillance in cellular networks has become increasingly important,
and commercial LTE surveillance cameras are now available.
Nevertheless, most scheduling algorithms in the literature are throughput,
fairness, or profit-based approaches, which are not suitable for wireless
surveillance. In this paper, therefore, we explore the resource allocation
problem for a multi-camera surveillance system in 3GPP Long Term Evolution
(LTE) uplink (UL) networks. We minimize the number of allocated resource blocks
(RBs) while guaranteeing the coverage requirement for surveillance systems in
LTE UL networks. Specifically, we formulate the Camera Set Resource Allocation
Problem (CSRAP) and prove that the problem is NP-Hard. We then propose an
Integer Linear Programming formulation for general cases to find the optimal
solution. Moreover, we present a baseline algorithm and devise an approximation
algorithm to solve the problem. Simulation results based on a real surveillance
map and synthetic datasets show that the number of allocated RBs can be
effectively reduced compared to the existing approach for LTE networks.
Comment: 9 pages, 10 figures.
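Since CSRAP is NP-hard and resembles weighted set cover (each camera covers a set of surveillance points at some resource-block cost), a greedy approximation is a natural baseline alongside the paper's ILP. The sketch below is an assumed greedy heuristic with made-up camera data, not the paper's approximation algorithm.

```python
# Greedy sketch of the RB-minimization idea: repeatedly activate the
# camera covering the most still-uncovered points per resource block it
# costs. Camera names, costs, and coverage sets are assumptions.

def greedy_csrap(cameras, required_points):
    """cameras: {name: (rb_cost, set_of_covered_points)}.
    Returns (chosen camera names, total RBs used)."""
    uncovered = set(required_points)
    chosen, total_rbs = [], 0
    while uncovered:
        # Pick the camera with the best (new coverage) / (RB cost) ratio.
        name, (cost, cov) = max(
            cameras.items(),
            key=lambda kv: len(kv[1][1] & uncovered) / kv[1][0],
        )
        gained = cov & uncovered
        if not gained:
            raise ValueError("coverage requirement cannot be met")
        chosen.append(name)
        total_rbs += cost
        uncovered -= gained
        del cameras[name]
    return chosen, total_rbs

cams = {
    "cam_a": (3, {1, 2, 3, 4}),
    "cam_b": (2, {3, 4}),
    "cam_c": (2, {5, 6}),
}
print(greedy_csrap(cams, {1, 2, 3, 4, 5, 6}))
```

Greedy set cover of this kind carries a logarithmic approximation guarantee, which is why it is a common baseline for coverage problems like the one the abstract formulates.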