Packet Scheduling Study for Heterogeneous Traffic in Downlink 3GPP LTE System
Long Term Evolution (LTE) networks deploy Orthogonal Frequency Division Multiple Access (OFDMA) technology for downlink multi-carrier transmission. To meet the Quality of Service (QoS) requirements of LTE networks, packet scheduling is employed. Packet scheduling determines when and how users' packets are transmitted to the receiver; the effective design of packet scheduling algorithms is therefore an important research topic. The aims of packet scheduling are maximizing system throughput, guaranteeing fairness among users, and minimizing either or both Packet Loss Ratio (PLR) and packet delay. In this paper, the performance of two packet scheduling algorithms, namely Log Maximum-Largest Weighted Delay First (LOG-MLWDF) and Max Delay Unit (MDU), developed for OFDM (Orthogonal Frequency Division Multiplexing) networks, is investigated in LTE downlink networks, and a comparison of those algorithms with a well-known scheduling algorithm, namely Maximum-Largest Weighted Delay First (MLWDF), is studied. The performance evaluation was in terms of system throughput, PLR, and fairness index. The study covers both real-time (voice and video streaming) and non-real-time (best-effort) traffic. Results show that, for streaming flows, LOG-MLWDF achieves the best PLR performance among the considered scheduling schemes, and for best-effort flows it outperforms the other two algorithms in terms of packet delay and throughput.
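As background for the comparison above: the classic M-LWDF scheduler assigns each resource block to the user maximizing a_i · W_i · r_i/R_i, where W_i is the head-of-line delay, r_i/R_i the ratio of instantaneous to average rate, and a_i = -log(delta_i)/T_i encodes the delay budget T_i and target loss probability delta_i; the LOG variant replaces the linear delay term with a logarithmic one. A minimal sketch (the exact LOG-MLWDF weighting varies across papers, so this form is an illustrative assumption, not the formulation evaluated in the paper):

```python
import math

def mlwdf_priority(hol_delay, inst_rate, avg_rate, delay_budget, max_plr):
    """M-LWDF metric: a_i * W_i * r_i / R_i, with a_i = -log(PLR_target) / T_i."""
    a = -math.log(max_plr) / delay_budget
    return a * hol_delay * inst_rate / avg_rate

def log_mlwdf_priority(hol_delay, inst_rate, avg_rate, delay_budget, max_plr):
    """LOG variant (assumed form): the delay term grows logarithmically,
    so users close to their delay budget are prioritized less aggressively."""
    a = -math.log(max_plr) / delay_budget
    return math.log(1.0 + a * hol_delay) * inst_rate / avg_rate
```

On each transmission interval the scheduler evaluates the chosen metric for every active user on every resource block and serves the maximizer, which is how delay, loss target, and channel quality are traded off in one number.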
5GNOW: Challenging the LTE Design Paradigms of Orthogonality and Synchronicity
LTE and LTE-Advanced have been optimized to deliver high bandwidth pipes to
wireless users. The transport mechanisms have been tailored to maximize single
cell performance by enforcing strict synchronism and orthogonality within a
single cell and within a single contiguous frequency band. Various emerging
trends reveal major shortcomings of those design criteria: 1) The fraction of
machine-type-communications (MTC) is growing fast. Transmissions of this kind
are suffering from the bulky procedures necessary to ensure strict synchronism.
2) Collaborative schemes have been introduced to boost capacity and coverage
(CoMP), and wireless networks are becoming more and more heterogeneous
following the non-uniform distribution of users. Tremendous efforts must be
spent to collect the gains and to manage such systems under the premise of
strict synchronism and orthogonality. 3) The advent of the Digital Agenda and
the introduction of carrier aggregation are forcing the transmission systems to
deal with fragmented spectrum. 5GNOW is a European research project supported
by the European Commission within FP7 ICT Call 8. It will question the design
targets of LTE and LTE-Advanced having these shortcomings in mind and the
obedience to strict synchronism and orthogonality will be challenged. It will
develop new PHY and MAC layer concepts being better suited to meet the upcoming
needs with respect to service variety and heterogeneous transmission setups.
Wireless transmission networks following the outcomes of 5GNOW will be better
suited to meet the manifoldness of services, device classes and transmission
setups present in envisioned future scenarios like smart cities. The
integration of systems relying heavily on MTC into the communication network
will be eased. The per-user experience will be more uniform and satisfying. To
ensure this, 5GNOW will contribute to upcoming 5G standardization.
Comment: Submitted to the Workshop on Mobile and Wireless Communication Systems for 2020 and Beyond (at IEEE VTC 2013, Spring).
Goodbye, ALOHA!
The vision of the Internet of Things (IoT) to interconnect and Internet-connect everyday people, objects, and machines poses new challenges in the design of wireless communication networks. The design of medium access control (MAC) protocols has traditionally been an intense area of research due to their high impact on the overall performance of wireless communications. The majority of research activities in this field deal with different variations of protocols somehow based on ALOHA, either with or without listen-before-talk, i.e., carrier sensing multiple access. These protocols operate well under low traffic loads and a low number of simultaneous devices. However, they suffer from congestion as the traffic load and the number of devices increase. For this reason, unless revisited, the MAC layer can become a bottleneck for the success of the IoT. In this paper, we provide an overview of the existing MAC solutions for the IoT, describing current limitations and envisioned challenges for the near future. Motivated by those, we identify a family of simple algorithms based on distributed queueing (DQ), which can operate for an infinite number of devices generating any traffic load and pattern. A description of the DQ mechanism is provided, and the most relevant existing studies of DQ applied in different scenarios are described. In addition, we provide a novel performance evaluation of DQ when applied to the IoT. Finally, a description of the very first demo of DQ for its use in the IoT is also included.
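The congestion the abstract refers to is visible in the classic slotted-ALOHA throughput S = G·e^(-G), where G is the offered load in packets per slot: throughput peaks at G = 1 (S = 1/e ≈ 0.368) and collapses as load grows, which is precisely the regime DQ is designed to survive. A quick numeric sketch:

```python
import math

def slotted_aloha_throughput(g):
    """Classic slotted-ALOHA throughput S = G * exp(-G): the probability
    that a slot carries exactly one transmission under Poisson offered
    load G (packets per slot)."""
    return g * math.exp(-g)

# Throughput rises toward the G = 1 peak, then collapses under overload.
loads = [0.5, 1.0, 2.0, 5.0]
throughputs = [round(slotted_aloha_throughput(g), 3) for g in loads]
```

The collapse beyond G = 1 is the "bottleneck for the success of the IoT" described above: adding devices past the peak strictly reduces the number of successful transmissions.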
Massive Non-Orthogonal Multiple Access for Cellular IoT: Potentials and Limitations
The Internet of Things (IoT) promises ubiquitous connectivity of everything
everywhere, which represents the biggest technology trend in the years to come.
It is expected that by 2020 over 25 billion devices will be connected to
cellular networks, far beyond the number of devices in current wireless
networks. Machine-to-Machine (M2M) communications aims at providing the
communication infrastructure for enabling IoT by facilitating the billions of
multi-role devices to communicate with each other and with the underlying data
transport infrastructure without, or with little, human intervention. Providing
this infrastructure will require a dramatic shift from the current protocols
mostly designed for human-to-human (H2H) applications. This article reviews
recent 3GPP solutions for enabling massive cellular IoT and investigates
random access strategies for M2M communications, showing that cellular
networks must evolve to handle the new ways in which devices will connect and
communicate with the system. A massive non-orthogonal multiple access (NOMA)
technique is then presented as a promising solution to support a massive number
of IoT devices in cellular networks, where we also identify its practical
challenges and future research directions.
Comment: To appear in IEEE Communications Magazine.
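The core idea of power-domain NOMA mentioned above can be made concrete with the standard two-user downlink model: both users share the same resource, the far (weak) user gets most of the transmit power and treats the near user's signal as interference, while the near (strong) user performs successive interference cancellation (SIC) before decoding its own signal. A minimal sketch, with illustrative (assumed) power splits and channel gains:

```python
import math

def noma_rates(p_total, a_near, g_near, g_far, noise=1.0):
    """Two-user downlink power-domain NOMA with SIC.
    a_near is the power fraction for the near (strong) user; the far
    (weak) user gets the remaining 1 - a_near."""
    a_far = 1.0 - a_near
    # Far user decodes its own signal, treating the near user's as interference.
    r_far = math.log2(1 + a_far * p_total * g_far / (a_near * p_total * g_far + noise))
    # Near user cancels the far user's signal (SIC), then decodes interference-free.
    r_near = math.log2(1 + a_near * p_total * g_near / noise)
    return r_near, r_far
```

Because both users occupy the whole resource simultaneously, many more devices can be multiplexed than with orthogonal access, at the cost of the SIC receiver complexity and the practical challenges (imperfect cancellation, power-allocation signalling) the article identifies.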
V2X Meets NOMA: Non-Orthogonal Multiple Access for 5G Enabled Vehicular Networks
Benefited from the widely deployed infrastructure, the LTE network has
recently been considered as a promising candidate to support the
vehicle-to-everything (V2X) services. However, with a massive number of devices
accessing the V2X network in the future, the conventional OFDM-based LTE
network faces congestion issues due to the low efficiency of orthogonal
access, resulting in significant access delay and posing a great challenge,
especially to safety-critical applications. The non-orthogonal multiple access
(NOMA) technique has been well recognized as an effective solution for the
future 5G cellular networks to provide broadband communications and massive
connectivity. In this article, we investigate the applicability of NOMA in
supporting cellular V2X services to achieve low latency and high reliability.
Starting with a basic V2X unicast system, a novel NOMA-based scheme is proposed
to tackle the technical hurdles in designing spectrally efficient scheduling
and resource allocation schemes in the ultra dense topology. We then extend it
to a more general V2X broadcasting system. Other NOMA-based extended V2X
applications and some open issues are also discussed.
Comment: Accepted by IEEE Wireless Communications Magazine.
Two-Layered Superposition of Broadcast/Multicast and Unicast Signals in Multiuser OFDMA Systems
We study optimal delivery strategies for one common message and independent
messages from a source to multiple users in wireless environments. In
particular, two-layered superposition of broadcast/multicast and unicast
signals is considered in a downlink multiuser OFDMA system. In the literature
and industry, the two-layer superposition is often considered as a pragmatic
approach to make a compromise between the simple but suboptimal orthogonal
multiplexing (OM) and the optimal but complex fully-layered non-orthogonal
multiplexing. In this work, we show that only two layers are necessary to
achieve the maximum sum-rate when the common message has higher priority than
the individual unicast messages, and OM cannot be sum-rate optimal in
general. We develop an algorithm that finds the optimal power allocation over
the two layers and across the OFDMA radio resources in static channels and a
class of fading channels. Two main use-cases are considered: i) Multicast and
unicast multiplexing when users with uplink capabilities request both
common and independent messages, and ii) broadcast and unicast multiplexing
when the common message targets receive-only devices and users with uplink
capabilities additionally request independent messages. Finally, we develop a
transceiver design for broadcast/multicast and unicast superposition
transmission based on LTE-A-Pro physical layer and show with numerical
evaluations in mobile environments with multipath propagation that the capacity
improvements can be translated into significant practical performance gains
compared to the orthogonal schemes in the 3GPP specifications. We also analyze
the impact of real channel estimation and show that significant gains in terms
of spectral efficiency or coverage area are still available even with
estimation errors and imperfect interference cancellation for the two-layered
superposition system.
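The two-layer structure above has a simple rate characterization: every user must decode the common (broadcast/multicast) layer while treating the unicast layer as interference, so the common rate is set by the worst channel; a user then cancels the common layer and decodes its unicast layer interference-free. A minimal sketch under these standard assumptions (the power split beta and the choice of scheduling the best-channel user for the unicast layer are illustrative, not the paper's optimized allocation):

```python
import math

def layered_rates(p_total, beta, gains, noise=1.0):
    """Two-layer superposition: power fraction beta goes to the common
    layer, 1 - beta to a unicast layer decoded after cancelling the
    common layer (SIC). gains = per-user channel power gains."""
    # Every user must decode the common layer, so its rate is set by
    # the worst channel, with the unicast layer seen as interference.
    r_common = min(
        math.log2(1 + beta * p_total * g / ((1 - beta) * p_total * g + noise))
        for g in gains
    )
    # Unicast rate for the scheduled (best-channel) user, after SIC.
    g_best = max(gains)
    r_unicast = math.log2(1 + (1 - beta) * p_total * g_best / noise)
    return r_common, r_unicast
```

Sweeping beta over [0, 1] traces the trade-off between the two layers; the paper's contribution is finding this allocation optimally across all OFDMA resources rather than per subcarrier in isolation.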
Advanced Coordinated Beamforming for the Downlink of Future LTE Cellular Networks
Modern cellular networks in traditional frequency bands are notoriously
interference-limited especially in urban areas, where base stations are
deployed in close proximity to one another. The latest releases of Long Term
Evolution (LTE) incorporate features for coordinating downlink transmissions as
an efficient means of managing interference. Recent field trial results and
theoretical studies of the performance of joint transmission (JT) coordinated
multi-point (CoMP) schemes revealed, however, that their gains are not as high
as initially expected, despite the large coordination overhead. These schemes
are known to be very sensitive to defects in synchronization or information
exchange between coordinating base stations, as well as uncoordinated
interference. In this article, we review recent advanced coordinated
beamforming (CB) schemes as alternatives, requiring less overhead than JT CoMP
while achieving good performance in realistic conditions. By stipulating that,
in certain LTE scenarios of increasing interest, uncoordinated interference
constitutes a major factor in the performance of CoMP techniques at large, we
hereby assess the resilience of the state-of-the-art CB to uncoordinated
interference. We also describe how these techniques can leverage the latest
specifications of current cellular networks, and how they may perform when we
consider standardized feedback and coordination. This allows us to identify
some key roadblocks and research directions to address as LTE evolves towards
the future of mobile communications.
Comment: 16 pages, 6 figures, accepted to IEEE Communications Magazine.
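The article surveys several CB schemes without fixing one; as a simple illustrative baseline (not one of the schemes reviewed above), a zero-forcing precoder shows the basic CB idea of shaping downlink beams so that each transmission lands in the null space of the other users' channels:

```python
import numpy as np

def zero_forcing_precoder(H):
    """Zero-forcing precoder for a multi-antenna downlink:
    W = H^H (H H^H)^{-1}, columns normalized to unit power.
    H is (num_users x num_tx_antennas); the effective channel H @ W
    is diagonal, i.e. inter-user interference is nulled."""
    W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
    return W / np.linalg.norm(W, axis=0, keepdims=True)
```

Zero forcing needs accurate channel knowledge and pays a noise-enhancement penalty in ill-conditioned channels, which is exactly why the resilience of more advanced CB schemes to uncoordinated interference and quantized feedback, as assessed in the article, matters in practice.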