Cellular system information capacity change at higher frequencies due to propagation loss and system parameters
In this paper, mathematical analysis supported by computer simulation is used to study the change in cellular system information capacity due to propagation loss and system parameters (such as path loss exponent, shadowing, and antenna height) at microwave carrier frequencies greater than 2 GHz and at smaller cell radii. An improved co-channel interference model, which includes the second-tier co-channel interfering cells, is used for the analysis. System performance is measured in terms of the uplink information capacity of a time-division multiple access (TDMA) based cellular wireless system. The analysis and simulation results show that the second-tier co-channel interfering cells become active at higher microwave carrier frequencies and smaller cell radii. The results show that, under distance-dependent path loss, shadowing, and effective road height, the uplink information capacity of the cellular wireless system decreases as the carrier frequency increases and the cell radius R decreases. For example, at a carrier frequency fc = 15.75 GHz, basic path loss
exponent α = 2, and cell radii R = 100, 500, and 1000 m, the decrease in information capacity was 20%, 5.29%, and 2.68%, respectively.
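As a minimal sketch of the interference geometry described above, the snippet below computes the uplink cell-edge signal-to-interference ratio with both tiers of co-channel cells counted (6 first-tier and 12 second-tier interferers at the standard hexagonal reuse distances). It uses a single-slope log-distance loss, not the paper's full model: with a single slope the frequency term cancels in the SIR, so the frequency and cell-size dependence reported in the paper enters only through its more detailed shadowing and effective-road-height modelling. All function names are illustrative.

```python
import math

def path_loss_db(d_m, fc_ghz, alpha):
    # Free-space loss at a 1 m reference distance plus log-distance decay
    # with path loss exponent alpha.
    fspl_1m_db = 20 * math.log10(4 * math.pi * fc_ghz * 1e9 / 3e8)
    return fspl_1m_db + 10 * alpha * math.log10(d_m)

def uplink_sir_db(cell_radius_m, fc_ghz, alpha, reuse=7):
    # Desired user at the cell edge; 6 first-tier co-channel cells at the
    # reuse distance D = R * sqrt(3N) and 12 second-tier cells at ~2D.
    d_reuse = cell_radius_m * math.sqrt(3 * reuse)
    signal = 10 ** (-path_loss_db(cell_radius_m, fc_ghz, alpha) / 10)
    interference = 6 * 10 ** (-path_loss_db(d_reuse, fc_ghz, alpha) / 10)
    interference += 12 * 10 ** (-path_loss_db(2 * d_reuse, fc_ghz, alpha) / 10)
    return 10 * math.log10(signal / interference)

print(round(uplink_sir_db(500, 15.75, 2.0), 2))  # → 3.68 (dB)
```

Dropping the second-tier term raises the SIR, which is why earlier first-tier-only models overestimate capacity in exactly the regimes the paper studies.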
Expanding cellular coverage via cell-edge deployment in heterogeneous networks: spectral efficiency and backhaul power consumption perspectives
Heterogeneous small-cell networks (HetNets) are considered to be a standard part of future mobile networks, where operator/consumer-deployed small-cells, such as femtocells, relays, and distributed antennas (DAs), complement the existing macrocell infrastructure. This article proposes the need-oriented deployment of small-cells and device-to-device (D2D) communication around the edge of the macrocell, such that the small-cell base stations (SBSs) and D2D communication serve the cell-edge mobile users, thereby expanding the network coverage and capacity. In this context, we present competitive network configurations, namely, femto-on-edge, DA-on-edge, relay-on-edge, and D2D-communication-on-edge, where femto base stations, DA elements, relay base stations, and D2D communication, respectively, are deployed around the edge of the macrocell. The proposed deployments ensure performance gains in the network in terms of spectral efficiency and power consumption by facilitating the cell-edge mobile users with small-cells and D2D communication. In order to calibrate the impact of power consumption on system performance and network topology, this article discusses a detailed breakdown of the end-to-end power consumption, which includes the backhaul, access, and aggregation network power consumption. Several comparative simulation results quantify the improvements in spectral efficiency and power consumption of the D2D-communication-on-edge configuration, establishing it as a greener network than the other competitive configurations.
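As a rough illustration of the end-to-end power breakdown discussed above, the sketch below sums per-node access and backhaul power with a shared aggregation-network component for three of the edge configurations. All wattage figures are placeholders chosen only to reflect the qualitative ordering (D2D links need no backhaul and very little access power), not values from the article.

```python
# Placeholder per-node figures in watts; not values from the article.
EDGE_CONFIGS = {
    "femto-on-edge": {"access": 10.0, "backhaul": 8.0},
    "relay-on-edge": {"access": 15.0, "backhaul": 0.0},  # in-band wireless backhaul
    "d2d-on-edge":   {"access": 0.25, "backhaul": 0.0},  # direct links, no backhaul
}

def network_power_w(config, n_nodes, aggregation_w=50.0):
    """End-to-end power: per-node access plus backhaul consumption,
    plus a shared aggregation-network component."""
    c = EDGE_CONFIGS[config]
    return n_nodes * (c["access"] + c["backhaul"]) + aggregation_w
```

Under any such placeholder figures, the D2D-on-edge configuration comes out cheapest because it eliminates the per-node backhaul term entirely, which mirrors the article's "greener network" conclusion.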
Downlink and Uplink Decoupling: a Disruptive Architectural Design for 5G Networks
Cell association in cellular networks has traditionally been based on the
downlink received signal power only, despite the fact that up and downlink
transmission powers and interference levels differ significantly. This
approach was adequate in homogeneous networks with macro base stations all
having similar transmission power levels. However, with the growth of
heterogeneous networks where there is a big disparity in the transmit power of
the different base station types, this approach is highly inefficient. In this
paper, we study the notion of Downlink and Uplink Decoupling (DUDe) where the
downlink cell association is based on the downlink received power while the
uplink is based on the pathloss. We present the motivation and assess the gains
of this 5G design approach with simulations that are based on Vodafone's LTE
field trial network in a dense urban area, employing a high resolution
ray-tracing pathloss prediction and realistic traffic maps based on live
network measurements. Comment: 6 pages, 7 figures, conference paper, submitted to IEEE GLOBECOM 201
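The decoupled association rule described above is simple to state: pick the downlink serving cell by maximum received power, and the uplink serving cell by minimum pathloss. A minimal sketch, with cell parameters that are illustrative rather than taken from the trial network:

```python
def dude_associate(cells):
    # Downlink: maximise received power (transmit power minus pathloss, dB).
    dl = max(cells, key=lambda c: c["tx_power_dbm"] - c["pathloss_db"])
    # Uplink: minimise pathloss, independent of the cell's transmit power.
    ul = min(cells, key=lambda c: c["pathloss_db"])
    return dl["name"], ul["name"]

cells = [
    {"name": "macro", "tx_power_dbm": 46, "pathloss_db": 110},
    {"name": "small", "tx_power_dbm": 30, "pathloss_db": 100},
]
print(dude_associate(cells))  # → ('macro', 'small')
```

In this example a coupled (downlink-only) rule would attach the user to the macro for both directions, even though the small cell is 10 dB closer in pathloss terms; decoupling sends the uplink to the small cell, which is exactly the gain DUDe exploits in HetNets.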
Interference Management Based on RT/nRT Traffic Classification for FFR-Aided Small Cell/Macrocell Heterogeneous Networks
Cellular networks are constantly lagging in terms of the bandwidth needed to
support the growing high data rate demands. The system needs to efficiently
allocate its frequency spectrum such that the spectrum utilization can be
maximized while ensuring the quality of service (QoS) level. Owing to the
coexistence of different types of traffic (e.g., real-time (RT) and
non-real-time (nRT)) and different types of networks (e.g., small cell and
macrocell), ensuring the QoS level for different types of users becomes a
challenging issue in wireless networks. Fractional frequency reuse (FFR) is an
effective approach for increasing spectrum utilization and reducing
interference effects in orthogonal frequency division multiple access networks.
In this paper, we propose a new FFR scheme in which bandwidth allocation is
based on RT/nRT traffic classification. We consider the coexistence of small
cells and macrocells. After applying the FFR technique in macrocells, the remaining
frequency bands are efficiently allocated among the small cells overlaid by a
macrocell. In our proposed scheme, total frequency-band allocations for
different macrocells are decided on the basis of the traffic intensity. The
transmitted power levels for different frequency bands are controlled based on
the level of interference from a nearby frequency band. Frequency bands with a
lower level of interference are assigned to the RT traffic to ensure a higher
QoS level for the RT traffic. RT traffic calls in macrocell networks are also given a higher priority than nRT traffic calls to ensure a low call-blocking rate. Performance analyses show significant improvement under the proposed scheme compared with conventional FFR schemes.
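A toy version of the two allocation rules above (cleanest bands to RT traffic, RT calls admitted first) can be written in a few lines; it assumes each band carries one call and that per-band interference has already been measured, and all names are illustrative:

```python
def assign_bands(bands, calls):
    # Cleanest (lowest-interference) bands first.
    clean_first = sorted(bands, key=lambda b: b["interference_db"])
    # RT calls take priority over nRT (stable sort keeps arrival order).
    rt_first = sorted(calls, key=lambda c: c["type"] != "RT")
    # One band per call; calls beyond the band supply are blocked.
    return {call["id"]: band["id"] for call, band in zip(rt_first, clean_first)}

bands = [{"id": "B1", "interference_db": -80},
         {"id": "B2", "interference_db": -95},
         {"id": "B3", "interference_db": -90}]
calls = [{"id": "u1", "type": "nRT"}, {"id": "u2", "type": "RT"}]
print(assign_bands(bands, calls))  # → {'u2': 'B2', 'u1': 'B3'}
```

Here the RT call u2 receives band B2, the one with the least interference (-95 dB), even though the nRT call u1 arrived first; this is the RT-priority behaviour that keeps RT call blocking low in the proposed scheme.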
Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks
Soaring capacity and coverage demands dictate that future cellular networks
need to soon migrate towards ultra-dense networks. However, network
densification comes with a host of challenges that include compromised energy
efficiency, complex interference management, cumbersome mobility management,
burdensome signaling overheads, and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one feature common to legacy networks, i.e., the tight coupling between the control and data planes, regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data plane separation architecture (SARC) has recently been conceived as a promising paradigm with the potential to address most of the aforementioned challenges. In this article, we review the various proposals that have been presented in the literature so far to enable SARC.
More specifically, we analyze how and to what degree various SARC proposals
address the four main challenges in network densification namely: energy
efficiency, system level capacity maximization, interference management and
mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted in legacy networks at wide scale and thus remain a hallmark of 5G: coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for both in the context of 5G. This article thus serves as both a tutorial and an up-to-date survey of SARC, CoMP, and D2D. Most importantly, the
article provides an extensive outlook of challenges and opportunities that lie
at the crossroads of these three mutually entangled emerging technologies. Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201
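The core idea of the separation architecture can be sketched in a few lines: the always-on macro anchors every user's control plane (coverage, signalling, mobility anchoring), while the data plane is offloaded to the nearest small cell. The sketch below uses a 1-D toy layout and illustrative names; it is not any specific proposal from the surveyed literature.

```python
def assign_planes(ues, small_cells):
    """Control/data separation: the macro keeps the control plane for every
    UE; the data plane goes to the closest small cell, which may otherwise
    sleep — this is where the energy and mobility gains come from."""
    out = {}
    for ue in ues:
        nearest = min(small_cells, key=lambda c: abs(c["pos_m"] - ue["pos_m"]))
        out[ue["id"]] = {"control": "macro", "data": nearest["id"]}
    return out

ues = [{"id": "u1", "pos_m": 300}]
small_cells = [{"id": "s1", "pos_m": 0}, {"id": "s2", "pos_m": 450}]
print(assign_planes(ues, small_cells))
```

Because the control anchor never changes as the user moves between small cells, handovers touch only the data plane, which is one way SARC tames the mobility-management burden of dense deployments.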