
    Will SDN be part of 5G?

    For many, this is no longer a valid question: the case is considered settled, with SDN/NFV (Software Defined Networking / Network Function Virtualization) providing the inevitable innovation enablers that solve many outstanding management issues in 5G. However, given the monumental task of softwarizing the radio access network (RAN) while 5G is just around the corner, and with some companies already unveiling their 5G equipment, there is a realistic concern that we may only see point solutions involving SDN technology rather than a fully SDN-enabled RAN. This survey identifies the important obstacles in the way and reviews the state of the art of the relevant solutions. It differs from previous surveys on SDN-based RAN in that it focuses on the salient problems and discusses solutions proposed both within and outside the SDN literature. Our main focus is on fronthaul, backward compatibility, the supposedly disruptive nature of SDN deployment, business cases and monetization of SDN-related upgrades, the latency of general-purpose processors (GPPs), and the additional security vulnerabilities that softwarization brings to the RAN. We also provide a summary of architectural developments in the SDN-based RAN landscape, since not all work can be covered under the focused issues. This paper provides a comprehensive survey of the state of the art of SDN-based RAN and clearly points out the gaps in the technology.
    Comment: 33 pages, 10 figures

    Techno-economical Analysis of Indoor Enterprise Solutions


    Millimetre-Wave Fibre-Wireless Technologies for 5G Mobile Fronthaul

    The unprecedented growth in mobile data traffic, driven primarily by bandwidth-rich applications and high-definition video, is accelerating the development of the fifth-generation (5G) mobile network. As the mobile access network evolves towards centralisation, the mobile fronthaul (MFH) architecture becomes essential in providing high-capacity, ubiquitous and yet affordable services to subscribers. In order to meet the demand for high data rates in the access, millimetre-wave (mmWave) technology has been highlighted as essential to the development of 5G New Radio (5G-NR). In the present MFH architecture, which is typically based on the common public radio interface (CPRI) protocol, baseband signals are digitised before fibre transmission, incurring high overhead and stringent synchronisation requirements. A direct application of mmWave 5G-NR to CPRI digital MFH, where the signal bandwidth is expected to be up to 1 GHz, will be challenging due to the increased complexity of the digitising interface and the huge overhead required for such a bandwidth. Alternatively, the radio-over-fibre (RoF) technique can be employed to transport mmWave wireless signals over the MFH link, avoiding the expensive digitisation interface and the excessive overhead associated with it. Additionally, the mmWave carrier can be realised with the aid of photonic components employed in the RoF link, further reducing system complexity. However, the noise and nonlinearities inherent to analogue transmission present implementation challenges, limiting the system dynamic range. It is therefore important to investigate the effects of these impairments in an RoF-based MFH architecture. This thesis presents extensive research on the impact of noise and nonlinearities on 5G candidate waveforms in mmWave 5G fibre-wireless MFH. Besides orthogonal frequency division multiplexing (OFDM), another radio access technology (RAT) that has received significant attention is filter bank multicarrier (FBMC), particularly due to its high spectral containment and excellent performance in asynchronous transmission. Hence, the FBMC waveform is adopted in this work to study the impact of noise and nonlinearities on the mmWave fibre-wireless MFH architecture. Since OFDM is widely deployed and has been adopted for 5G-NR, the performance of OFDM- and FBMC-based 5G mmWave RATs in the fibre-wireless MFH architecture is compared for several implementations and transmission scenarios. To this end, an end-to-end transmission testbed is designed and implemented using the industry-standard VPI Transmission Maker® to investigate five mmWave upconversion techniques. Simulation results show that the impact of noise is higher in FBMC when the signal-to-noise ratio (SNR) is low; however, FBMC exhibits better performance than OFDM as the SNR improves. More importantly, an evaluation of the contribution of each noise component to the overall system SNR is carried out. It is observed that the noise contribution from the optical carriers employed in the heterodyne upconversion of intermediate frequency (IF) signals to mmWave frequencies dominates the system noise. An adaptive modulation technique is employed to optimise the system throughput based on the received SNR. The throughput of the FBMC-based system is significantly reduced compared to OFDM, due to laser phase noise and chromatic dispersion (CD).
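    To put the digitised-fronthaul scaling problem in perspective, the back-of-envelope sketch below estimates the CPRI-style serial line rate for one digitised antenna-carrier. The oversampling factor, 15-bit I/Q sample width, 16/15 control-word overhead and 8b/10b line coding are generic CPRI-style assumptions, not figures taken from this thesis.

```python
# Back-of-envelope CPRI line-rate estimate for a digitised fronthaul.
# Assumptions (illustrative, not taken from the thesis): complex baseband
# sampling with modest oversampling, 15-bit I/Q samples, the CPRI 16/15
# control-word overhead, 8b/10b line coding, one antenna-carrier.

def cpri_line_rate(bandwidth_hz, oversampling=1.2, bits_per_sample=15,
                   control_overhead=16 / 15, line_coding=10 / 8):
    """Serial line rate (bit/s) to carry one digitised antenna-carrier."""
    sample_rate = bandwidth_hz * oversampling        # complex samples per second
    payload = sample_rate * 2 * bits_per_sample      # I and Q components
    return payload * control_overhead * line_coding  # framing + 8b/10b coding

# Sanity check against LTE: a 20 MHz carrier sampled at 30.72 MS/s gives the
# familiar 1.2288 Gb/s CPRI option-2 rate.
print(f"LTE 20 MHz carrier: {cpri_line_rate(20e6, oversampling=1.536) / 1e9:.4f} Gb/s")
# A 1 GHz mmWave 5G-NR carrier needs roughly 40x that, per antenna port.
print(f"NR 1 GHz carrier:   {cpri_line_rate(1e9) / 1e9:.1f} Gb/s")
```

    At tens of gigabits per second per antenna port, before any MIMO scaling, the digitising interface quickly becomes the bottleneck, which is the motivation for the analogue RoF alternative described above.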
    Additionally, it is shown that by employing a frequency-domain averaging technique to enhance the channel estimation (CE), the throughput of FBMC is significantly increased and, consequently, a comparable performance is obtained for both waveforms. Furthermore, several coexistence scenarios for multi-service transmission are studied, considering OFDM- and FBMC-based RATs, to evaluate the impact of inter-band interference (IBI) caused by power amplifier (PA) nonlinearity on system performance. The low out-of-band (OOB) emission of FBMC plays an important role in minimising IBI to adjacent services; FBMC therefore requires less guardband when coexisting with multiple services in 5G fibre-wireless MFH. Conversely, OFDM introduces significant OOB emission to adjacent services, requiring a large guardband in multi-service coexistence scenarios. Finally, a novel transmission scheme is proposed and investigated to simultaneously generate multiple mmWave signals using the laser-heterodyning mmWave upconversion technique. With an appropriate IF and optical frequency plan, several mmWave signals can be realised. Simulation results demonstrate the successful simultaneous generation of 28 GHz, 38 GHz, and 60 GHz mmWave signals.
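    As a rough illustration of the laser-heterodyning arithmetic behind such a scheme, the sketch below computes the photodiode beat frequencies produced by a set of optical tones. The tone offsets are hypothetical, chosen only to show how one optical frequency plan can yield 28 GHz, 38 GHz and 60 GHz simultaneously; the thesis's actual IF and optical frequency plan may differ.

```python
# Beat-frequency arithmetic behind photonic (laser-heterodyning) mmWave
# upconversion: two optical tones separated by delta_f beat on a photodiode
# and produce an electrical carrier at delta_f.

from itertools import combinations

# Optical tone offsets from a common reference laser, in GHz (assumed plan).
tones_ghz = {"carrier": 0.0, "LO1": 28.0, "LO2": 38.0, "LO3": 60.0}

for (name1, f1), (name2, f2) in combinations(tones_ghz.items(), 2):
    beat = abs(f1 - f2)  # photodiode output at the difference frequency
    print(f"{name1} x {name2}: beat at {beat:.1f} GHz")

# The carrier-LO beats give the wanted 28, 38 and 60 GHz signals; the LO-LO
# cross-beats (10, 22 and 32 GHz here) are unwanted products that a practical
# frequency plan must keep out of band or filter away.
```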

    Efficient energy management in ultra-dense wireless networks

    The increase in demand for network capacity has driven the evolution of wireless networks from largely heterogeneous networks (HetNets) to today's ultra-dense networks (UDNs). In UDNs, small cells are densely deployed with the goal of shortening the physical distance between base stations (BSs) and user equipment (UEs), so as to support more UEs at peak times while ensuring high data rates. Compared to HetNets, UDNs have many advantages: more network capacity, higher flexibility for routine configurations, and better suitability for load balancing, hence fewer blind spots and a lower call-blocking probability. In practice, however, the high density of small cells deployed in UDNs raises a number of issues. These include efficient radio resource management, user-cell association, inter- and intra-cell interference management and, last but not least, efficient energy consumption. Some of these issues, which impact overall network efficiency, are largely due to the use of obsolete algorithms, especially those whose resource allocation is based solely on received signal power (RSSP). In this work, the focus is solely on the efficient energy management dilemma and how to optimally reduce the overall network energy consumption. Through an extensive literature review, a detailed report on the growing concern of efficient energy management in UDNs is provided in Chapter 2. The review highlights the classification and evolution of mobile wireless technologies and mobile wireless networks in general, and explains why energy consumption has become a serious concern in ultra-dense networks, as well as the various techniques and measures taken to mitigate it. It is shown that, due to the increasing carbon footprint of mobile wireless systems, which carries a serious negative environmental impact, and the general need of network operators to lower operating costs, the management of energy consumption is rising in priority. Using the architecture of a fourth-generation Long Term Evolution (4G-LTE) ultra-dense network, the report further shows that more than 65% of the overall energy consumption occurs in the access network, and in the base stations in particular. This explains why most attention in energy-efficiency management in UDNs is centred on reducing the energy consumption of the deployed base stations rather than of other network components such as data servers or backhauling. Furthermore, the report provides detailed information on the methods and techniques, their classification and implementation, as well as a critical analysis of these implementations in the literature. This study proposes a sub-optimal algorithm, a Distributed Cell Resource Allocation with Base Station On/Off scheme, that aims to reduce the overall base-station power consumption in UDNs while ensuring that the overall Quality of Service (QoS) for each UE, as specified in its service class, is met. The system model, and hence the Network Energy Efficiency (NEE) optimization problem, is formulated using stochastic geometry.
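    A minimal sketch of the static-plus-load-dependent base-station power model described above is given below. The parameter values are illustrative, loosely in the range of published macro- and small-cell power models (e.g. the EARTH project figures), and are not taken from this thesis.

```python
# Static-plus-load-dependent base-station power model: a fixed draw whenever
# the BS is on, a load-proportional term for the transmit chain, and a lower
# residual draw when the cell is switched off.

from dataclasses import dataclass

@dataclass
class BSPowerModel:
    p_static_w: float  # site cooling + circuit power, drawn whenever the BS is on
    slope: float       # load-dependent scaling of the transmit chain
    p_sleep_w: float   # residual draw when the cell is switched off

    def power(self, p_tx_w: float, active: bool) -> float:
        """Total input power (W) for a given RF transmit power."""
        if not active:
            return self.p_sleep_w
        return self.p_static_w + self.slope * p_tx_w

macro = BSPowerModel(p_static_w=130.0, slope=4.7, p_sleep_w=75.0)
small = BSPowerModel(p_static_w=6.8, slope=4.0, p_sleep_w=4.3)

print(macro.power(p_tx_w=20.0, active=True))   # fully loaded macro cell
print(small.power(p_tx_w=0.25, active=False))  # small cell put to sleep
```

    The gap between the active and sleep terms is precisely what a base-station on/off scheme exploits.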
    The network model comprises both evolved Node B (eNB) type macro cells and small cells operating on different frequency bands, and takes into account factors that impact NEE such as UE mobility, UE spatial distribution and small-cell spatial distribution. The channel model accounts for signal interference from all base stations, path loss, fading, log-normal shadowing, and the modulation and coding schemes used on each UE's communication channels when computing throughput. The power consumption model captures both static (site cooling, circuit power) and active (transmission or load-based) base-station power consumption. The formulation of the NEE optimization problem takes into consideration each user's QoS, inter-cell interference, spectral efficiency and coverage/success probability. The formulated NEE optimization problem is NP-hard due to the user-cell association. The proposed solution uses constraint relaxation to transform the NP-hard problem into a more tractable, convex, linear optimization problem; this, combined with Lagrangian dual decomposition, yields a distributed solution. After the cell-association and resource-allocation phases, the proposed solution performs cell on/off to further reduce power consumption. Then, using computer simulation tools, the performance of the "Distributed Resource Allocation with Cell On/Off" scheme is analysed and evaluated against four other resource allocation schemes across a number of different network scenarios. Finally, the statistical and mathematical results generated through the simulations indicate that the proposed scheme is the closest in NEE performance to the exhaustive search algorithm, and hence superior to the other sub-optimal algorithms to which it is compared.
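    For illustration, the sketch below computes NEE as aggregate throughput over aggregate power and applies a naive greedy cell on/off pass. This is only the evaluation metric and the on/off idea in miniature, not the thesis's algorithm: the proposed scheme solves the relaxed problem via Lagrangian dual decomposition, which is not reproduced here. The `qos_ok` callback and all numbers are hypothetical.

```python
# Illustrative NEE metric plus a naive greedy cell on/off pass (a stand-in
# for the thesis's relaxation/dual-decomposition scheme, which is not
# reproduced here).

def network_energy_efficiency(throughputs_bps, powers_w):
    """NEE in bit/joule: total delivered rate over total consumed power."""
    return sum(throughputs_bps) / sum(powers_w)

def greedy_cell_onoff(cells, qos_ok):
    """Switch off lightly loaded cells one at a time while QoS still holds.

    `cells` maps cell id -> (throughput_bps, power_w); `qos_ok(active)` is a
    hypothetical callback that re-checks every UE's QoS after re-association.
    """
    active = set(cells)
    for cid in sorted(cells, key=lambda c: cells[c][0]):  # least loaded first
        trial = active - {cid}
        if trial and qos_ok(trial):
            active = trial  # the switch-off sticks only if QoS survives
    return active

# Toy example with one macro and two small cells (hypothetical numbers).
cells = {"macro": (180e6, 350.0), "sc1": (30e6, 38.0), "sc2": (2e6, 38.0)}
tps, pws = zip(*cells.values())
print(f"NEE, all cells on: {network_energy_efficiency(tps, pws) / 1e3:.0f} kbit/J")
print("active after on/off:", greedy_cell_onoff(cells, qos_ok=lambda a: "macro" in a))
```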