
    Will SDN be part of 5G?

    For many, this is no longer a valid question: the case is considered settled, with SDN/NFV (Software Defined Networking / Network Function Virtualization) providing the innovation enablers that will solve many outstanding 5G management issues. However, given the monumental task of softwarizing the radio access network (RAN) while 5G is just around the corner, and with some companies already unveiling their 5G equipment, there is a realistic concern that we may see only point solutions involving SDN technology instead of a fully SDN-enabled RAN. This survey identifies the important obstacles in the way and reviews the state of the art of the relevant solutions. It differs from previous surveys on SDN-based RAN in that it focuses on the salient problems and discusses solutions proposed both within and outside the SDN literature. Our main focus is on the fronthaul, backward compatibility, the supposedly disruptive nature of SDN deployment, business cases and monetization of SDN-related upgrades, the latency of general purpose processors (GPPs), and the additional security vulnerabilities that softwarization brings to the RAN. We also provide a summary of the architectural developments in the SDN-based RAN landscape, as not all work can be covered under the focused issues. This paper provides a comprehensive survey of the state of the art of SDN-based RAN and clearly points out the gaps in the technology. Comment: 33 pages, 10 figures

    Forward error correction in 5G heterogeneous network

    This research investigates the feasibility of implementing a complete polar FEC (forward error correction) chain of the 5th generation cellular mobile communication standard in software, specifically on general purpose processors. The work attempts to achieve the stringent latency requirements through software, algorithmic, and platform-specific optimizations. Many algorithms in the FEC chain are optimized for hardware implementations, and implementing them directly in software results in poor performance. To obtain the best latency on general purpose processors, these algorithms are modified or reformulated to suit the processor architecture and a software implementation. Initially, both the encoding and decoding FEC chains are implemented naively, without any optimization. Code profiling is performed on this naive implementation to identify the significant latency contributors. The algorithms of the significant latency-contributing components, such as the CRC calculation used in 5G cellular networks, are then split into primitive operations, which are optimized either with software techniques or by mapping them onto specialized functional units of a general-purpose processor. These optimizations reduced the worst-case latency of the encoding FEC chain from 158 µs, a more than 10x reduction in latency
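    The polar encoder at the core of such an FEC chain reduces to log2(N) stages of XOR butterflies, which is one reason it maps well onto general-purpose processors. A minimal, unoptimized sketch of our own (not the paper's optimized implementation), assuming the standard Arikan kernel F = [[1,0],[1,1]] and a power-of-two block length:

    ```python
    def polar_encode(u):
        """Encode bit vector u (length N = 2^n) with the Arikan polar transform.

        Computes x = u * G over GF(2), where G is the n-fold Kronecker
        power of F = [[1,0],[1,1]], using in-place XOR butterflies.
        """
        x = list(u)
        n = len(x)
        assert n and (n & (n - 1)) == 0, "block length must be a power of two"
        step = 1
        while step < n:  # log2(N) butterfly stages
            for i in range(0, n, 2 * step):
                for j in range(i, i + step):
                    x[j] ^= x[j + step]  # XOR butterfly
            step *= 2
        return x

    # A single 1 in the last input position yields the all-ones codeword:
    print(polar_encode([0, 0, 0, 1]))  # [1, 1, 1, 1]
    ```

    The inner XOR loop is the part that a real implementation would vectorize or map to wide SIMD registers, which is the kind of platform-specific optimization the abstract describes.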

    The AMSC mobile satellite system

    The American Mobile Satellite Consortium (AMSC) Mobile Satellite Service (MSS) system is described. AMSC will use three multi-beam satellites to provide L-band MSS coverage to the United States, Canada and Mexico. The AMSC MSS system will have several noteworthy features, including a priority assignment processor that will ensure preemptive access to emergency services, a flexible SCPC channel scheme that will support a wide diversity of services, enlarged system capacity through frequency and orbit reuse, and high effective satellite transmitted power. Each AMSC satellite will make use of 14 MHz (bi-directional) of L-band spectrum. The Ku-band will be used for feeder links

    Resource Allocation in 4G and 5G Networks: A Review

    The advent of 4G and 5G broadband wireless networks brings several challenges with respect to resource allocation. In an interconnected network, users and devices all compete for scarce resources, which emphasizes the need for fair and efficient allocation of those resources for the proper functioning of the networks. The purpose of this study is to discover the different factors involved in resource allocation in 4G and 5G networks. The methodology was an empirical study using qualitative techniques: reviewing the literature on the state of the art in 4G and 5G networks, analyzing their respective architectures and resource allocation mechanisms, discovering parameters and criteria, and providing recommendations. It was observed that resource allocation in 4G and 5G networks primarily concerns radio resources, owing to their wireless nature, and that resource allocation is measured in terms of delay, fairness, packet loss ratio, spectral efficiency, and throughput. Minimal consideration is given to other resources along the end-to-end 4G and 5G network architectures. This paper defines more types of resources, such as electrical energy, processor cycles, and memory space, along end-to-end architectures, whose allocation needs to be emphasized owing to the inclusion of software defined networking and network function virtualization in 5G network architectures. Accordingly, additional criteria, such as electrical energy usage, processor cycles, and memory usage, are proposed to evaluate resource allocation. Finally, ten recommendations are made to enhance resource allocation along the whole 5G network architecture
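    Among the evaluation metrics the abstract lists, fairness is the one that needs an explicit formula; a common choice is Jain's fairness index (our illustrative pick, not one the abstract names). A minimal sketch:

    ```python
    def jains_index(allocations):
        """Jain's fairness index: (sum x)^2 / (n * sum x^2).

        Returns 1.0 for a perfectly equal allocation and approaches
        1/n as a single user captures the entire resource.
        """
        n = len(allocations)
        total = sum(allocations)
        sum_sq = sum(x * x for x in allocations)
        return (total * total) / (n * sum_sq)

    # Equal throughput shares are perfectly fair:
    print(jains_index([10.0, 10.0, 10.0, 10.0]))  # 1.0
    # A skewed allocation scores much lower:
    print(jains_index([40.0, 5.0, 3.0, 2.0]))
    ```

    The same index applies unchanged to the additional resources the paper proposes, such as processor cycles or memory, since it only needs a per-user allocation vector.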

    Quantifying Potential Energy Efficiency Gain in Green Cellular Wireless Networks

    Conventional cellular wireless networks were designed to provide high throughput for the user and high capacity for the service provider, without any provisions for energy efficiency. As a result, these networks have an enormous carbon footprint. In this paper, we describe the sources of the inefficiencies in such networks. First we present results of studies on how large a carbon footprint such networks generate. We also discuss the expected growth in mobile traffic, which will increase this carbon footprint even further. We then discuss specific sources of inefficiency and potential sources of improvement at the physical layer as well as at higher layers of the communication protocol hierarchy. In particular, considering that most of the energy inefficiency in cellular wireless networks is at the base stations, we discuss multi-tier networks and point to the potential of exploiting mobility patterns in order to use base station energy judiciously. We then investigate potential methods to reduce this inefficiency and quantify their individual contributions. By considering the combination of all potential gains, we conclude that an improvement in energy consumption in cellular wireless networks by two orders of magnitude, or even more, is possible. Comment: arXiv admin note: text overlap with arXiv:1210.843

    The AMSC mobile satellite system: Design summary and comparative analysis

    Mobile satellite communications will be provided in the United States by the American Mobile Satellite Consortium (AMSC). Telesat Mobile, Inc. (TMI) and AMSC are jointly developing MSAT, the first regional Mobile Satellite Service (MSS) system. MSAT will provide diverse mobile communications services - including voice, data and position location - to mobiles on land, water, and in the air throughout North America. Described here are the institutional relationships between AMSC, TMI and other organizations participating in MSAT, including the Canadian Department of Communications and NASA. The regulatory status of MSAT in the United States and international allocations to MSS are reviewed. The baseline design is described