
    5GAuRA. D3.3: RAN Analytics Mechanisms and Performance Benchmarking of Video, Time Critical, and Social Applications

    5GAuRA deliverable D3.3. This is the final deliverable of Work Package 3 (WP3) of the 5GAuRA project, providing a report on the project’s developments on the topics of Radio Access Network (RAN) analytics and application performance benchmarking. The focus of this deliverable is to extend and deepen the methods and results provided in the 5GAuRA deliverable D3.2 in the context of specific use scenarios of video, time-critical, and social applications. In this respect, four major topics of WP3 of 5GAuRA – namely the edge-cloud enhanced RAN architecture, the machine-learning-assisted Random Access Channel (RACH) approach, Multi-access Edge Computing (MEC) content caching, and active queue management – are put forward. Specifically, this document provides a detailed discussion of the service level agreement between tenant and service provider in the context of network slicing in Fifth Generation (5G) communication networks. Network slicing is considered a key enabler of the 5G communication system. Legacy telecommunication networks have provided various services to all kinds of customers through a single network infrastructure. In contrast, by deploying network slicing, operators are now able to partition one network into individual slices, each with its own configuration and Quality of Service (QoS) requirements. Many applications across industry open new business opportunities with new business models. Every application instance requires an independent slice with its own network functions and features, and every single slice needs an individual Service Level Agreement (SLA). In D3.3, we propose a comprehensive end-to-end structure for the SLA between the tenant and the service provider of a sliced 5G network, which balances the interests of both sides.
The proposed SLA defines the reliability, availability, and performance of the delivered telecommunication services in order to ensure that the right information is delivered to the right destination at the right time, safely and securely. We also discuss the metrics of a slice-based network SLA, such as throughput, penalty, cost, revenue, profit, and QoS-related metrics, which are, in the view of 5GAuRA, critical features of the agreement.
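The slice-based SLA metrics listed above (throughput, penalty, cost, revenue, profit) can be related in a simple model. The sketch below assumes a linear penalty for throughput shortfall; the function names, parameters, and the linear penalty model are illustrative assumptions, not the formulation proposed in D3.3.

```python
# Toy model of slice-SLA economics: a linear penalty for falling short of the
# guaranteed throughput, deducted from the provider's revenue for the slice.
# All names and the linear model are illustrative assumptions.

def sla_penalty(measured_mbps: float, guaranteed_mbps: float,
                penalty_rate_per_mbps: float) -> float:
    """Penalty owed by the provider for the throughput shortfall, if any."""
    shortfall = max(0.0, guaranteed_mbps - measured_mbps)
    return shortfall * penalty_rate_per_mbps

def slice_profit(revenue: float, cost: float, penalty: float) -> float:
    """Provider profit for one slice over a billing period."""
    return revenue - cost - penalty

penalty = sla_penalty(measured_mbps=80.0, guaranteed_mbps=100.0,
                      penalty_rate_per_mbps=2.0)
print(slice_profit(revenue=500.0, cost=300.0, penalty=penalty))  # 160.0
```

A real slice SLA would add availability and latency clauses with their own penalty schedules, but the balance-of-interests idea is the same: the penalty term is what gives the tenant-side guarantees financial weight.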

    Tutorial on LTE/LTE-A Cellular Network Dimensioning Using Iterative Statistical Analysis

    LTE is the fastest growing cellular technology and is expected to increase its footprint in the coming years, as well as progress toward LTE-A. The race among operators to deliver the expected quality of experience to their users is tight and demands sophisticated network planning skills. Radio network dimensioning (RND) is an essential step in the network planning process and has been used as a fast, but indicative, approximation of the radio site count. RND is a prerequisite to the lengthy process of thorough planning. Moreover, results from RND are used by players in the industry to estimate the preplanning costs of deploying and running a network; thus, RND is also a key tool in cellular business modelling. In this work, we present a tutorial on radio network dimensioning, focused on LTE/LTE-A, using an iterative approach to find a balanced design that mediates among the three design requirements: coverage, capacity, and quality. This approach uses a statistical link budget analysis methodology, which jointly accounts for small- and large-scale fading in the channel, as well as loading due to traffic demand, in the interference calculation. A complete RND manual is thus presented, which is of key importance to operators deploying or upgrading LTE/LTE-A networks for two reasons. First, it is purely analytical, hence it enables fast results, a prime factor in the race undertaken. Second, it captures the essential variables affecting network dimensions and manages conflicting targets to ensure user quality of experience, another major criterion in the competition. The described approach is compared to traditional RND using a commercial LTE network planning tool. The outcome further discourages the use of traditional RND for LTE, due to an unjustified increase in the number of radio sites and the related cost, and motivates further research into more effective and novel RND procedures.
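The iterative idea behind this kind of dimensioning can be illustrated with a small fixed-point loop: cell loading sets an interference margin, the margin shrinks the link-budget cell range, the new range changes the site count, and the site count feeds back into the loading. All constants below (hexagonal cell area ≈ 2.6 R², path-loss exponent 3.5, a linear load-to-margin mapping) are illustrative assumptions, not the tutorial's actual link budget.

```python
import math

# Illustrative sketch of iterative RND. Not the paper's link budget: the
# load-to-margin slope, cell-area constant, and path-loss exponent are all
# assumed values chosen only to show the coverage/capacity/quality coupling.

def dimension_sites(area_km2: float, traffic_mbps: float,
                    site_capacity_mbps: float, base_radius_km: float,
                    margin_db_per_load: float = 3.0, iters: int = 20) -> int:
    load = 0.5                          # initial guess for average cell loading
    sites = 1.0
    for _ in range(iters):
        margin_db = margin_db_per_load * load
        # Range scales as 10^(-margin / (10 * n)) for path-loss exponent n = 3.5.
        radius = base_radius_km * 10 ** (-margin_db / 35.0)
        cell_area = 2.6 * radius ** 2               # hexagonal cell ~ 2.6 R^2
        coverage_sites = area_km2 / cell_area       # sites needed for coverage
        capacity_sites = traffic_mbps / site_capacity_mbps  # ... for capacity
        sites = max(coverage_sites, capacity_sites)
        # New site count changes the per-site load, hence the interference.
        load = min(1.0, traffic_mbps / (sites * site_capacity_mbps))
    return math.ceil(sites)

print(dimension_sites(area_km2=100.0, traffic_mbps=6000.0,
                      site_capacity_mbps=150.0, base_radius_km=1.0))  # 53
```

The loop converges because higher loading shrinks cells, which adds sites, which in turn lowers the per-site loading; the final count is the balanced design mediating coverage, capacity, and quality.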

    5G Wireless Communication Network Architecture and Its Key Enabling Technologies

    Wireless mobile communication systems have developed from the second generation (2G) through to the current fourth generation (4G), transforming from a simple telephony system into a network transporting rich multimedia content, including video conferencing, 3-D gaming, and in-flight broadband connectivity (IFBC) where airline crews use augmented reality headsets to address passengers personally. However, many challenges remain beyond the capabilities of 4G, as the demand for higher data rates, lower latency, and mobility from new wireless applications soars, leading to mixed content-centric communication services. The fifth generation (5G) wireless system has thus been suggested, and research is ongoing for its deployment beyond 2020. In this article, we investigate the various challenges of 4G and propose an indoor/outdoor-segregated cellular architecture with a cloud-based Radio Access Network (C-RAN) for 5G. We review some of the key emerging wireless technologies needed to meet the new demands of users, including massive multiple-input multiple-output (mMIMO) systems, Device-to-Device (D2D) communication, Visible Light Communication (VLC), ultra-dense networks, spatial modulation, and millimeter-wave technology. It is also shown how the benefits of these emerging technologies can be optimized using Software Defined Networking/Network Functions Virtualization (SDN/NFV) as a tool in the C-RAN. We conclude that the new 5G wireless architecture will derive its strength from leveraging the benefits of the emerging hardware technologies, managed by reconfigurable SDN/NFV via the C-RAN. This work will be of immense help to those engaging in further research and to network operators seeking a smooth evolution of current state-of-the-art networks toward 5G networks.

    D13.1 Fundamental issues on energy- and bandwidth-efficient communications and networking

    Deliverable D13.1 of the European project NEWCOM#. The report presents the current status of the research area of energy- and bandwidth-efficient communications and networking and highlights the fundamental issues still open for further investigation. Furthermore, the report presents the Joint Research Activities (JRAs) which will be performed within WP1.3. For each activity, it provides a description, an identification of its adherence to the identified fundamental open issues, a presentation of the initial results, and a roadmap for the planned joint research work on each topic.

    Planning Wireless Cellular Networks of Future: Outlook, Challenges and Opportunities

    Cell planning (CP) is the most important phase in the life cycle of a cellular system, as it determines the operational expenditure, the capital expenditure, and the long-term performance of the system. Therefore, it is not surprising that CP problems have been studied extensively over the past three decades, for all four generations of cellular systems. However, the fact that small cells, a major component of future networks, are anticipated to be deployed in an impromptu fashion makes CP for future networks, vis-à-vis 5G, a conundrum. Furthermore, emerging cellular systems incorporate a variety of cell sizes and types, heterogeneous networks (HetNets), energy efficiency, self-organizing network features, control and data plane split architectures (CDSA), massive multiple-input multiple-output (MIMO), coordinated multipoint (CoMP), cloud radio access networks, and millimetre-wave-based cells, plus the need to support Internet of Things (IoT) and device-to-device (D2D) communication; together, these require a major paradigm shift in the way cellular networks have been planned in the past. The objective of this paper is to characterize this paradigm shift by concisely reviewing past developments, analyzing state-of-the-art challenges, and identifying future trends, challenges, and opportunities in CP in the wake of 5G. More specifically, in this paper, we investigate the problem of planning future cellular networks in detail. To this end, we first provide a brief tutorial on the CP process to identify the peculiarities that make CP one of the most challenging problems in wireless communications. This tutorial is followed by a concise recap of past research in CP. We then review key findings from recent studies that have attempted to address the aforementioned challenges in planning emerging networks.
Finally, we discuss the range of technical factors that need to be taken into account while planning future networks, and the promising research directions that this paradigm shift necessitates.

    Statistical priority-based uplink scheduling for M2M communications

    Currently, the worldwide network is witnessing major efforts to transform it from being the Internet of humans only to becoming the Internet of Things (IoT). It is expected that Machine Type Communication Devices (MTCDs) will overwhelm cellular networks with huge volumes of data collected from their environments and sent to remote MTCDs for processing, forming what is known as Machine-to-Machine (M2M) communications. Long Term Evolution (LTE) and LTE-Advanced (LTE-A) appear to be the best technologies to support M2M communications due to their native IP support. LTE can provide high capacity, flexible radio resource allocation, and scalability, which are the pillars required to support the expected large numbers of deployed MTCDs. Supporting M2M communications over LTE faces many challenges, including medium access control and the allocation of radio resources among MTCDs. The problem of radio resource allocation, or scheduling, originates from the nature of M2M traffic, which consists of a large number of small data packets, with specific deadlines, generated by a potentially massive number of MTCDs. M2M traffic is therefore mostly in the uplink direction, i.e., from the MTCDs to the base station (known as the eNB in LTE terminology). These characteristics impose design requirements on M2M scheduling techniques, such as the need to use scarce radio resources to transmit a huge amount of traffic within certain deadlines. This is the main motivation behind this thesis work. In this thesis, we introduce a novel M2M scheduling scheme that utilizes what we term the “statistical priority” in determining the importance of the information carried by data packets. Statistical priority is calculated from the statistical features of the data, such as value similarity, trend similarity, and auto-correlation.
These calculations are made and then reported by the MTCDs to the serving eNBs, along with other reports such as channel state. Statistical priority is then used to assign priorities to data packets, so that the scarce radio resources are allocated to the MTCDs that are sending statistically important information. This helps avoid spending limited radio resources on carrying redundant or repetitive data, a common situation in M2M communications. To validate our technique, we perform a simulation-based comparison between the main scheduling techniques and our proposed statistical priority-based scheduling technique. This comparison was conducted in a network that includes different types of MTCDs, such as environmental monitoring sensors, surveillance cameras, and alarms. The results show that our proposed statistical priority-based scheduler outperforms the other schedulers in terms of having the lowest loss of alarm data packets and the highest rate of delivering critical data packets that carry non-redundant information, for both environmental monitoring and video traffic. This indicates that the proposed technique is the most efficient in the utilization of limited radio resources compared to the other techniques.
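The notion of statistical priority described above can be illustrated with a toy scoring function. The three features (value similarity, trend similarity, auto-correlation) follow the abstract, but the concrete formulas, the equal weighting, and the sample data below are illustrative assumptions, not the thesis's definitions.

```python
# Toy "statistical priority" score: higher = less redundant data = more
# deserving of scarce uplink resources. Formulas and weights are assumptions.

def lag1_autocorr(xs):
    """Lag-1 autocorrelation; highly autocorrelated windows are predictable."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    if var == 0:
        return 1.0  # a constant window is perfectly predictable
    return sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1)) / var

def statistical_priority(window, previous_window):
    # Value similarity: mean absolute change vs. the previously reported window.
    value_sim = sum(abs(a - b) for a, b in zip(window, previous_window)) / len(window)
    # Trend similarity: fraction of steps whose direction flipped vs. last time.
    trend_diff = sum(
        1 for i in range(1, len(window))
        if (window[i] - window[i - 1]) * (previous_window[i] - previous_window[i - 1]) < 0
    ) / (len(window) - 1)
    # Novelty: low autocorrelation means the window is hard to predict.
    novelty = 1.0 - abs(lag1_autocorr(window))
    return (value_sim + trend_diff + novelty) / 3.0

prev   = [20.0, 20.1, 20.0, 20.1, 20.0]   # previously sent sensor window
steady = [20.0, 20.1, 20.0, 20.1, 20.0]   # e.g. an unchanged temperature
alarm  = [20.0, 25.0, 31.0, 38.0, 45.0]   # sudden rise: an alarm event
print(statistical_priority(alarm, prev) > statistical_priority(steady, prev))  # True
```

An eNB-side scheduler could then sort pending uplink grant requests by this score, which is how redundant environmental readings lose out to alarm traffic in the comparison described above.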

    Intelligent Resource Allocation in 5G Multi-Radio Heterogeneous Networks

    The fast-moving evolution of wireless networks, which started less than three decades ago, has resulted in worldwide connectivity and influenced the development of a global market in all related areas. However, in recent years, growing user traffic demands have led to the saturation of licensed and unlicensed frequency bands in terms of capacity and load over time. On the physical layer, spectrum efficiency is already close to Shannon’s limit; however, traffic demand continues to grow, forcing mobile network operators and equipment manufacturers to evaluate more effective strategies for wireless medium access. One of these strategies, called cell densification, implies a growing number of serving entities, with a corresponding reduction of the per-cell coverage area. However, if implemented blindly, this approach leads to significant growth in the average interference level and in the overhead control signaling required to allow sufficient user mobility. Furthermore, the interference is also affected by the increasing variety of radio access technologies (RATs) and applications, often deployed without the necessary level of cooperation with technologies that are already in place. To overcome these problems, today’s telecommunication standardization groups are trying to collaborate. That is why the recent agenda of the fifth generation of wireless networks (5G) includes not only the development schedules for particular technologies but also the expansion of the appropriate interconnection techniques. In this thesis, we describe and evaluate the concept of heterogeneous networks (HetNets), which involve cooperation between several RATs. In the introductory part, we discuss the set of problems related to HetNets and review the HetNet development process.
Moreover, we show the evolution of existing and potential segments of the multi-RAT 5G network, together with the most promising applications that could be used in future HetNets. Further, in the thesis, we describe a set of key representative scenarios, including a three-tier WiFi-LTE multi-RAT deployment, MTC-enabled LTE, and an mmWave-based network. For each of these scenarios, we define a set of unsolved issues and appropriate solutions. For the WiFi-LTE multi-RAT scenario, we develop a framework enabling intelligent and flexible resource allocation between the involved RATs. For MTC-enabled LTE, we study the effect of massive MTC deployments on the performance of the LTE random access procedure and propose some basic methods to improve its efficiency. Finally, for the mmWave scenario, we study the effects of connectivity strategies, human-body blockage, and antenna array configuration on the overall network performance. Next, we develop a set of validated analytical and simulation-based techniques which allow us to evaluate the performance of the proposed solutions. At the end of the introductory part, a set of HetNet-related demo activities is presented.
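The effect of massive MTC deployments on LTE random access mentioned above can be approximated with a simple contention model: each device picks one of the available preambles uniformly at random, and a preamble succeeds only when exactly one device chose it. This single-shot uniform model, and the preamble count of 54 (a typical LTE contention-based configuration), are illustrative assumptions, not the thesis's analysis.

```python
# Back-of-the-envelope model of LTE RACH contention under massive MTC load.
# Single-shot uniform preamble choice is an assumed simplification.

def expected_successes(n_devices: int, m_preambles: int) -> float:
    """Expected number of devices whose preamble collides with no one else's."""
    p = 1.0 / m_preambles
    # A given device succeeds iff none of the other n-1 devices picked its preamble.
    return n_devices * (1.0 - p) ** (n_devices - 1)

for n in (10, 54, 200):
    print(n, expected_successes(n, 54))
```

The model shows why the procedure degrades under massive deployments: successes peak when the number of contending devices is near the preamble count and then fall off sharply, which motivates the access-barring and grouping style of improvements studied in the thesis.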

    A Comprehensive Survey of the Tactile Internet: State of the art and Research Directions

    The Internet has made several giant leaps over the years, from a fixed to a mobile Internet, then to the Internet of Things, and now to a Tactile Internet. The Tactile Internet goes far beyond data, audio, and video delivery over fixed and mobile networks, and even beyond allowing communication and collaboration among things. It is expected to enable haptic communication and to allow skill-set delivery over networks. Some examples of potential applications are tele-surgery, vehicle fleets, augmented reality, and industrial process automation. Several papers already cover many of the Tactile Internet-related concepts and technologies, such as haptic codecs, applications, and supporting technologies. However, none of them offers a comprehensive survey of the Tactile Internet, including its architectures and algorithms. Furthermore, none of them provides a systematic and critical review of the existing solutions. To address these lacunae, we provide a comprehensive survey of the architectures and algorithms proposed to date for the Tactile Internet. In addition, we critically review them using a well-defined set of requirements and discuss some of the lessons learned as well as the most promising research directions.