
    D2D-Based Grouped Random Access to Mitigate Mobile Access Congestion in 5G Sensor Networks

    The Fifth Generation (5G) wireless service of sensor networks involves significant challenges when coordinating an ever-increasing number of devices accessing shared resources. This has drawn major interest from the research community, and many existing works focus on radio access network congestion control to efficiently manage resources in the context of device-to-device (D2D) interaction in large sensor networks. In this context, this paper pioneers a study of the impact of D2D link reliability in group-assisted random access protocols, shedding light on the performance benefits and potential limitations of approaches of this kind against tunable parameters such as group size, number of sensors, and reliability of the D2D links. Additionally, we leverage association with a Geolocation Database (GDB) to assist the grouping decisions, drawing parallels with recent regulatory-driven initiatives around GDBs and arguing the benefits of the suggested proposal. Finally, an exhaustive simulation campaign shows that the proposed method significantly reduces the delay over random access channels.
    Comment: First submission to IEEE Communications Magazine on Oct. 28, 2017. Accepted on Aug. 18, 2019. This is the camera-ready version.
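    To make the trade-off concrete, the sketch below is a minimal Monte Carlo toy model of group-assisted random access, not the paper's protocol: it assumes group heads aggregate their members' requests over D2D links, a member whose D2D link fails contends on the RACH by itself, and each RACH slot offers a fixed number of preambles where a preamble chosen by exactly one device succeeds. All names and parameter values (n_preambles, group sizes, reliability) are illustrative assumptions.

    ```python
    import random

    def simulate_grouped_access(n_sensors, group_size, d2d_reliability,
                                n_preambles=54, max_attempts=10, seed=0):
        """Rough sketch: mean number of RACH slots until success under
        grouped access (simplified slotted contention, assumed model)."""
        rng = random.Random(seed)
        n_groups = max(1, n_sensors // group_size)
        # Group heads always contend; members contend only if their D2D link fails.
        contenders = n_groups + sum(
            1 for _ in range(n_sensors - n_groups) if rng.random() > d2d_reliability
        )
        delays, pending = [], contenders
        for slot in range(1, max_attempts + 1):
            if pending == 0:
                break
            choices = [rng.randrange(n_preambles) for _ in range(pending)]
            # Preambles picked by exactly one device succeed in this slot.
            successes = sum(1 for p in set(choices) if choices.count(p) == 1)
            delays.extend([slot] * successes)
            pending -= successes
        return sum(delays) / len(delays) if delays else float("inf")

    if __name__ == "__main__":
        for g in (1, 5, 10, 20):
            print(g, simulate_grouped_access(n_sensors=500, group_size=g,
                                             d2d_reliability=0.9))
    ```

    Under these assumptions, larger groups shrink the contending population and hence the access delay, while lower D2D reliability pushes members back onto the RACH and erodes the gain, which is the qualitative trade-off the abstract describes.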

    NOMA based resource allocation and mobility enhancement framework for IoT in next generation cellular networks

    With the unprecedented technological advances witnessed in the last two decades, more devices are connected to the internet, forming what is called the internet of things (IoT). IoT devices with heterogeneous characteristics and quality of experience (QoE) requirements may engage in a dynamic spectrum market due to the scarcity of radio resources. We propose a framework to efficiently quantify and supply radio resources to IoT devices by developing intelligent systems. The primary goal of the paper is to study the characteristics of the next generation of cellular networks with non-orthogonal multiple access (NOMA) to enable connectivity to clustered IoT devices. First, we demonstrate how the distribution and QoE requirements of IoT devices impact the required number of radio resources in real time. Second, we prove that an extended auction algorithm, implemented through a series of complementary functions, enhances radio resource utilization efficiency. The results show a substantial reduction in the number of sub-carriers required compared to conventional orthogonal multiple access (OMA), and the intelligent clustering is scalable and adaptable to the cellular environment. The ability to move borrowed spectrum from one cluster to others when a cluster has fewer users or users move out of its boundary is another feature that contributes to the reported radio resource utilization efficiency. Moreover, the proposed framework provides IoT service providers with cost estimates to control their spectrum acquisition and achieve the required quality of service (QoS) with guaranteed bit rate (GBR) and non-guaranteed bit rate (Non-GBR).
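    The sub-carrier saving claimed above can be illustrated with a back-of-the-envelope sketch. This is an assumed simplification, not the paper's resource model: OMA gives each device its own sub-carrier, while NOMA power-multiplexes up to `noma_order` devices of the same cluster on one sub-carrier; cluster sizes are hypothetical.

    ```python
    import math

    def subcarriers_required(cluster_sizes, noma_order=2):
        """Illustrative comparison of sub-carrier demand: per-device
        sub-carriers under OMA versus power-domain multiplexing of up to
        `noma_order` devices per sub-carrier under NOMA (assumed model)."""
        oma = sum(cluster_sizes)
        noma = sum(math.ceil(n / noma_order) for n in cluster_sizes)
        return oma, noma

    if __name__ == "__main__":
        clusters = [12, 7, 25, 4]  # hypothetical IoT cluster sizes
        oma, noma = subcarriers_required(clusters, noma_order=2)
        print(f"OMA: {oma} sub-carriers, NOMA: {noma} sub-carriers")
    ```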

    A Two-Phase Power Allocation Scheme for CRNs Employing NOMA

    In this paper, we consider the power allocation (PA) problem in cognitive radio networks (CRNs) employing the non-orthogonal multiple access (NOMA) technique. Specifically, we aim to maximize the number of admitted secondary users (SUs) and their throughput without violating the interference tolerance threshold of the primary users (PUs). This problem is divided into a two-phase PA process: a) maximizing the number of admitted SUs; b) maximizing the minimum throughput among the admitted SUs. To address the first phase, we apply a sequential and iterative PA algorithm, which fully exploits the characteristics of the NOMA-based system. Following this, the second phase is shown to be quasiconvex and is optimally solved via the bisection method. Furthermore, we prove the existence of a unique solution for the second phase and propose another PA algorithm, which is also optimal and significantly reduces the complexity compared with the bisection method. Simulation results verify the effectiveness of the proposed two-phase PA scheme.
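    As a minimal sketch of the bisection idea in the second phase, the code below searches for the largest common rate that the admitted SUs can all meet within a total power budget and a PU interference threshold. It deliberately assumes interference-free per-SU channels (ignoring the NOMA intra-cluster interference and SIC ordering the paper models), and all parameter names and values are illustrative; bisection applies because the power required grows monotonically with the target rate.

    ```python
    def min_power_for_rate(rate, gains, noise=1e-9):
        """Power each SU needs to reach `rate` b/s/Hz on an assumed
        interference-free channel: p = (2^rate - 1) * noise / gain."""
        return [(2 ** rate - 1) * noise / g for g in gains]

    def max_min_throughput(gains, p_budget, pu_gain, i_threshold,
                           noise=1e-9, tol=1e-4):
        """Bisection on the common rate t: largest t whose power demand
        respects the total budget and the PU interference tolerance."""
        lo, hi = 0.0, 20.0  # assumed search window in b/s/Hz
        while hi - lo > tol:
            mid = (lo + hi) / 2
            powers = min_power_for_rate(mid, gains, noise)
            feasible = (sum(powers) <= p_budget
                        and sum(powers) * pu_gain <= i_threshold)
            if feasible:
                lo = mid
            else:
                hi = mid
        return lo

    if __name__ == "__main__":
        gains = [2e-7, 8e-8, 5e-8]  # hypothetical SU channel gains
        t = max_min_throughput(gains, p_budget=1.0, pu_gain=1e-7,
                               i_threshold=1e-8, noise=1e-9)
        print(f"max-min throughput ~ {t:.3f} b/s/Hz")
    ```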

    A Taxonomy for Management and Optimization of Multiple Resources in Edge Computing

    Edge computing is promoted to meet the increasing performance needs of data-driven services by using computational and storage resources close to the end devices, at the edge of the current network. To achieve higher performance in this new paradigm, one has to consider how to combine efficient resource usage across all three layers of the architecture: end devices, edge devices, and the cloud. While cloud capacity is elastically extendable, end devices and edge devices are to various degrees resource-constrained. Hence, efficient resource management is essential to make edge computing a reality. In this work, we first present terminology and architectures to characterize current works within the field of edge computing. Then, we review a wide range of recent articles and categorize relevant aspects in terms of four perspectives: resource type, resource management objective, resource location, and resource use. This taxonomy and the ensuing analysis are used to identify gaps in the existing research. Among several research gaps, we found that research is less prevalent on data, storage, and energy as resources, and less extensive towards the estimation, discovery, and sharing objectives. As for resource types, the most well-studied resources are computation and communication resources. Our analysis shows that resource management at the edge requires a deeper understanding of how methods applied at different levels and geared towards different resource types interact. Specifically, the impact of mobility and of collaboration schemes requiring incentives is expected to differ in edge architectures compared to classic cloud solutions. Finally, we find that fewer works are dedicated to the study of non-functional properties or to quantifying the footprint of resource management techniques, including edge-specific means of migrating data and services.
    Comment: Accepted in the Special Issue Mobile Edge Computing of the Wireless Communications and Mobile Computing journal.

    Will 5G See its Blind Side? Evolving 5G for Universal Internet Access

    The Internet has shown itself to be a catalyst for economic growth and social equity, but its potency is thwarted by the fact that it remains off limits for the vast majority of human beings. Mobile phones, the fastest-growing technology in the world and one that now reaches around 80% of humanity, can enable universal Internet access if the coverage problems that have historically plagued previous cellular architectures (2G, 3G, and 4G) can be resolved. These conventional architectures have not been able to sustain universal service provisioning since they depend on having enough users per cell for economic viability and are thus not well suited to rural areas (which are by definition sparsely populated). The new generation of mobile cellular technology (5G), currently in a formative phase and expected to be finalized around 2020, is aimed at orders-of-magnitude performance enhancements. 5G offers a clean slate to network designers and can be molded into an architecture also amenable to universal Internet provisioning. Keeping in mind the great social benefits of democratizing Internet access and connectivity, we believe that the time is ripe for emphasizing universal Internet provisioning as an important goal on the 5G research agenda. In this paper, we investigate the opportunities and challenges in utilizing 5G for global access to the Internet for all (GAIA). We also identify the major technical issues involved in a 5G-based GAIA solution and set up a future research agenda by defining open research problems.

    SDN/NFV-enabled satellite communications networks: opportunities, scenarios and challenges

    In the context of next generation 5G networks, the satellite industry is clearly committed to revisiting and revamping the role of satellite communications. As major drivers in the evolution of (terrestrial) fixed and mobile networks, Software Defined Networking (SDN) and Network Function Virtualisation (NFV) technologies are also being positioned as central technology enablers towards improved and more flexible integration of satellite and terrestrial segments, providing satellite networks with further service innovation and business agility through advanced network resource management techniques. Through the analysis of scenarios and use cases, this paper describes the benefits that SDN/NFV technologies can bring to satellite communications towards 5G. Three scenarios are presented and analysed to delineate different potential improvement areas pursued through the introduction of SDN/NFV technologies in the satellite ground segment domain. Within each scenario, a number of use cases are developed to gain further insight into specific capabilities and to identify the technical challenges stemming from them.