
    Improve Quality of Service for the Internet of Things using Blockchain & Machine Learning Algorithms.

    The quality of service (QoS) parameters of IoT applications play a prominent role in determining application performance. Given the significance and popularity of IoT systems, the number of users and IoT devices can be expected to grow exponentially in the near future. It is therefore important to improve the QoS provided by IoT applications in order to increase their adaptability. The majority of IoT systems are heterogeneous and diverse in nature, which makes it challenging to provide high-quality, uninterrupted connectivity to all connecting devices and, equally, to achieve good QoS parameters. Artificial intelligence-based machine learning (ML) tools are considered a promising means of improving QoS parameters in IoT applications. This research proposes a novel approach for enhancing QoS parameters in IoT using ML and blockchain techniques. An IoT network with blockchain technology is simulated in the NS2 simulator, and QoS parameters such as delay, throughput, packet delivery ratio, and packet drop are analyzed. The obtained QoS values are classified using different ML models, namely Naive Bayes (NB), Decision Tree (DT), and ensemble learning techniques. Results show that the ensemble classifier achieves the highest classification accuracy, 83.74%, compared to the NB and DT classifiers. Open-access publication funded by the Consorcio de Bibliotecas Universitarias de Castilla y León (BUCLE), under Operational Programme 2014ES16RFOP009 FEDER 2014-2020 of Castilla y León, Action 20007-CL - Apoyo Consorcio BUCLE.
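
    To make the classification step concrete, here is a minimal sketch (not the paper's code) comparing NB, DT, and an ensemble classifier on synthetic QoS samples. The feature ranges and the labelling rule are invented for illustration, and a random forest stands in for the unspecified ensemble technique.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score

    # Synthetic QoS samples: delay (ms), throughput (Mbps), packet delivery
    # ratio, and packet drop, mirroring the four parameters analyzed above.
    rng = np.random.default_rng(42)
    n = 1000
    delay = rng.uniform(10, 200, n)
    throughput = rng.uniform(0.1, 5.0, n)
    pdr = rng.uniform(0.5, 1.0, n)
    drop = 1.0 - pdr
    X = np.column_stack([delay, throughput, pdr, drop])
    # Hypothetical labelling rule: QoS is "good" when delay is low and PDR high.
    y = ((delay < 100) & (pdr > 0.8)).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    for name, clf in [("NB", GaussianNB()),
                      ("DT", DecisionTreeClassifier(random_state=0)),
                      ("Ensemble", RandomForestClassifier(n_estimators=100,
                                                          random_state=0))]:
        clf.fit(X_tr, y_tr)
        print(name, accuracy_score(y_te, clf.predict(X_te)))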

    Overlay virtualized wireless sensor networks for application in industrial internet of things: a review

    In recent times, Wireless Sensor Networks (WSNs) have been broadly applied in the Industrial Internet of Things (IIoT) to enhance the productivity and efficiency of existing and prospective manufacturing industries. An area of particular interest concerning the use of WSNs in IIoT is the concept of sensor network virtualization and overlay networks. Both network virtualization and overlay networks are considered contemporary because they provide the capacity to create services and applications at the edge of existing virtual networks without changing the underlying infrastructure. This capability makes both highly beneficial, particularly for the dynamic needs of IIoT-based applications such as smart industry, smart city, and smart home applications. Consequently, the study of WSN virtualization and overlay networks has attracted considerable attention in the literature, leading to the growth and maturity of the research area. In line with this growth, this paper reviews the progress made thus far on virtualized sensor networks, with emphasis on the application of overlay networks in IIoT. Principally, the process of virtualization in WSNs is discussed along with its importance for IIoT applications. Different challenges in WSNs are also presented, along with possible solutions offered by virtualized WSNs. Further details are given on the use of overlay networks as the next step to supporting virtualization in shared sensor networks. The discussion closes with an exposition of the existing challenges in the use of virtualized WSNs for IIoT applications. Because overlay networks will contribute to the future development and advancement of smart industrial and smart city applications, this review may serve researchers as a reference point for the study of this growing field.
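
    As a toy illustration of the virtualization idea discussed above, the following sketch shows one physical sensor node serving two overlay applications without any change to the underlying node; the class, network, and application names are all hypothetical.

    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class PhysicalSensor:
        """One physical node whose readings are shared by several virtual networks."""
        node_id: str
        subscribers: Dict[str, List[Callable[[float], None]]] = field(default_factory=dict)

        def register(self, vsn_id: str, handler: Callable[[float], None]) -> None:
            # A virtual sensor network (VSN) subscribes to this node's readings.
            self.subscribers.setdefault(vsn_id, []).append(handler)

        def publish(self, reading: float) -> None:
            # One physical measurement fans out to every overlay application.
            for handlers in self.subscribers.values():
                for handler in handlers:
                    handler(reading)

    sensor = PhysicalSensor("node-17")
    sensor.register("smart-factory", lambda r: print(f"factory app got {r}"))
    sensor.register("smart-city", lambda r: print(f"city app got {r}"))
    sensor.publish(21.5)  # both overlays are served without touching the node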

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine-Type Communications (mMTC) and Ultra-Reliable Low-Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead, and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the technical issues involved, to review recent advances, to highlight potential solutions, and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms of the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short-data-packet transmission. Next, we provide a detailed overview of the existing and emerging solutions to the RAN congestion problem, and then identify potential advantages, challenges, and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions. Comment: 37 pages, 8 figures, 7 tables, submitted for a possible future publication in IEEE Communications Surveys and Tutorials.
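
    To give a flavour of the kind of low-complexity Q-learning the paper focuses on, here is a hedged sketch of stateless (bandit-style) Q-learning for random-access slot selection in a frame-slotted system: each device learns to pick a slot where it does not collide. The reward scheme and parameters are illustrative, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n_devices, n_slots, n_frames = 20, 20, 500
    alpha, eps = 0.1, 0.1           # learning rate and exploration probability
    Q = np.zeros((n_devices, n_slots))

    for _ in range(n_frames):
        # Epsilon-greedy slot choice for every device in this frame.
        explore = rng.random(n_devices) < eps
        choices = np.where(explore,
                           rng.integers(0, n_slots, n_devices),
                           Q.argmax(axis=1))
        occupancy = np.bincount(choices, minlength=n_slots)
        # A transmission succeeds only if the device is alone in its slot.
        reward = np.where(occupancy[choices] == 1, 1.0, -1.0)
        idx = np.arange(n_devices)
        Q[idx, choices] += alpha * (reward - Q[idx, choices])

    print("final-frame success rate:", np.mean(occupancy[choices] == 1))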

    Resource Allocation in Networking and Computing Systems: A Security and Dependability Perspective

    In recent years, there has been a trend to integrate networking and computing systems, whose management is becoming increasingly complex. Resource allocation is one of the crucial aspects of managing such systems and is affected by this increased complexity. Resource allocation strategies aim to effectively maximize performance, system utilization, and profit by considering virtualization technologies, heterogeneous resources, context awareness, and other features. In such a complex scenario, security and dependability are vital concerns that must be considered in future computing and networking systems in order to support advanced services such as mission-critical applications. This paper provides a comprehensive survey of the existing literature that considers security and dependability in resource allocation for computing and networking systems. Current research works are categorized by the type of resource allocated, across different technologies, scenarios, issues, attributes, and solutions. The paper covers research works on resource allocation that address security and dependability both singularly and jointly, and it discusses future research directions. It shows that only a few works consider security or dependability, even singularly, in resource allocation for future computing and networking systems, and it highlights the importance of considering them jointly and the need for intelligent, adaptive, and robust solutions. This paper aims to help researchers effectively consider security and dependability in future networking and computing systems.
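
    As a simple illustration of jointly weighing security and dependability in an allocation decision, the following sketch uses a greedy placement rule. The node attributes, weights, and scoring function are assumptions made for illustration, not a method from the survey.

    from dataclasses import dataclass

    @dataclass
    class Node:
        name: str
        free_cpu: int
        security: float      # normalized hardening/attestation score (assumed)
        availability: float  # historical uptime fraction (dependability proxy)

    @dataclass
    class Task:
        name: str
        cpu: int
        min_security: float  # the task's security requirement

    def allocate(tasks, nodes, w_sec=0.5, w_dep=0.5):
        placement = {}
        for t in sorted(tasks, key=lambda t: -t.cpu):  # largest tasks first
            feasible = [n for n in nodes
                        if n.free_cpu >= t.cpu and n.security >= t.min_security]
            if not feasible:
                placement[t.name] = None  # reject rather than violate security
                continue
            best = max(feasible,
                       key=lambda n: w_sec * n.security + w_dep * n.availability)
            best.free_cpu -= t.cpu
            placement[t.name] = best.name
        return placement

    nodes = [Node("host-a", 8, 0.9, 0.99), Node("host-b", 16, 0.6, 0.95)]
    tasks = [Task("analytics", 4, 0.8), Task("logging", 2, 0.5)]
    print(allocate(tasks, nodes))  # analytics must land on the hardened host-a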

    Opportunities and Challenges of Joint Edge and Fog Orchestration

    Pushing contents, applications, and network functions closer to end users is necessary to cope with the huge data volumes and low latency required in future 5G networks. Edge and fog frameworks have emerged recently to address this challenge. Whilst the edge framework is more infrastructure-focused and mobile-operator-oriented, the fog is more pervasive and includes any node, stationary or mobile, including terminal devices. This article analyzes the opportunities and challenges of integrating, federating, and jointly orchestrating edge and fog resources into a unified framework. This work has been partially funded by the H2020 collaborative Europe/Taiwan research project 5G-CORAL (grant no. 761586).
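
    A minimal sketch of what a joint placement decision in a federated edge/fog framework might look like, assuming an orchestrator with a unified view of both resource pools. The churn penalty, its weight, and the field names are invented for illustration.

    def place(demand_cpu, latency_budget_ms, pools):
        """Pick a node from the federated edge+fog view; None if nothing fits."""
        candidates = [p for p in pools
                      if p["free_cpu"] >= demand_cpu
                      and p["rtt_ms"] <= latency_budget_ms]
        if not candidates:
            return None
        # Prefer low latency, but penalize volatile fog nodes (e.g. moving
        # terminal devices) that may leave mid-service; 50 is an arbitrary weight.
        return min(candidates, key=lambda p: p["rtt_ms"] + 50 * p["churn"])

    pools = [
        {"kind": "edge", "rtt_ms": 10, "free_cpu": 8, "churn": 0.0},
        {"kind": "fog",  "rtt_ms": 3,  "free_cpu": 2, "churn": 0.2},
    ]
    print(place(2, 20, pools))  # the stable edge node wins despite higher RTT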

    Service Provisioning in Edge-Cloud Continuum: Emerging Applications for Mobile Devices

    Disruptive applications for mobile devices can be enhanced by edge computing facilities. In this context, Edge Computing (EC) is an architecture proposed to meet the mobility requirements imposed by these applications in a wide range of domains, such as the Internet of Things, immersive media, and connected and autonomous vehicles. The EC architecture introduces computing capabilities in the path between the user and the Cloud, executing tasks closer to where they are consumed and thus mitigating issues related to latency, context awareness, and mobility support. In this survey, we describe the leading technologies that support the deployment of EC infrastructure. Thereafter, we discuss the applications that can take advantage of EC and how they have been proposed in the literature. Finally, after examining enabling technologies and related applications, we identify open challenges to fully achieving the potential of EC, as well as research opportunities in upcoming paradigms for service provisioning. This survey is a guide to the recent advances in the provisioning of mobile applications, and it foresees the expected next stages of evolution for these applications.
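
    To illustrate why executing tasks closer to the user mitigates latency, here is a toy first-order offloading model (transfer time plus round trip plus compute time); the formula and all the numbers are illustrative, not measurements from the survey.

    def completion_time_ms(input_mb, cycles_g, uplink_mbps, rtt_ms, node_ghz):
        transfer = input_mb * 8 / uplink_mbps * 1000  # ms to ship the input
        compute = cycles_g / node_ghz * 1000          # ms to execute the task
        return transfer + rtt_ms + compute

    # Same task offloaded to a nearby edge node vs. a faster but distant cloud.
    edge = completion_time_ms(0.5, 0.3, uplink_mbps=100, rtt_ms=5, node_ghz=3.0)
    cloud = completion_time_ms(0.5, 0.3, uplink_mbps=100, rtt_ms=80, node_ghz=6.0)
    print(f"edge: {edge:.0f} ms, cloud: {cloud:.0f} ms")  # edge wins on RTT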

    Block Chain Technology Assisted Privacy Preserving Resource Allocation Scheme for Internet of Things Based Cloud Computing

    Resource scheduling in cloud environments is a complex task, as it involves allocating suitable resources based on Quality of Service (QoS) requirements. Existing resource allocation policies face challenges due to resource dispersion, heterogeneity, and uncertainty. In this research, the authors propose a novel approach called the Quasi-Oppositional Artificial Jellyfish Optimization Algorithm (QO-AJFOA) for resource scheduling in cloud computing (CC) environments. The QO-AJFOA model aims to optimize the allocation of computing power and bandwidth resources in servers, with the goal of maximizing long-term utility. The technique combines quasi-oppositional based learning (QOBL) with the traditional AJFOA. Additionally, a blockchain-assisted smart contract protocol is used to distribute resource allocation, ensuring agreement on wireless channel utilization. Experimental validation of the QO-AJFOA technique, tested with varying numbers of tasks and iterations, demonstrates promising performance compared to recent models. The proposed approach addresses the challenges of resource scheduling in cloud environments and contributes to the existing literature on resource allocation policies.
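
    The jellyfish movement rules are beyond a short sketch, but the quasi-oppositional based learning (QOBL) component can be illustrated compactly: for each initial candidate, a quasi-opposite point lying between the search-space centre and the exact opposite is also evaluated, and the fitter half of the combined pool is kept. This is a generic QOBL sketch under those assumptions, with a stand-in objective, not the paper's implementation.

    import numpy as np

    rng = np.random.default_rng(1)

    def quasi_opposite(x, lo, hi):
        # Quasi-opposite point: uniform between the interval centre and the
        # exact opposite of x within [lo, hi].
        centre = (lo + hi) / 2.0
        opposite = lo + hi - x
        return centre + (opposite - centre) * rng.random(x.shape)

    def qobl_init(pop_size, dim, lo, hi, fitness):
        pop = lo + (hi - lo) * rng.random((pop_size, dim))
        qop = quasi_opposite(pop, lo, hi)
        both = np.vstack([pop, qop])
        scores = np.apply_along_axis(fitness, 1, both)
        return both[np.argsort(scores)[:pop_size]]  # fittest half (minimization)

    sphere = lambda v: float(np.sum(v ** 2))  # stand-in objective function
    pop0 = qobl_init(pop_size=10, dim=4, lo=-5.0, hi=5.0, fitness=sphere)
    print(pop0.shape)  # (10, 4): a better-seeded population for the optimizer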

    The Cloud-to-Thing Continuum

    The Internet of Things offers massive societal and economic opportunities while at the same time posing significant challenges, not least the delivery and management of the technical infrastructure underpinning it, the deluge of data generated by it, ensuring privacy and security, and capturing value from it. This Open Access Pivot explores these challenges, presenting the state of the art and future directions for research, as well as frameworks for making sense of this complex area. The book provides a variety of perspectives on how technology innovations such as fog, edge and dew computing, 5G networks, and distributed intelligence are making us rethink conventional cloud computing to support the Internet of Things. Much of the book focuses on technical aspects of the Internet of Things; however, clear methodologies for mapping the business value of the Internet of Things are still missing, and we provide a value mapping framework for the Internet of Things to address this gap. While there is much hype about the Internet of Things, we have yet to reach the tipping point. As such, this book provides a timely entrée for educators, researchers and students in higher education, as well as for industry and policy makers, on the technologies that promise to reshape how society interacts and operates.

    A survey on software-defined wireless sensor networks: challenges and design requirements

    Software-defined networking (SDN) brings innovation and simplicity to network management and configuration in network computing. Traditional networks often lack the flexibility to effect instant changes because of the rigidity of the network and an over-dependence on proprietary services. SDN decouples the control plane from the data plane, thus moving the control logic from the node to a central controller. A wireless sensor network (WSN) is a great platform for low-rate wireless personal area networks with limited resources and short communication ranges. However, as the scale of a WSN expands, it faces several challenges, such as network management and heterogeneous-node networks. The SDN approach to WSNs seeks to alleviate most of these challenges and ultimately foster efficiency and sustainability in WSNs. The fusion of these two models gives rise to a new paradigm: software-defined wireless sensor networks (SDWSN). The SDWSN model is also envisioned to play a critical role in the emerging Internet of Things paradigm. This paper presents a comprehensive review of the SDWSN literature. Moreover, it delves into some of the challenges facing this paradigm, as well as the major SDWSN design requirements that need to be considered to address them.
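
    A minimal sketch of the control/data-plane split described above: the node only forwards by flow-table lookup, and on a table miss it asks the central controller, which installs a rule. The class names and the toy routing table are hypothetical, not from the paper.

    class Controller:
        """Central control plane: computes routes and installs flow entries."""
        def __init__(self):
            self.routes = {"sink": "node-2"}  # toy routing knowledge

        def packet_in(self, node, dst):
            # Push a forwarding rule down into the node's flow table.
            node.flow_table[dst] = self.routes.get(dst)

    class SensorNode:
        """Data plane only: forwards by flow-table lookup, asks on a miss."""
        def __init__(self, name, controller):
            self.name, self.controller = name, controller
            self.flow_table = {}

        def forward(self, dst):
            if dst not in self.flow_table:            # flow-table miss
                self.controller.packet_in(self, dst)  # control-plane round trip
            return self.flow_table[dst]

    ctrl = Controller()
    node = SensorNode("node-1", ctrl)
    print(node.forward("sink"))  # first packet triggers rule installation
    print(node.forward("sink"))  # later packets match locally, no controller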