334 research outputs found

    Performance Analytical Modelling of Mobile Edge Computing for Mobile Vehicular Applications: A Worst-Case Perspective

    Quantitative performance analysis plays a pivotal role in theoretically investigating the performance of Vehicular Edge Computing (VEC) systems. Although considerable research effort has been devoted to VEC performance analysis, the existing analytical models were all designed to derive the average system performance and pay insufficient attention to worst-case performance analysis, which hinders the practical deployment of VEC systems for mission-critical vehicular applications such as collision avoidance. To bridge this gap, we develop an original performance analytical model based on Stochastic Network Calculus (SNC) to investigate the worst-case end-to-end performance of VEC systems. Specifically, to capture the bursty nature of task generation, an innovative bivariate Markov Chain is first established and rigorously analysed to derive the stochastic task envelope. Then, an effective service curve is created to capture the severe resource competition among vehicular applications. Driven by the stochastic task envelope and effective service curve, a closed-form end-to-end analytical model is derived to obtain the latency bound for VEC systems. Extensive simulation experiments are conducted to validate the accuracy of the proposed analytical model under different system configurations. Furthermore, we exploit the proposed analytical model as a cost-effective tool to investigate resource allocation strategies in VEC systems.
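
    As background for readers unfamiliar with SNC, the sketch below gives the generic delay-bound template that such analyses instantiate. It is a standard textbook-style result, not the paper's specific closed-form VEC bound, whose concrete task envelope and effective service curve come from the bivariate Markov Chain described above.

```latex
% Generic SNC delay bound (standard template; the paper's VEC-specific
% closed-form bound differs in its concrete envelope and service curve).
% Assume the task arrivals admit a stochastic arrival curve \alpha with
% bounding function f, and the server offers a stochastic service curve
% \beta with bounding function g. Then, for all x \ge 0,
\[
  \Pr\{\, D(t) > h(\alpha + x, \beta) \,\} \;\le\; (f \otimes g)(x),
\]
% where h is the maximum horizontal distance between the curves and
% \otimes is the (min,+) convolution of the bounding functions:
\[
  h(\alpha + x, \beta) = \sup_{s \ge 0}\, \inf\{\, \tau \ge 0 : \alpha(s) + x \le \beta(s + \tau) \,\},
  \qquad
  (f \otimes g)(x) = \inf_{0 \le y \le x} \{\, f(x - y) + g(y) \,\}.
\]
```

    Instantiating f and g with the derived task envelope and effective service curve is what a bound of this kind reduces to in closed form.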

    Cybersecurity in Motion: A Survey of Challenges and Requirements for Future Test Facilities of CAVs

    The way we travel is changing rapidly, and Cooperative Intelligent Transportation Systems (C-ITSs) are at the forefront of this evolution. However, the adoption of C-ITSs introduces new risks and challenges, making cybersecurity a top priority for ensuring safety and reliability. Building on this premise, this paper introduces an envisaged Cybersecurity Centre of Excellence (CSCE) designed to bolster the research, testing, and evaluation of the cybersecurity of C-ITSs. We explore the design, functionality, and challenges of the CSCE's testing facilities, outlining the technological, security, and societal requirements. Through a thorough survey and analysis, we assess the effectiveness of these systems in detecting and mitigating potential threats, highlighting their flexibility to adapt to future C-ITSs. Finally, we identify current unresolved challenges in various C-ITS domains, with the aim of motivating further research into the cybersecurity of C-ITSs.

    Modern computing: Vision and challenges

    Over the past six decades, the field of computing systems has undergone significant transformations, profoundly impacting society through developments such as the Internet and the commodification of computing. Underpinned by technological advancements, computer systems, far from being static, have continuously evolved and adapted to cover multifaceted societal niches. This has led to new paradigms such as cloud, fog, and edge computing and the Internet of Things (IoT), which offer fresh economic and creative opportunities. Nevertheless, this rapid change poses complex research challenges, especially in maximizing potential and enhancing functionality. As such, to maintain an economical level of performance that meets ever-tighter requirements, one must understand the drivers behind the emergence and expansion of new models, and how contemporary challenges differ from past ones. To that end, this article investigates and assesses the factors influencing the evolution of computing systems, covering established systems and architectures as well as newer developments such as serverless computing, quantum computing, and on-device AI at the edge. Trends emerge when one traces the technological trajectory, including the rapid obsolescence of frameworks due to business and technical constraints, a move towards specialized systems and models, and varying approaches to centralized and decentralized control. This comprehensive review of modern computing systems looks ahead to the future of research in the field, highlighting key challenges and emerging trends, and underscoring their importance in cost-effectively driving technological progress.

    Adaptive Data-driven Optimization using Transfer Learning for Resilient, Energy-efficient, Resource-aware, and Secure Network Slicing in 5G-Advanced and 6G Wireless Systems

    Title from PDF of title page, viewed January 31, 2023. Dissertation advisor: Cory Beard. Vita. Includes bibliographical references (pages 134-141). Dissertation (Ph.D.)--Department of Computer Science and Electrical Engineering, University of Missouri--Kansas City, 2022.
    5G-Advanced is the next step in the evolution of fifth-generation (5G) technology. It will introduce a new level of expanded capabilities beyond connectivity and enable a broader range of advanced applications and use cases. 5G-Advanced will support modern applications with greater mobility and high dependability, while Artificial Intelligence and Machine Learning will enhance network performance through improvements in spectral efficiency and energy savings. This research established a framework to optimally control and manage the selection of network slices for incoming requests from diverse applications and services in Beyond 5G (B5G) networks. The developed DeepSlice model is used to optimize network and individual slice load efficiency across isolated slices and to manage the slice lifecycle in case of failure. The DeepSlice framework can predict unknown connections by leveraging a trained deep neural network model. The research also addresses threats to the performance, availability, and robustness of B5G networks by proactively preventing and resolving them. The study proposed a Secure5G framework for authentication, authorization, trust, and control in a network slicing architecture for 5G systems. The developed model protects the 5G infrastructure from Distributed Denial of Service attacks by analyzing incoming connections with the trained model, and the research demonstrates preventive measures against volume, flooding, and masking (spoofing) attacks. This work builds the framework towards the zero-trust objective (never trust, always verify, and verify continuously), which improves resilience. Another fundamental difficulty for wireless network systems is providing a desirable user experience under varying network conditions, such as fluctuating network loads and bandwidth. Mobile Network Operators (MNOs) have long battled unforeseen network traffic events. This research proposed ADAPTIVE6G to tackle the network load estimation problem using knowledge-inspired Transfer Learning applied to radio network Key Performance Indicators (KPIs) from network slices. These algorithms enable MNOs to optimally coordinate their computational tasks in stochastic and time-varying network states. Energy efficiency is another significant KPI for tracking the sustainability of network slicing: increasing traffic demands in 5G dramatically increase the energy consumption of mobile networks, which is unsustainable in terms of both dollar cost and environmental impact. This research proposed an innovative ECO6G model to attain sustainability and energy efficiency. Research findings suggest that the developed model can reduce network energy costs without negatively impacting performance or the end-customer experience, compared with classical Machine Learning and statistics-driven models. The proposed model is validated against the industry-standardized energy efficiency definition, and operational expenditure savings are derived, showing significant cost savings for MNOs.
    Contents: Introduction -- A deep neural network framework towards a resilient, efficient, and secure network slicing in Beyond 5G Networks -- Adaptive resource management techniques for network slicing in Beyond 5G networks using transfer learning -- Energy and cost analysis for network slicing deployment in Beyond 5G networks -- Conclusion and future scope.
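
    To make the transfer-learning idea behind ADAPTIVE6G concrete, the minimal sketch below pretrains a small KPI-to-load regressor on a data-rich slice and then fine-tunes only its output head on a data-poor slice. The network architecture, KPI count, and training loop are illustrative assumptions, not the dissertation's actual model.

```python
# Minimal transfer-learning sketch for slice load estimation (illustrative only;
# the ADAPTIVE6G architecture and feature set are not reproduced here).
import torch
import torch.nn as nn

class LoadEstimator(nn.Module):
    """Small MLP mapping a vector of radio KPIs to a normalized load estimate."""
    def __init__(self, n_kpis: int, hidden: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Linear(n_kpis, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        return self.head(self.features(x))

def fine_tune(model: LoadEstimator, x_tgt, y_tgt, epochs: int = 50):
    """Transfer step: freeze the feature extractor learned on the source slice
    and refit only the regression head on the (small) target-slice dataset."""
    for p in model.features.parameters():
        p.requires_grad = False
    opt = torch.optim.Adam(model.head.parameters(), lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        nn.functional.mse_loss(model(x_tgt), y_tgt).backward()
        opt.step()
    return model

if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy stand-ins for windows of slice KPIs (e.g. PRB utilisation, throughput).
    src_x, src_y = torch.randn(512, 8), torch.rand(512, 1)
    tgt_x, tgt_y = torch.randn(64, 8), torch.rand(64, 1)

    model = LoadEstimator(n_kpis=8)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(100):                      # pretrain on the data-rich slice
        opt.zero_grad()
        nn.functional.mse_loss(model(src_x), src_y).backward()
        opt.step()
    fine_tune(model, tgt_x, tgt_y)            # adapt to the data-poor slice
```

    Freezing the shared layers is the simplest transfer strategy; fine-tuning all layers with a small learning rate is a common alternative when more target-slice data is available.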

    Metaverse: A Vision, Architectural Elements, and Future Directions for Scalable and Realtime Virtual Worlds

    With the emergence of Cloud computing, Internet of Things-enabled Human-Computer Interfaces, Generative Artificial Intelligence, and highly accurate Machine and Deep Learning recognition and predictive models, along with the post-COVID-19 proliferation of social networking and remote communications, the Metaverse has gained a lot of popularity. The Metaverse has the potential to extend the physical world using virtual and augmented reality so that users can interact seamlessly with the real and virtual worlds using avatars and holograms. It has the potential to change the way people interact on social media, collaborate in their work, perform marketing and business, teach, learn, and even access personalized healthcare. Several works in the literature examine the Metaverse in terms of hardware wearable devices and virtual reality gaming applications. However, the requirements for realizing the Metaverse in real time and at large scale have yet to be examined for the technology to be usable. To address this limitation, this paper presents the temporal evolution of Metaverse definitions and captures its evolving requirements, providing insights into those requirements. In addition to enabling technologies, we lay out architectural elements for scalable, reliable, and efficient Metaverse systems, provide a classification of existing Metaverse applications, and propose required future research directions.

    Integrating Edge Computing and Software Defined Networking in Internet of Things: A Systematic Review

    The Internet of Things (IoT) has transformed our interaction with the world by connecting devices, sensors, and systems to the Internet, enabling real-time monitoring, control, and automation in applications such as smart cities, healthcare, transportation, homes, and grids. However, challenges related to latency, privacy, and bandwidth have arisen due to the massive influx of data generated by IoT devices and the limitations of traditional cloud-based architectures. Moreover, network management, interoperability, security, and scalability issues have emerged due to the rapid growth and heterogeneous nature of IoT devices. To overcome these problems, researchers proposed a new architecture called Software Defined Networking for Edge Computing in the Internet of Things (SDN-EC-IoT), which combines Edge Computing for the Internet of Things (EC-IoT) and the Software Defined Internet of Things (SDIoT). Although researchers have studied EC-IoT and SDIoT as individual architectures, they have not yet addressed the combination of the two, leaving a significant gap in our understanding of SDN-EC-IoT. This paper aims to fill this gap by presenting a comprehensive review of how the SDN-EC-IoT paradigm can solve IoT challenges. To achieve this goal, this study conducted a literature review covering 74 articles published between 2019 and 2023. Finally, this paper identifies future research directions for SDN-EC-IoT, including the development of interoperability platforms, scalable architectures, low-latency and Quality of Service (QoS) guarantees, efficient handling of big data, enhanced security and privacy, optimized energy consumption, resource-aware task offloading, and the incorporation of machine learning.
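
    As a purely illustrative toy (not drawn from any of the surveyed papers), the sketch below shows the basic SDN-EC-IoT control pattern: a logically centralized controller installs flow rules that steer IoT task traffic to the least-loaded edge node and falls back to the cloud when edge capacity is exhausted. All class names, node names, and numbers are hypothetical.

```python
# Toy illustration of SDN-directed edge offloading for IoT flows.
from dataclasses import dataclass, field

@dataclass
class EdgeNode:
    name: str
    capacity: float            # tasks/s the node can absorb
    load: float = 0.0          # currently offered load

    def headroom(self) -> float:
        return self.capacity - self.load

@dataclass
class SdnController:
    edges: list[EdgeNode]
    flow_table: dict[str, str] = field(default_factory=dict)  # device id -> target

    def place_flow(self, device_id: str, demand: float) -> str:
        """Install a forwarding rule for a device's task flow.

        Picks the edge node with the most headroom; falls back to the cloud
        when no edge node can absorb the demand (latency/capacity trade-off).
        """
        best = max(self.edges, key=lambda e: e.headroom())
        target = best.name if best.headroom() >= demand else "cloud"
        if target != "cloud":
            best.load += demand
        self.flow_table[device_id] = target
        return target

if __name__ == "__main__":
    ctrl = SdnController(edges=[EdgeNode("edge-1", 10.0), EdgeNode("edge-2", 6.0)])
    for dev, demand in [("cam-01", 4.0), ("cam-02", 5.0), ("meter-17", 9.0)]:
        print(dev, "->", ctrl.place_flow(dev, demand))
```

    Running the script keeps the camera flows at the edge and sends meter-17 to the cloud, illustrating the centralized placement decision that a real SDN controller would translate into flow rules on the data plane.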