
    Modern computing: Vision and challenges

    Over the past six decades, the computing systems field has undergone significant transformations, profoundly impacting society through developments such as the Internet and the commodification of computing. Underpinned by technological advancements, computer systems, far from being static, have continuously evolved and adapted to fill multifaceted societal niches. This has led to new paradigms such as cloud, fog, and edge computing, and the Internet of Things (IoT), which offer fresh economic and creative opportunities. Nevertheless, this rapid change poses complex research challenges, especially in maximizing potential and enhancing functionality. To maintain an economical level of performance that meets ever-tighter requirements, one must therefore understand what drives the emergence and expansion of new models, and how contemporary challenges differ from past ones. To that end, this article investigates and assesses the factors influencing the evolution of computing systems, covering established systems and architectures as well as newer developments such as serverless computing, quantum computing, and on-device AI on edge devices. Tracing this technological trajectory reveals several trends: the rapid obsolescence of frameworks due to business and technical constraints, a move towards specialized systems and models, and varying approaches to centralized and decentralized control. This comprehensive review of modern computing systems looks ahead to the future of research in the field, highlighting key challenges and emerging trends, and underscoring their importance in cost-effectively driving technological progress.

    Generative AI-empowered Simulation for Autonomous Driving in Vehicular Mixed Reality Metaverses

    In the vehicular mixed reality (MR) Metaverse, the gap between physical and virtual entities can be bridged by fusing the physical and virtual environments through multi-dimensional communications in autonomous driving systems. Assisted by digital twin (DT) technologies, connected autonomous vehicles (AVs), roadside units (RSUs), and virtual simulators can maintain the vehicular MR Metaverse via digital simulations for sharing data and making driving decisions collaboratively. However, large-scale traffic and driving simulation via realistic data collection and fusion from the physical world, for online prediction and offline training in autonomous driving systems, is difficult and costly. In this paper, we propose an autonomous driving architecture in which generative AI is leveraged to synthesize unlimited conditioned traffic and driving data in simulations, improving driving safety and traffic efficiency. First, we propose a multi-task DT offloading model for the reliable execution of heterogeneous DT tasks with different requirements at RSUs. Then, based on the preferences of AVs' DTs and collected realistic data, virtual simulators can synthesize unlimited conditioned driving and traffic datasets to further improve robustness. Finally, we propose a multi-task enhanced auction-based mechanism to provide fine-grained incentives for RSUs to supply resources for autonomous driving. The property analysis and experimental results demonstrate that the proposed mechanism and architecture are strategy-proof and effective, respectively.
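
    The strategy-proofness property mentioned above can be illustrated with the simplest auction that has it. The sketch below is a plain single-task Vickrey (second-price) reverse auction, not the paper's multi-task enhanced mechanism; the RSU names and costs are hypothetical.

```python
# A minimal sketch only: a single-task Vickrey (second-price) reverse
# auction, the textbook strategy-proof building block. The paper's
# multi-task enhanced auction is more elaborate; RSU names and costs
# here are hypothetical.

def reverse_vickrey(bids: dict[str, float]) -> tuple[str, float]:
    """Pick the cheapest RSU and pay it the second-lowest bid.

    `bids` maps an RSU identifier to its claimed cost for executing a
    digital twin (DT) task. Paying the second-lowest bid makes truthful
    cost reporting a dominant strategy, i.e. the auction is strategy-proof.
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1])
    winner = ranked[0][0]
    payment = ranked[1][1]  # second-lowest claimed cost
    return winner, payment

if __name__ == "__main__":
    bids = {"RSU-A": 3.0, "RSU-B": 2.2, "RSU-C": 4.1}
    winner, payment = reverse_vickrey(bids)
    print(f"{winner} executes the DT task and is paid {payment}")
    # RSU-B wins and is paid 3.0: bidding below its true cost risks a
    # loss-making win, and bidding above risks losing a profitable task.
```

    The paper's mechanism extends this intuition to many heterogeneous DT tasks with fine-grained, multi-task incentives.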

    Multi-objective resource optimization in space-aerial-ground-sea integrated networks

    Space-air-ground-sea integrated (SAGSI) networks are envisioned to connect satellite, aerial, ground, and sea networks to provide connectivity everywhere and all the time in sixth-generation (6G) networks. However, the success of SAGSI networks is constrained by several challenges, including resource optimization when users have diverse requirements and applications. We present a comprehensive review of SAGSI networks from a resource optimization perspective. We discuss use case scenarios and possible applications of SAGSI networks, and the resource optimization discussion considers the challenges associated with them. In our review, we categorize resource optimization techniques by objective: throughput and capacity maximization; delay minimization; energy consumption; task offloading; task scheduling; resource allocation or utilization; network operation cost; outage probability; average age of information; joint optimization (data rate difference, storage or caching, CPU cycle frequency); overall network performance and performance degradation; software-defined networking; and intelligent surveillance and relay communication. We then formulate a mathematical framework for maximizing energy efficiency, resource utilization, and user association. We optimize user association while satisfying the constraints of transmit power, data rate, and user association with priority. A binary decision variable associates users with system resources; since the decision variable is binary and the constraints are linear, the formulated problem is a binary linear programming problem (a minimal illustrative formulation is sketched below). Based on our formulated framework, we simulate and analyze the performance of three different algorithms (branch and bound, the interior point method, and the barrier simplex algorithm) and compare the results. Simulation results show that the branch and bound algorithm performs best, so we adopt it as our benchmark. However, the complexity of branch and bound grows exponentially with the number of users and stations in the SAGSI network, whereas the interior point method and the barrier simplex algorithm achieve results comparable to the benchmark at lower complexity. Finally, we discuss future research directions and challenges of resource optimization in SAGSI networks.
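
    As a concrete illustration of the formulation described above, here is a minimal sketch (not the paper's actual model) of a binary linear program for user association, solved with SciPy's HiGHS-based milp solver, which internally applies branch and bound much like the paper's benchmark. The rate matrix, station capacities, and the exactly-one-association constraint are assumptions for illustration; the full framework also includes transmit-power, data-rate, and priority constraints.

```python
# Illustrative binary linear program: associate users with stations to
# maximize total achievable rate. All numbers below are made up.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# rates[u, s]: achievable rate if user u is served by station s (assumed).
rates = np.array([[3.0, 1.5, 0.5],
                  [1.0, 2.5, 1.0],
                  [0.5, 1.0, 2.0],
                  [2.0, 2.0, 1.0]])
n_users, n_stations = rates.shape
capacity = np.array([2, 1, 2])  # max users per station (assumed)

# Binary decision variable x[u, s], flattened user-major; milp minimizes,
# so negate the rates to maximize total rate.
c = -rates.ravel()

# Constraint 1: each user is associated with exactly one station.
A_user = np.kron(np.eye(n_users), np.ones(n_stations))
# Constraint 2: each station serves at most capacity[s] users.
A_cap = np.kron(np.ones(n_users), np.eye(n_stations))

res = milp(
    c,
    constraints=[LinearConstraint(A_user, 1, 1),
                 LinearConstraint(A_cap, 0, capacity)],
    integrality=np.ones_like(c),  # every variable is integer...
    bounds=Bounds(0, 1),          # ...and restricted to {0, 1}
)

x = np.round(res.x).reshape(n_users, n_stations).astype(int)
print("association matrix:\n", x)
print("total rate:", -res.fun)
```

    Because the worst-case search tree of branch and bound doubles with each added binary variable, its cost grows exponentially in users times stations, which is why the abstract highlights the interior point and barrier simplex methods as lower-complexity alternatives.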

    Integration of hybrid networks, AI, Ultra Massive-MIMO, THz frequency, and FBMC modulation toward 6G requirements: A review

    Fifth-generation (5G) wireless communications have been deployed in many countries with the following features: a peak data rate of 20 Gbps, a latency of 1 ms, reliability of 99.999%, maximum mobility of 500 km/h, a bandwidth of 1 GHz, and a capacity of up to 10⁶ Mbps/m². Nonetheless, the rapid growth of applications such as extended/virtual reality (XR/VR), online gaming, telemedicine, cloud computing, smart cities, the Internet of Everything (IoE), and others demands lower latency, higher data rates, ubiquitous coverage, and better reliability. These stricter requirements are the main problems that have challenged 5G while concurrently encouraging researchers and practitioners to introduce viable solutions. This review paper examines how sixth-generation (6G) technology could overcome the limitations of 5G, meet these higher requirements, and support future applications. The integration of multiple access techniques, terahertz (THz), visible light communications (VLC), ultra-massive multiple-input multiple-output (UM-MIMO), hybrid networks, cell-free massive MIMO, and artificial intelligence (AI)/machine learning (ML) has been proposed for 6G. The main contributions of this paper are a comprehensive review of the 6G vision, key performance indicators (KPIs), and the potential enabling technologies proposed, together with their operating principles. In addition, the paper reviews multiple access and modulation techniques, concentrating on filter-bank multicarrier (FBMC) as a potential technology for 6G; a small sketch of FBMC's spectral advantage follows below. The paper concludes by discussing potential applications, along with challenges and lessons identified from prior studies, to pave the way for future research.
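
    To make FBMC's appeal concrete, the following sketch (an illustration, not taken from the paper) builds the PHYDYAS prototype filter with overlapping factor K = 4, the design commonly used in FBMC studies, and compares its out-of-band spectral level against OFDM's rectangular pulse. The subcarrier count and measurement offset are arbitrary choices.

```python
# Illustrative comparison of out-of-band leakage: FBMC's PHYDYAS
# prototype filter (K = 4) vs OFDM's rectangular pulse. Parameter
# choices (M, nfft, 2.5-subcarrier offset) are assumptions.
import numpy as np

M, K = 64, 4                                        # subcarriers, overlap factor
H = [1.0, 0.97195983, np.sqrt(2) / 2, 0.23514695]   # standard PHYDYAS coefficients

# Time-domain PHYDYAS prototype filter of length K * M.
m = np.arange(K * M)
phydyas = H[0] + 2 * sum((-1) ** k * H[k] * np.cos(2 * np.pi * k * m / (K * M))
                         for k in range(1, K))
rect = np.ones(M)                                   # OFDM's rectangular pulse

def level_db(pulse: np.ndarray, offset: float, nfft: int = 8192) -> float:
    """Normalized spectrum magnitude (dB) at `offset` subcarrier spacings."""
    spec = np.abs(np.fft.fft(pulse, nfft))
    spec /= spec.max()
    return 20 * np.log10(spec[int(offset * nfft / M)] + 1e-12)

for name, pulse in [("OFDM (rect)", rect), ("FBMC (PHYDYAS)", phydyas)]:
    print(f"{name}: {level_db(pulse, 2.5):7.1f} dB at 2.5 subcarriers offset")
```

    With these settings the rectangular pulse sits near -18 dB at that offset while the PHYDYAS filter is tens of dB lower; this low out-of-band leakage is what makes FBMC attractive for dense 6G spectrum use.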