
    Optimization of Beyond 5G Network Slicing for Smart City Applications

    Transitioning from the current fifth-generation (5G) wireless technology, the advent of beyond 5G (B5G) signifies a pivotal stride toward sixth-generation (6G) communication technology. B5G, at its essence, harnesses end-to-end (E2E) network slicing (NS) technology, enabling the simultaneous accommodation of multiple logical networks with distinct performance requirements on a shared physical infrastructure. At the forefront of this implementation lies the critical process of network slice design, a phase central to the realization of efficient smart city networks. This thesis focuses on a key phase of the network slicing life cycle, emphasizing the analysis and formulation of optimal procedures for configuring, customizing, and allocating E2E network slices. The focus extends to catering to the unique demands of smart city applications, encompassing critical areas such as emergency response, smart buildings, and video surveillance. By addressing the intricacies of network slice design, the study navigates the complexities of tailoring slices to meet specific application needs, thereby contributing to the seamless integration of diverse services within the smart city framework. Addressing the core challenge of NS, which involves the allocation of virtual networks on the physical topology with optimal resource allocation, the thesis introduces a dual integer linear programming (ILP) optimization problem, formulated to jointly minimize the embedding cost and latency. However, given the NP-hard nature of this ILP, finding an efficient alternative becomes a significant hurdle. In response, this thesis introduces a novel heuristic approach: the matroid-based modified greedy breadth-first search (MGBFS) algorithm. This algorithm leverages matroid properties to guide virtual network embedding and resource allocation, aiming to provide near-optimal solutions while overcoming the computational complexity of the dual ILP. The proposed MGBFS algorithm not only addresses the connectivity, cost, and latency constraints but also outperforms the benchmark model, delivering solutions remarkably close to optimal. This approach represents a substantial advancement in the optimization of smart city applications, promising heightened connectivity, efficiency, and resource utilization within the evolving landscape of B5G-enabled communication technology.
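
    To make the joint objective concrete, a minimal illustrative formulation is sketched below in LaTeX. The symbols are assumptions introduced only for illustration and do not reproduce the thesis's exact model: x_{v,u} and y_{e,p} are binary variables mapping virtual nodes v of a slice request (V_s, E_s) to physical nodes u in V_p and virtual links e to candidate physical paths p, c_u is a node embedding cost, d_p a path delay, r_v a resource demand, C_u a node capacity, and alpha, beta are weights trading embedding cost against latency.

        \min_{x,\,y} \;\; \alpha \sum_{v \in V_s} \sum_{u \in V_p} c_u \, x_{v,u}
                     \;+\; \beta \sum_{e \in E_s} \sum_{p \in \mathcal{P}} d_p \, y_{e,p}
        \qquad \text{subject to} \qquad
        \sum_{u \in V_p} x_{v,u} = 1 \;\; \forall v \in V_s, \qquad
        \sum_{v \in V_s} r_v \, x_{v,u} \le C_u \;\; \forall u \in V_p, \qquad
        x_{v,u},\, y_{e,p} \in \{0,1\}.

    Further coupling constraints (omitted here) would force each chosen path y_{e,p} to start and end at the physical nodes hosting the endpoints of virtual link e, which is where the connectivity requirements handled by the MGBFS heuristic come into play.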

    Quality of experience and access network traffic management of HTTP adaptive video streaming

    The thesis focuses on the Quality of Experience (QoE) of HTTP adaptive video streaming (HAS) and on traffic management in access networks to improve the QoE of HAS. First, the QoE impact of adaptation parameters and of the time spent per quality layer was investigated with subjective crowdsourcing studies. The results were used to compute a QoE-optimal adaptation strategy for given video and network conditions. This allows video service providers to develop and benchmark improved adaptation logics for HAS. Furthermore, the thesis investigated concepts to monitor video QoE on the application and network layers, which can be used by network providers in the QoE-aware traffic management cycle. Moreover, an analytic and simulative performance evaluation of QoE-aware traffic management on a bottleneck link was conducted. Finally, the thesis investigated socially-aware traffic management for HAS via Wi-Fi offloading of mobile HAS flows. A model for the distribution of public Wi-Fi hotspots and a platform for socially-aware traffic management on private home routers were presented. A simulative performance evaluation investigated the impact of Wi-Fi offloading on the QoE and energy consumption of mobile HAS.
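
    For context, the sketch below (Python) shows a conventional throughput-based adaptation step of the kind such improved adaptation logics are benchmarked against; the function name, the moving-average window, and the safety margin are illustrative assumptions and not the QoE-optimal strategy computed in the thesis.

        def select_bitrate(available_bitrates_kbps, throughput_samples_kbps,
                           safety_margin=0.8, window=5):
            """Pick the highest representation whose bitrate fits under the
            estimated throughput, scaled by a safety margin (illustrative)."""
            recent = throughput_samples_kbps[-window:]
            estimate = sum(recent) / len(recent)      # simple moving average
            budget = safety_margin * estimate
            feasible = [b for b in sorted(available_bitrates_kbps) if b <= budget]
            # fall back to the lowest quality if nothing fits under the budget
            return feasible[-1] if feasible else min(available_bitrates_kbps)

        # Example: a 4-step bitrate ladder and recent throughput around 3 Mbit/s
        print(select_bitrate([500, 1500, 3000, 6000], [2800, 3100, 2900]))  # -> 1500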

    Synthetic Aperture Radar (SAR) Meets Deep Learning

    This reprint focuses on the combination of synthetic aperture radar (SAR) and deep learning technology. It aims to further promote the development of intelligent SAR image interpretation technology. A synthetic aperture radar is an important active microwave imaging sensor, whose all-day and all-weather working capability gives it an important place in the remote sensing community. Since the United States launched the first SAR satellite, SAR has received much attention in the remote sensing community, e.g., in geological exploration, topographic mapping, disaster forecasting, and traffic monitoring. It is therefore valuable and meaningful to study SAR-based remote sensing applications. In recent years, deep learning, represented by convolutional neural networks, has driven significant progress in the computer vision community, e.g., in face recognition, driverless vehicles, and the Internet of Things (IoT). Deep learning enables computational models with multiple processing layers to learn data representations with multiple levels of abstraction, which can greatly improve the performance of various applications. This reprint provides a platform for researchers to address these significant challenges and present their innovative and cutting-edge research results on applying deep learning to SAR in various manuscript types, e.g., articles, letters, reviews, and technical reports.
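
    To make the notion of multiple processing layers concrete, a minimal convolutional classifier for single-channel SAR amplitude patches is sketched below; PyTorch is an assumed dependency, and the patch size (64x64), channel widths, and number of classes are illustrative choices rather than an architecture taken from the reprint.

        import torch
        import torch.nn as nn

        class TinySARNet(nn.Module):
            """Minimal CNN for single-channel 64x64 SAR patches (illustrative)."""
            def __init__(self, num_classes=10):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),   # 64x64 -> 32x32
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),   # 32x32 -> 16x16
                )
                self.classifier = nn.Linear(32 * 16 * 16, num_classes)

            def forward(self, x):
                x = self.features(x)
                return self.classifier(x.flatten(start_dim=1))

        # Example forward pass on a batch of four random patches
        logits = TinySARNet()(torch.randn(4, 1, 64, 64))
        print(logits.shape)  # torch.Size([4, 10])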

    Integrating Edge Computing and Software Defined Networking in Internet of Things: A Systematic Review

    The Internet of Things (IoT) has transformed our interaction with the world by connecting devices, sensors, and systems to the Internet, enabling real-time monitoring, control, and automation in various applications such as smart cities, healthcare, transportation, homes, and grids. However, challenges related to latency, privacy, and bandwidth have arisen due to the massive influx of data generated by IoT devices and the limitations of traditional cloud-based architectures. Moreover, network management, interoperability, security, and scalability issues have emerged due to the rapid growth and heterogeneous nature of IoT devices. To overcome such problems, researchers proposed a new architecture called Software Defined Networking for Edge Computing in the Internet of Things (SDN-EC-IoT), which combines Edge Computing for the Internet of Things (EC-IoT) and the Software Defined Internet of Things (SDIoT). Although researchers have studied EC-IoT and SDIoT as individual architectures, they have not yet addressed the combination of both, creating a significant gap in our understanding of SDN-EC-IoT. This paper aims to fill this gap by presenting a comprehensive review of how the SDN-EC-IoT paradigm can solve IoT challenges. To achieve this goal, this study conducted a literature review covering 74 articles published between 2019 and 2023. Finally, this paper identifies future research directions for SDN-EC-IoT, including the development of interoperability platforms, scalable architectures, low latency and Quality of Service (QoS) guarantees, efficient handling of big data, enhanced security and privacy, optimized energy consumption, resource-aware task offloading, and the incorporation of machine learning.

    Private 5G and its Suitability for Industrial Networking

    5G was and still is surrounded by many promises and buzzwords, such as the famous 1 ms, real-time, and Ultra-Reliable and Low-Latency Communications (URLLC). This was partly intended to attract vertical industries as new customers for mobile networks to be deployed in their factories. With the approval of federal agencies, companies deployed their own private 5G networks to test new use cases enabled by 5G. But what has been missing, apart from all the marketing, is knowledge of what 5G can really do. Private 5G networks are envisioned to enable new use cases with strict latency requirements, such as robot control. This work has examined in great detail the capabilities of the current 5G Release 15 as a private network, and in particular its suitability for time-critical communications. For that, a testbed was designed to measure One-Way Delays (OWDs) and Round-Trip Times (RTTs) with high accuracy. The measurements were conducted in 5G Non-Standalone (NSA) and Standalone (SA) networks and are the first published results. The evaluation revealed results that were not obvious or identified by previous work. For example, a strong impact of the packet rate on the resulting OWD and RTT was found. It was also found that typically 95% of the SA downlink end-to-end packet delays are in the range of 4 ms to 10 ms, indicating a fairly wide spread of packet delays, with the Inter-Packet Delay Variation (IPDV) between consecutive packets distributed in the millisecond range. Surprisingly, the RTT also seems to depend on the direction, i.e., Downlink (DL) or Uplink (UL), from which a round-trip communication was initiated. The Inter-Arrival Time (IAT) of packets also has a strong influence on the RTT distribution. These examples demonstrate the need to critically examine 5G and any successors in terms of their real-time capabilities. In addition to the end-to-end OWD and RTT, the delays caused by 4G and 5G Core processing have been investigated as well. Current state-of-the-art 4G and 5G Core implementations exhibit long-tailed delay distributions. To overcome such limitations, modern packet processing solutions have been evaluated in terms of their respective tail latency. The hardware-based solution was able to process packets with deterministic delay, while the software-based solutions also achieved soft real-time results. These results allow the selection of the right technology for a use case depending on its tail-latency requirements. In summary, many insights into the suitability of 5G for time-critical communications were gained from the study of the current 5G Release 15. The measurement framework, analysis methods, and results will inform the further development and refinement of private 5G campus networks for industrial use cases.
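
    A minimal sketch of the kind of post-processing such delay measurements involve is shown below (Python); the input format of matched send/receive timestamps and the 95th-percentile summary are illustrative assumptions and do not reproduce the measurement framework developed in the thesis.

        import statistics

        def delay_stats(tx_times_s, rx_times_s):
            """One-way delays, inter-packet delay variation and a simple 95th
            percentile from matched send/receive timestamps (illustrative)."""
            owd = [rx - tx for tx, rx in zip(tx_times_s, rx_times_s)]   # per-packet OWD
            ipdv = [b - a for a, b in zip(owd, owd[1:])]                # consecutive delay variation
            owd_sorted = sorted(owd)
            p95 = owd_sorted[int(0.95 * (len(owd_sorted) - 1))]         # crude percentile
            return {
                "owd_mean_ms": 1000 * statistics.mean(owd),
                "owd_p95_ms": 1000 * p95,
                "ipdv_max_ms": 1000 * max(abs(v) for v in ipdv) if ipdv else 0.0,
            }

        # Example: three packets sent 1 ms apart with one-way delays of 5, 7 and 6 ms
        print(delay_stats([0.000, 0.001, 0.002], [0.005, 0.008, 0.008]))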

    Automotive Ethernet architecture and security: challenges and technologies

    Vehicle infrastructure must address the challenges posed by today's advances toward connected and autonomous vehicles. To allow for more flexible architectures, high-bandwidth connections and scalability are needed to connect many sensors and electronic control units (ECUs). At the same time, deterministic, low-latency communication is a critical design requirement to support urgent real-time applications in autonomous vehicles. As a recent solution, Time-Sensitive Networking (TSN) was introduced through Ethernet-based amendments in the IEEE 802.1 TSN standards to meet those needs. However, it still has hurdles to overcome before it can be used effectively. This paper discusses the latest studies concerning automotive Ethernet requirements, including transmission delay studies to improve worst-case end-to-end delay and end-to-end jitter. The paper also focuses on securing Ethernet-based in-vehicle networks (IVNs) by reviewing new encryption and authentication methods and approaches.
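
    As one generic illustration of the authentication approaches such reviews cover, the sketch below (Python) appends a truncated HMAC tag to an IVN payload before transmission; the key handling, tag length, and message layout are assumptions made for illustration and do not reproduce any specific scheme discussed in the paper.

        import hmac, hashlib

        TAG_LEN = 8  # truncated tag length in bytes (illustrative trade-off vs. frame size)

        def protect(payload: bytes, key: bytes) -> bytes:
            """Append a truncated HMAC-SHA256 tag to an in-vehicle message."""
            tag = hmac.new(key, payload, hashlib.sha256).digest()[:TAG_LEN]
            return payload + tag

        def verify(frame: bytes, key: bytes) -> bool:
            """Recompute the tag over the payload and compare in constant time."""
            payload, tag = frame[:-TAG_LEN], frame[-TAG_LEN:]
            expected = hmac.new(key, payload, hashlib.sha256).digest()[:TAG_LEN]
            return hmac.compare_digest(tag, expected)

        key = b"demo-key-not-for-production"
        frame = protect(b"\x01\x02steering-angle=12.5", key)
        print(verify(frame, key))                  # True
        print(verify(b"\xff" + frame[1:], key))    # False (payload tampered)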

    Task-oriented cross-system design for Metaverse in 6G era

    As an emerging concept, the Metaverse has the potential to revolutionize social interaction in the post-pandemic era by establishing a digital world for online education, remote healthcare, immersive business, intelligent transportation, and advanced manufacturing. The goal is ambitious, yet the methodologies and technologies to achieve the full vision of the Metaverse remain unclear. In this thesis, we first introduce the three pillars of infrastructure that lay the foundation of the Metaverse, i.e., Human-Computer Interfaces (HCIs), sensing and communication systems, and network architectures. Then, we depict the roadmap towards the Metaverse that consists of four stages with different applications. As one of the essential building blocks for the Metaverse, we also review the state of the art in Computer Vision for the Metaverse as well as its future scope. To support diverse applications in the Metaverse, we put forward a novel design methodology: task-oriented cross-system design, and further review the potential solutions and future challenges. Specifically, we establish a task-oriented cross-system design for a simple case, where sampling, communications, and prediction modules are jointly optimized for the synchronization of real-world devices and their digital models in the Metaverse. We use domain knowledge to design a deep reinforcement learning (DRL) algorithm to minimize the communication load subject to an average tracking error constraint. We validate our framework on a prototype composed of a real-world robotic arm and its digital model. The results show that our framework achieves a better trade-off between the average tracking error and the average communication load compared to a communication system without sampling and prediction. For example, the average communication load can be reduced to 87% when the average tracking error constraint is 0.002°. In addition, our policy outperforms the benchmark with the static sampling rate and prediction horizon optimized by exhaustive search, in terms of the tail probability of the tracking error. Furthermore, with the assistance of expert knowledge, the proposed algorithm achieves a better convergence time, stability, communication load, and average tracking error. Moreover, we establish a task-oriented cross-system design framework for a general case, where the goal is to minimize the required packet rate for timely and accurate modeling of a real-world robotic arm in the Metaverse. Specifically, different modules including sensing, communications, prediction, control, and rendering are considered. To optimize a scheduling policy and prediction horizons, we design a Constraint Proximal Policy Optimization (CPPO) algorithm by integrating domain knowledge from relevant systems into the advanced reinforcement learning algorithm, Proximal Policy Optimization (PPO). Specifically, the Jacobian matrix for analyzing the motion of the robotic arm is included in the state of the CPPO algorithm, and the Conditional Value-at-Risk (CVaR) of the state-value function characterizing the long-term modeling error is adopted in the constraint. Besides, the policy is represented by a two-branch neural network determining the scheduling policy and the prediction horizons, respectively. To evaluate our algorithm, we build a prototype including a real-world robotic arm and its digital model in the Metaverse.
The experimental results indicate that domain knowledge helps to reduce the convergence time and the required packet rate by up to 50%, and that the cross-system design framework outperforms a baseline framework in terms of the required packet rate and the tail distribution of the modeling error.
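
    To make the CVaR-based constraint concrete, the sketch below (Python) computes an empirical Conditional Value-at-Risk over a batch of sampled modeling errors, i.e., the mean of the worst (1 - alpha) fraction of the samples; the confidence level and the synthetic sample data are illustrative assumptions, whereas the thesis applies CVaR to the state-value function characterizing the long-term modeling error.

        import random

        def empirical_cvar(errors, alpha=0.95):
            """Mean of the worst (1 - alpha) fraction of the error samples (illustrative)."""
            ordered = sorted(errors)
            tail_start = int(alpha * len(ordered))
            tail = ordered[tail_start:] or [ordered[-1]]   # guard against an empty tail
            return sum(tail) / len(tail)

        # Example: 1000 synthetic tracking-error samples (degrees)
        random.seed(0)
        samples = [abs(random.gauss(0.0, 0.001)) for _ in range(1000)]
        print(round(empirical_cvar(samples, alpha=0.95), 5))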

    Modelling, Dimensioning and Optimization of 5G Communication Networks, Resources and Services

    This reprint aims to collect state-of-the-art research contributions that address challenges in the design, dimensioning, and optimization of emerging 5G networks. The design, dimensioning, and optimization of communication network resources and services have been an inseparable part of telecom network development. Such networks must convey a large volume of traffic, providing service to traffic streams with highly differentiated requirements in terms of bit rate, service time, and required quality of service and quality of experience parameters. Such a communication infrastructure presents many important challenges, such as the study of necessary multi-layer cooperation, new protocols, performance evaluation of different network parts, lower-layer network design, network management and security issues, and new technologies in general, which are discussed in this reprint.
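
    As one classic example of the kind of dimensioning calculation such work builds on, the sketch below (Python) computes the Erlang-B blocking probability with the standard recursion; the offered load and channel count in the usage line are illustrative assumptions, not figures taken from the reprint.

        def erlang_b(offered_load_erlang: float, channels: int) -> float:
            """Blocking probability via the numerically stable Erlang-B recursion:
            B(0) = 1,  B(n) = a*B(n-1) / (n + a*B(n-1))."""
            b = 1.0
            for n in range(1, channels + 1):
                b = offered_load_erlang * b / (n + offered_load_erlang * b)
            return b

        # Example: 50 Erlang of offered traffic on 60 channels
        print(f"blocking probability: {erlang_b(50.0, 60):.4f}")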