
    Efficient Machine-type Communication using Multi-metric Context-awareness for Cars used as Mobile Sensors in Upcoming 5G Networks

    Upcoming 5G-based communication networks will be confronted with huge increases in the amount of transmitted sensor data caused by massive deployments of static and mobile Internet of Things (IoT) systems. Cars acting as mobile sensors will become important data sources for cloud-based applications like predictive maintenance and dynamic traffic forecast. Due to the limited communication resources, the growth in Machine-Type Communication (MTC) is expected to cause severe interference with Human-to-Human (H2H) communication, so more efficient transmission methods are urgently required. In this paper, we present a probabilistic scheme for efficient transmission of vehicular sensor data which leverages favorable channel conditions and avoids transmissions when they are expected to be highly resource-consuming. Multiple variants of the proposed scheme are evaluated in comprehensive real-world experiments. Through machine-learning-based combination of multiple context metrics, the proposed scheme achieves up to 164% higher average data rates for sensor applications with soft deadline requirements compared to regular periodic transmission.
    Comment: Best Student Paper Award
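    The core idea lends itself to a compact illustration: defer transmissions while the channel is poor, but let the transmission probability grow as a soft deadline approaches. The Python sketch below is illustrative only; it assumes a single channel-quality metric normalized to [0, 1] and made-up timing parameters (T_MIN, T_MAX, ALPHA), whereas the paper combines multiple context metrics via machine learning.

```python
import random

# Illustrative parameters (not taken from the paper).
T_MIN = 10.0   # minimum interval between transmissions (s)
T_MAX = 120.0  # soft deadline: always transmit after this long (s)
ALPHA = 4.0    # how strongly good channel conditions are preferred

def tx_probability(elapsed: float, channel_quality: float) -> float:
    """Probability of transmitting the buffered sensor data now.

    elapsed         -- seconds since the last transmission
    channel_quality -- normalized context metric in [0, 1], e.g. SINR-based
    """
    if elapsed < T_MIN:
        return 0.0  # suppress overly frequent transmissions
    if elapsed >= T_MAX:
        return 1.0  # soft deadline reached: transmit regardless of channel
    # As the deadline nears, urgency dominates; early on, only a good
    # channel triggers a transmission.
    urgency = (elapsed - T_MIN) / (T_MAX - T_MIN)
    return urgency + (1.0 - urgency) * channel_quality ** ALPHA

def should_transmit(elapsed: float, channel_quality: float) -> bool:
    return random.random() < tx_probability(elapsed, channel_quality)
```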

    Enabling 5G Edge Native Applications


    SimTune: bridging the simulator reality gap for resource management in edge-cloud computing

    Industries and services are undergoing an Internet of Things-centric transformation globally, giving rise to an explosion of multi-modal data generated each second. This, together with the requirement of low-latency result delivery, has led to the ubiquitous adoption of edge and cloud computing paradigms. Edge computing follows the data-gravity principle, wherein computational devices move closer to the end-users to minimize data transfer and communication times. However, large-scale computation has exacerbated the problem of efficient resource management in hybrid edge-cloud platforms. In this regard, data-driven models such as deep neural networks (DNNs) have gained popularity, giving rise to the notion of edge intelligence. However, DNNs face significant problems of data saturation when fed volatile data: providing more data no longer translates into performance improvements. To address this issue, prior work has leveraged coupled simulators that, akin to digital twins, generate out-of-distribution training data, alleviating the data-saturation problem. However, simulators face the reality-gap problem, i.e., inaccuracy in the emulation of real computational infrastructure due to the abstractions such simulators make. To combat this, we develop a framework, SimTune, that tackles this challenge by leveraging a low-fidelity surrogate model of the high-fidelity simulator to update the parameters of the latter, so as to increase the simulation accuracy. This further helps co-simulated methods generalize to edge-cloud configurations for which human-encoded parameters are not known a priori. Experiments comparing SimTune against state-of-the-art data-driven resource management solutions on a real edge-cloud platform demonstrate that simulator tuning can improve quality-of-service metrics such as energy consumption and response time by up to 14.7% and 7.6%, respectively.
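    As a rough illustration of surrogate-assisted simulator tuning, the sketch below fits a cheap Gaussian-process surrogate to the discrepancy between simulator output and real-platform traces, then iteratively proposes parameter settings that the surrogate predicts will shrink that gap. All names and the toy discrepancy function are hypothetical stand-ins; the paper's actual surrogate architecture and parameterization are not reproduced here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def run_simulator(params: np.ndarray) -> float:
    """Placeholder: stands in for an expensive high-fidelity simulation run
    whose output is compared against traces from the real edge-cloud platform.
    Returns a scalar discrepancy (lower is better)."""
    target = np.linspace(0.2, 0.8, params.size)  # toy 'true' parameters
    return float(np.sum((params - target) ** 2))

def tune(bounds: np.ndarray, n_init: int = 10, n_iter: int = 30) -> np.ndarray:
    """Surrogate-assisted search over simulator parameters.

    bounds -- array of shape (dim, 2) with [low, high] per parameter
    """
    dim = bounds.shape[0]
    # Seed the low-fidelity surrogate with random parameter settings.
    X = np.random.uniform(bounds[:, 0], bounds[:, 1], size=(n_init, dim))
    y = np.array([run_simulator(x) for x in X])
    surrogate = GaussianProcessRegressor(normalize_y=True)
    for _ in range(n_iter):
        surrogate.fit(X, y)
        # Cheap surrogate search: score many candidates, keep the best one.
        cand = np.random.uniform(bounds[:, 0], bounds[:, 1], size=(1000, dim))
        best = cand[np.argmin(surrogate.predict(cand))]
        # Verify on the expensive simulator and grow the training set.
        X = np.vstack([X, best])
        y = np.append(y, run_simulator(best))
    return X[np.argmin(y)]

# Example: tune four simulator parameters, each constrained to [0, 1].
best_params = tune(np.array([[0.0, 1.0]] * 4))
```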