
    Anticipatory Mobile Computing: A Survey of the State of the Art and Research Challenges

    Today's mobile phones are far from the mere communication devices they were ten years ago. Equipped with sophisticated sensors and advanced computing hardware, phones can be used to infer users' location, activity, social setting and more. As devices become increasingly intelligent, their capabilities evolve beyond inferring context to predicting it, and then reasoning and acting upon the predicted context. This article provides an overview of the current state of the art in mobile sensing and context prediction, paving the way for full-fledged anticipatory mobile computing. We present a survey of phenomena that mobile phones can infer and predict, and offer a description of machine learning techniques used for such predictions. We then discuss proactive decision making and decision delivery via the user-device feedback loop. Finally, we discuss the challenges and opportunities of anticipatory mobile computing. Comment: 29 pages, 5 figures
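    As a deliberately simplified illustration of the kind of context prediction the survey covers, the sketch below fits a first-order Markov model over a hypothetical sequence of inferred location labels and predicts the most likely next context. The trace, labels, and function names are assumptions for illustration, not data or methods from the survey.

```python
# Minimal sketch: next-context prediction with a first-order Markov model.
# The toy location trace below is hypothetical, not from the survey.
from collections import Counter, defaultdict

def train_transition_model(trace):
    """Count transitions between consecutive context labels."""
    counts = defaultdict(Counter)
    for current, nxt in zip(trace, trace[1:]):
        counts[current][nxt] += 1
    return counts

def predict_next(counts, current):
    """Return the most likely next context, or None if the state is unseen."""
    if current not in counts:
        return None
    return counts[current].most_common(1)[0][0]

if __name__ == "__main__":
    # Hypothetical daily trace of locations inferred from phone sensors.
    trace = ["home", "commute", "office", "commute", "gym", "home",
             "home", "commute", "office", "commute", "home"]
    model = train_transition_model(trace)
    print(predict_next(model, "office"))  # -> "commute"
```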

    AI-Empowered Fog/Edge Resource Management for IoT Applications: A Comprehensive Review, Research Challenges and Future Perspectives


    Dynamic Resource Allocation in Industrial Internet of Things (IIoT) using Machine Learning Approaches

    In today's era of rapidly developing smart equipment and industrial transformation, the application scenarios for Internet of Things (IoT) technology are expanding widely. The combination of IoT and industrial manufacturing systems gives rise to the Industrial IoT (IIoT). However, due to resource limitations in IIoT devices (IIEs), such as computational units and battery capacity, it is crucial to execute computationally intensive tasks efficiently. The dynamic and continuous generation of tasks poses a significant challenge to managing the limited resources in the IIoT environment. This paper proposes a collaborative approach for optimal offloading and resource allocation of highly sensitive industrial IoT tasks. Firstly, the computation-intensive IIoT tasks are transformed into a directed acyclic graph (DAG). Then, task offloading is treated as an optimization problem, taking into account the models of processor resources and energy consumption for the offloading scheme. Lastly, a dynamic resource allocation approach is introduced to allocate computing resources on the edge-cloud server for the execution of computation-intensive tasks. The proposed joint offloading and scheduling (JOS) algorithm builds the DAG and prepares an offloading queue. The queue is managed using collaborative Q-learning-based reinforcement learning, which allocates optimal resources to JOS for executing the tasks in the offloading queue; a machine learning approach is used to predict demand and allocate resources. The paper compares conventional and machine learning-based resource allocation methods. The machine learning approach performs better in terms of response time, delay, and energy consumption. The proposed algorithm shows that energy usage increases with task size, and response time increases with the number of users. Among the algorithms compared, JOS has the lowest waiting time, followed by DQN, while Q-learning performs the worst. Based on these findings, the paper recommends adopting the machine learning approach, specifically the JOS algorithm, for joint offloading and resource allocation.
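    To make the Q-learning-based offloading idea concrete, the sketch below trains a tabular Q-learning policy that chooses between local execution, edge offloading, and cloud offloading. The state (task-size bucket, edge-queue bucket), the delay-plus-energy cost model, and the training loop are illustrative assumptions, not the paper's JOS algorithm.

```python
# Minimal sketch of a Q-learning offloading policy. The cost model, state
# space, and random task generator are assumptions for illustration only.
import random
from collections import defaultdict

ACTIONS = ("local", "edge", "cloud")
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2   # learning rate, discount, exploration

def cost(state, action):
    """Hypothetical weighted delay-plus-energy cost of executing one task."""
    task_size, edge_load = state
    if action == "local":
        return 2.0 * task_size                    # slow CPU, high energy
    if action == "edge":
        return 0.5 * task_size + 0.8 * edge_load  # fast, but queue-sensitive
    return 1.0 * task_size + 1.5                  # cloud: fixed network latency

def choose(q, state):
    """Epsilon-greedy action selection; Q stores reward = negative cost."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q[(state, a)])

def train(episodes=5000):
    q = defaultdict(float)
    for _ in range(episodes):
        state = (random.randint(1, 5), random.randint(0, 5))   # random task
        action = choose(q, state)
        reward = -cost(state, action)
        next_state = (random.randint(1, 5), random.randint(0, 5))
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        q[(state, action)] += ALPHA * (reward + GAMMA * best_next
                                       - q[(state, action)])
    return q

if __name__ == "__main__":
    q = train()
    # For a small task and an idle edge server, edge offloading is usually cheapest.
    print(choose(q, (1, 0)))
```

A deep Q-network (DQN) variant would replace the table with a neural approximator over the same state-action interface, which is how the comparison between JOS, DQN, and plain Q-learning reported in the abstract can be framed.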