491 research outputs found

    Energy-efficient deployment of edge datacenters for mobile clouds in sustainable IoT

    Full text link
    © 2013 IEEE. Achieving quick responses with limited energy consumption in mobile cloud computing is an active area of research. Energy consumption increases when a user's request (task) runs on the local mobile device instead of executing in the cloud, whereas latency becomes an issue when the task executes in the cloud environment instead of on the mobile device. A tradeoff between energy consumption and latency is therefore required to build a sustainable Internet of Things (IoT), and for that we introduce a middle layer, the edge computing layer, to reduce latency in IoT. In several real-time applications, such as smart city and smart health, mobile users either upload their tasks to the cloud or execute them locally. We aim to minimize the energy consumption of the mobile device as well as that of the cloud system, while meeting each task's deadline, by offloading the task to the edge datacenter or the cloud. This paper proposes an adaptive technique that optimizes both parameters, i.e., energy consumption and latency, by offloading the task and by selecting the appropriate virtual machine for its execution. In the proposed technique, if the specified edge datacenter is unable to provide resources, the user's request is sent to the cloud system. Finally, the proposed technique is evaluated in a real-world scenario to measure its performance and efficiency. The simulation results show that the total energy consumption and execution time decrease after introducing edge datacenters as a middle layer.
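    A minimal sketch of the offloading decision the abstract describes, not the paper's actual algorithm: pick the execution site (local device, edge datacenter, or cloud) that meets the task's deadline at the lowest total energy, falling back to the cloud when the edge datacenter cannot provide resources. All class names, fields, and parameters below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Task:
    cycles: float        # CPU cycles the task requires
    data_bits: float     # input data to upload when offloading
    deadline_s: float    # latency bound in seconds

@dataclass
class Site:
    name: str
    cpu_hz: float              # available CPU rate (0 means no free VM)
    tx_rate_bps: float         # device uplink rate to this site (0 for local execution)
    tx_power_w: float          # device transmit power during upload
    energy_per_cycle_j: float  # energy cost per CPU cycle at this site

def evaluate(task, site):
    """Return (latency, energy) for running the task at this site, or None if infeasible."""
    if site.cpu_hz <= 0:
        return None                                   # no resources available here
    upload_s = task.data_bits / site.tx_rate_bps if site.tx_rate_bps else 0.0
    latency = upload_s + task.cycles / site.cpu_hz
    energy = site.tx_power_w * upload_s + site.energy_per_cycle_j * task.cycles
    return (latency, energy) if latency <= task.deadline_s else None

def choose_site(task, local, edge, cloud):
    # Prefer the feasible site with the lowest energy; if the edge datacenter
    # cannot serve the request, the cloud remains as the fallback.
    options = [(s.name, evaluate(task, s)) for s in (local, edge, cloud)]
    feasible = [(name, le[1]) for name, le in options if le is not None]
    return min(feasible, key=lambda x: x[1])[0] if feasible else "reject"
```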

    Edge and Central Cloud Computing: A Perfect Pairing for High Energy Efficiency and Low-latency

    Get PDF
    In this paper, we study the coexistence and synergy between edge and central cloud computing in a heterogeneous cellular network (HetNet), which contains a multi-antenna macro base station (MBS), multiple multi-antenna small base stations (SBSs) and multiple single-antenna user equipment (UEs). The SBSs are empowered by edge clouds offering limited computing services for UEs, whereas the MBS provides high-performance central cloud computing services to UEs via a restricted multiple-input multiple-output (MIMO) backhaul to their associated SBSs. With processing latency constraints at the central and edge networks, we aim to minimize the system energy consumption used for task offloading and computation. The problem is formulated by jointly optimizing the cloud selection, the UEs' transmit powers, the SBSs' receive beamformers, and the SBSs' transmit covariance matrices, which is a mixed-integer and non-convex optimization problem. Based on methods such as the decomposition approach and the successive pseudoconvex approach, a tractable solution is proposed via an iterative algorithm. The simulation results show that our proposed solution can achieve a significant performance gain over conventional schemes using the edge or central cloud alone. Also, with large-scale antennas at the MBS, the massive MIMO backhaul can significantly reduce the complexity of the proposed algorithm and obtain even better performance.
    Comment: Accepted in IEEE Transactions on Wireless Communications
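    A minimal sketch, not the paper's solver: it only illustrates the block-wise iterative structure the abstract alludes to, in which the joint problem is decomposed into subproblems that are updated in turn until the energy objective stops improving. The toy objective and both per-block update rules below are illustrative assumptions, standing in for the actual power, beamforming, and covariance subproblems.

```python
import numpy as np

def system_energy(p, w):
    # Toy surrogate for the system energy as a function of a "power" block p
    # and a "beamformer" block w; not the paper's objective.
    return np.sum(p**2) + np.sum((w - p)**2) + 0.1 * np.sum(np.abs(w))

def update_powers(w):
    # Closed-form minimizer of the toy objective over p with w held fixed.
    return w / 2.0

def update_beamformer(p):
    # Soft-threshold update for w with p held fixed (toy subproblem).
    return np.sign(p) * np.maximum(np.abs(p) - 0.05, 0.0)

def solve(n_ue=4, tol=1e-6, max_iter=200):
    rng = np.random.default_rng(0)
    p, w = rng.random(n_ue), rng.random(n_ue)
    prev = system_energy(p, w)
    for _ in range(max_iter):
        p = update_powers(w)             # block 1: transmit powers
        w = update_beamformer(p)         # block 2: receive beamformers
        cur = system_energy(p, w)
        if prev - cur < tol:             # stop once the objective plateaus
            break
        prev = cur
    return p, w, cur

if __name__ == "__main__":
    p, w, energy = solve()
    print("final toy system energy:", float(energy))
```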

    A survey on intelligent computation offloading and pricing strategy in UAV-Enabled MEC network: Challenges and research directions

    Get PDF
    The resource constraints of edge servers make it difficult to serve a large number of Mobile Devices' (MDs) requests simultaneously. The Mobile Network Operator (MNO) must therefore decide how to delegate MD requests to its Mobile Edge Computing (MEC) server in order to maximize the overall benefit of admitted requests with varying latency needs. Unmanned Aerial Vehicles (UAVs) and Artificial Intelligence (AI) can improve MNO performance thanks to the UAVs' flexible deployment and high mobility and the efficiency of AI algorithms. There is a trade-off between the cost incurred by the MD and the profit received by the MNO. Intelligent computation offloading to UAV-enabled MEC, on the other hand, is a promising way to bridge the gap between MDs' limited processing resources and the high computing demands of upcoming applications. This study reviews research on the benefits of the computation offloading process in the UAV-MEC network, as well as the intelligent models that are utilized for computation offloading in such networks. In addition, this article examines several intelligent pricing techniques under different structures in the UAV-MEC network. Finally, this work highlights important open research issues and future research directions for AI in computation offloading and in applying intelligent pricing strategies in the UAV-MEC network.
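    A minimal, hypothetical sketch of the admission problem the survey frames: a capacity-limited MEC server admits MD requests so as to maximize the MNO's profit, here with a simple greedy profit-per-resource heuristic rather than any scheme from the surveyed literature. The Request fields, prices, and capacity values are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Request:
    md_id: int
    cpu_demand: float     # MEC resources the task needs
    price_paid: float     # what the MD pays the MNO
    serving_cost: float   # MNO's cost (energy, backhaul) to serve the request

def admit_requests(requests, capacity):
    """Greedy admission: sort by profit per unit of demanded resource and
    admit while capacity remains. A heuristic, not an optimal knapsack solver."""
    profit = lambda r: r.price_paid - r.serving_cost
    admitted, used = [], 0.0
    for r in sorted(requests, key=lambda r: profit(r) / r.cpu_demand, reverse=True):
        if profit(r) > 0 and used + r.cpu_demand <= capacity:
            admitted.append(r.md_id)
            used += r.cpu_demand
    total_profit = sum(profit(r) for r in requests if r.md_id in admitted)
    return admitted, total_profit

# Example: three requests competing for 10 units of MEC capacity.
reqs = [Request(1, 4, 6.0, 2.0), Request(2, 5, 5.0, 1.0), Request(3, 8, 9.0, 6.0)]
print(admit_requests(reqs, capacity=10))
```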

    Proactive content caching in future generation communication networks: Energy and security considerations

    Get PDF
    The proliferation of hand-held devices and Internet of Things (IoT) applications has heightened demand for popular content downloads. A high volume of content streaming/downloading services during peak hours can cause network congestion. Proactive content caching has emerged as a prospective solution to this congestion problem. In proactive content caching, data storage units are used to store popular content in helper nodes at the network edge. This contributes to a reduction of peak traffic load and network congestion. However, data storage units require additional energy, which poses a challenge to researchers who aim to reduce energy consumption by up to 90% in next-generation networks. This thesis presents proactive content caching techniques that reduce grid energy consumption by utilizing renewable energy sources to power the data storage units in helper nodes. The integration of renewable energy sources with proactive caching is a significant challenge due to the intermittent nature of renewable energy sources and the associated investment costs. In this thesis, this challenge is tackled by introducing strategies to determine the optimal time of day for content caching and the optimal scheduling of caching nodes. The proposed strategies consider not only the availability of renewable energy but also temporal changes in network traffic to reduce the associated energy costs. While proactive caching can facilitate the reduction of peak traffic load and the integration of renewable energy, content objects cached at helper nodes are often more vulnerable to malicious attacks due to the less stringent security at edge nodes. Potential content leakage can lead to catastrophic consequences, particularly for cache-equipped Industrial Internet of Things (IIoT) applications. In this thesis, the concept of trusted caching nodes (TCNs) is introduced. TCNs cache popular content objects and provide security services to connected links. The proposed study optimally allocates TCNs and selects the most suitable content forwarding paths. Furthermore, a caching strategy is designed for mobile edge computing systems to support IoT task offloading. The strategy optimally assigns security resources to offloaded tasks while satisfying their individual requirements. However, security measures often introduce overheads in terms of both energy consumption and delay. Consequently, in this thesis, caching techniques are designed to investigate the trade-off between energy consumption and probable security breaches. Overall, this thesis contributes to the current literature by simultaneously investigating the energy and security aspects of caching systems whilst introducing solutions to the relevant research problems.
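    A minimal sketch, under assumed hourly profiles, of the scheduling idea the thesis describes: choose the time of day to prefetch popular content so that the caching energy is drawn from renewable generation where possible and the transfer avoids peak traffic hours. The function, its parameters, and the example profiles are illustrative assumptions, not the thesis's optimization model.

```python
def best_caching_hour(renewable_kw, traffic_load, cache_power_kw=2.0, deadline_hour=18):
    """Score each hour before the content is needed; lower is better.

    renewable_kw : 24 hourly renewable generation values (kW)
    traffic_load : 24 hourly network load values in [0, 1]
    Grid energy used = shortfall between caching power and renewable supply;
    the traffic term penalizes prefetching during busy hours.
    """
    best_hour, best_score = None, float("inf")
    for hour in range(deadline_hour):                    # must finish before the evening peak
        grid_draw = max(0.0, cache_power_kw - renewable_kw[hour])
        score = grid_draw + traffic_load[hour]           # equal weighting of both objectives
        if score < best_score:
            best_hour, best_score = hour, score
    return best_hour

# Example: solar-like generation peaking at midday, traffic peaking in the evening.
renewable = [0]*6 + [0.5, 1.0, 1.8, 2.4, 2.8, 3.0, 2.9, 2.5, 1.8, 1.0, 0.4, 0] + [0]*6
traffic   = [0.2]*7 + [0.5, 0.6, 0.6, 0.5, 0.5, 0.5, 0.5, 0.5, 0.6, 0.7, 0.8] + [0.9]*6
print(best_caching_hour(renewable, traffic))
```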
    • …