443 research outputs found

    Optimal Resource Allocation in Ultra-low Power Fog-computing SWIPT-based Networks

    In this paper, we consider a fog computing system consisting of a multi-antenna access point (AP), an ultra-low power (ULP) single-antenna device and a fog server. The ULP device is assumed to be capable of both energy harvesting (EH) and information decoding (ID) using a time-switching simultaneous wireless information and power transfer (SWIPT) scheme. The ULP device uses the harvested energy for ID and for either local computing or offloading the computations to the fog server, depending on which strategy is more energy efficient. In this scenario, we optimize the time slots devoted to EH, ID and local computation, as well as the time slot and power required for offloading, so as to minimize the energy cost of the ULP device. Numerical results are provided to study the effectiveness of the optimized fog computing system and the relevant challenges.
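
    As a rough illustration only (the notation and constraints below are assumptions for exposition, not the paper's exact formulation), a time-switching SWIPT offloading design of this kind typically reduces to an energy-minimisation problem over the slot durations and the offloading power:

    \min_{\tau_E,\,\tau_I,\,\tau_L,\,\tau_O,\,p_O} \; E_{ID}(\tau_I) + E_{loc}(\tau_L) + \tau_O\, p_O
    \text{s.t.} \quad E_{ID}(\tau_I) + E_{loc}(\tau_L) + \tau_O\, p_O \;\le\; \eta\, P_{AP}\, \|\mathbf{h}\|^2\, \tau_E \quad \text{(energy causality)},
    \qquad\;\; \tau_E + \tau_I + \tau_L + \tau_O \le T, \qquad \tau_E,\,\tau_I,\,\tau_L,\,\tau_O,\,p_O \ge 0,

    where \tau_E, \tau_I, \tau_L, \tau_O are the EH, ID, local-computation and offloading slots, p_O is the offloading power, \eta the EH conversion efficiency, P_{AP} the AP transmit power, \mathbf{h} the AP-device channel and T the frame length; a workload-completion (deadline) constraint would decide whether local execution or offloading is used. The paper's actual objective and constraint set may differ.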

    Integrating mobile and cloud resources management using the cloud personal assistant

    The mobile cloud computing model promises to address the resource limitations of mobile devices, but implementing this model effectively is difficult. Previous work on mobile cloud computing has required the user to have a continuous, high-quality connection to the cloud infrastructure. This is undesirable and possibly infeasible: the energy required on the mobile device to maintain a connection and transfer sizeable amounts of data is large, and the bandwidth tends to be variable and low on cellular networks. The cloud deployment itself also needs to allocate scalable resources to the user efficiently. In this paper, we formulate best practices for efficiently managing the resources required by the mobile cloud model, namely energy, bandwidth and cloud computing resources. These practices can be realised with our mobile cloud middleware project, featuring the Cloud Personal Assistant (CPA). We compare this with other approaches in the area to highlight the importance of minimising the usage of these resources and thereby ensuring successful adoption of the model by end users. Based on results from experiments performed with mobile devices, we develop a no-overhead decision model for task and data offloading to a user's CPA, which provides efficient management of mobile cloud resources.
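
    At its simplest, an offload decision of this kind compares the device energy spent computing a task locally against the energy spent shipping the task's input data to the cloud. The sketch below illustrates only that generic rule; the function name, parameters and example figures are assumptions, and it is not the CPA decision model developed in the paper.

    def should_offload(cycles, data_bytes, f_local_hz, p_compute_w,
                       bandwidth_bps, p_transmit_w):
        """Generic energy-based offload rule (illustrative only):
        offload when uploading the task's input data costs the device
        less energy than executing the task locally."""
        e_local = p_compute_w * (cycles / f_local_hz)                  # J spent computing on the device
        e_offload = p_transmit_w * (data_bytes * 8 / bandwidth_bps)    # J spent transmitting the input data
        return e_offload < e_local, e_local, e_offload

    # Example: 2e9 CPU cycles, 500 kB of input, 1 GHz CPU at 0.8 W, 5 Mbit/s uplink at 1.2 W
    offload, e_loc, e_off = should_offload(2e9, 500_000, 1e9, 0.8, 5e6, 1.2)
    print(f"offload={offload}  local={e_loc:.2f} J  transfer={e_off:.2f} J")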

    Offload decision models and the price of anarchy in mobile cloud application ecosystems

    With the maturity of technologies such as HTML5 and JavaScript, and with the increasing popularity of cross-platform frameworks such as Apache Cordova, mobile cloud computing as a new design paradigm for mobile application development is becoming increasingly accessible to developers. Following this trend, future on-device mobile application ecosystems will comprise not only a mixture of native and remote applications but also multiple hybrid mobile cloud applications. The resource competition in such ecosystems, and its impact on the performance of mobile cloud applications, has not yet been studied. In this paper, we study this competition from a game-theoretical perspective and examine how it affects the behavior of mobile cloud applications. Three offload decision models, of cooperative and non-cooperative nature, are constructed and compared in terms of efficiency. We present an extension to the classic load balancing game to model offload behavior within a non-cooperative environment. Mixed-strategy Nash equilibria are derived for the non-cooperative offload game with complete information, which further allows us to quantify the price of anarchy in such ecosystems. We present simulation results that demonstrate the differences in efficiency between the decision models. Our modeling approach facilitates further research in the design of the offload decision engines of mobile cloud applications, and our extension to the classic load balancing game broadens its applicability to real-life applications.
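
    For context, in the classic load balancing game that the paper extends, each task independently picks one of several identical servers, a task's cost is the load of its chosen server, and in the symmetric mixed-strategy Nash equilibrium every task randomises uniformly. The sketch below estimates by simulation the ratio of the expected equilibrium makespan to the optimal makespan, a simple proxy for the price of anarchy of that baseline game; it is a generic illustration under those assumptions, not the authors' extended model.

    import random
    from math import ceil

    def expected_makespan_uniform(n_tasks, m_servers, trials=20000, seed=0):
        """Monte Carlo estimate of the expected makespan (maximum server load)
        when every unit-size task picks a server uniformly at random -- the
        symmetric mixed-strategy Nash equilibrium of the classic load
        balancing game with identical tasks and identical servers."""
        rng = random.Random(seed)
        total = 0
        for _ in range(trials):
            loads = [0] * m_servers
            for _ in range(n_tasks):
                loads[rng.randrange(m_servers)] += 1
            total += max(loads)
        return total / trials

    n, m = 12, 4
    eq_cost = expected_makespan_uniform(n, m)
    opt_cost = ceil(n / m)   # a cooperative (optimal) assignment simply balances the load
    print(f"equilibrium makespan ~ {eq_cost:.2f}, optimum = {opt_cost}, "
          f"anarchy ratio ~ {eq_cost / opt_cost:.2f}")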