
    Fog Computing and Networking: Part 2

    The articles in this special section focus on the deployment of fog computing in communication networks.

    When Mobile Blockchain Meets Edge Computing

    Blockchain, as the backbone technology of the currently popular Bitcoin digital currency, has become a promising decentralized data management framework. Although blockchain has been widely adopted in many applications, e.g., finance, healthcare, and logistics, its application in mobile services is still limited. This is due to the fact that blockchain users need to solve preset proof-of-work puzzles to add new data, i.e., a block, to the blockchain. Solving the proof-of-work, however, consumes substantial resources in terms of CPU time and energy, which is not suitable for resource-limited mobile devices. To facilitate blockchain applications in future mobile Internet of Things systems, multiple access mobile edge computing appears to be an auspicious solution to solve the proof-of-work puzzles for mobile users. We first introduce a novel concept of edge computing for mobile blockchain. Then, we introduce an economic approach for edge computing resource management. Moreover, a prototype of a mobile edge computing enabled blockchain system is presented with experimental results to justify the proposed concept. Comment: Accepted by IEEE Communications Magazine.
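
    The bottleneck named in this abstract, the proof-of-work puzzle, can be illustrated with a minimal sketch. The code below is a hypothetical hash-based puzzle (SHA-256 against a leading-zero-bits target), not the paper's protocol or prototype; it only shows why the nonce search is too costly for resource-limited mobile devices and is therefore a candidate for offloading to edge servers.

        import hashlib

        def solve_pow(block_data: bytes, difficulty_bits: int = 20) -> int:
            """Brute-force a nonce so that SHA-256(block_data || nonce) falls below a
            target with `difficulty_bits` leading zero bits -- the CPU/energy cost
            that the paper proposes to offload from mobile devices to the edge."""
            target = 1 << (256 - difficulty_bits)
            nonce = 0
            while True:
                digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
                if int.from_bytes(digest, "big") < target:
                    return nonce
                nonce += 1

        # The expected search cost grows roughly as 2**difficulty_bits hash evaluations.
        print("found nonce:", solve_pow(b"example-block", difficulty_bits=16))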

    Cloud/fog computing resource management and pricing for blockchain networks

    The mining process in blockchain requires solving a proof-of-work puzzle, which is resource-expensive to implement on mobile devices due to the high computing power and energy needed. In this paper, we, for the first time, consider edge computing as an enabler for mobile blockchain. In particular, we study edge computing resource management and pricing to support mobile blockchain applications in which the mining process of miners can be offloaded to an edge computing service provider. We formulate a two-stage Stackelberg game to jointly maximize the profit of the edge computing service provider and the individual utilities of the miners. In the first stage, the service provider sets the price of edge computing nodes. In the second stage, the miners decide on the service demand to purchase based on the observed prices. We apply backward induction to analyze the subgame perfect equilibrium in each stage for both uniform and discriminatory pricing schemes. For uniform pricing, where the same price is applied to all miners, the existence and uniqueness of the Stackelberg equilibrium are validated by identifying the best response strategies of the miners. For discriminatory pricing, where different prices are applied to different miners, the Stackelberg equilibrium is proved to exist and to be unique by capitalizing on Variational Inequality theory. Further, real experimental results are employed to justify our proposed model. Comment: 16 pages, double-column version, accepted by IEEE Internet of Things Journal.
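
    As a hedged illustration of the two-stage structure described above, the sketch below works through backward induction for uniform pricing with made-up logarithmic miner utilities and a linear provider cost; the actual utility and hash-power models are defined in the paper, so the functions and parameters here are assumptions for demonstration only.

        import numpy as np

        a = np.array([4.0, 6.0, 8.0])   # assumed miner valuation coefficients
        c = 0.5                         # assumed provider unit cost of edge service

        def miner_demand(p):
            # Stage II: each miner maximises a_i*log(1+d_i) - p*d_i,
            # giving the best response d_i = max(0, a_i/p - 1).
            return np.maximum(0.0, a / p - 1.0)

        def provider_profit(p):
            # Stage I: the provider anticipates the miners' best responses
            # (backward induction) and prices to maximise (p - c) * total demand.
            return (p - c) * miner_demand(p).sum()

        prices = np.linspace(0.6, 8.0, 400)
        p_star = prices[np.argmax([provider_profit(p) for p in prices])]
        print("uniform price:", round(p_star, 3), "demands:", miner_demand(p_star).round(3))

    Discriminatory pricing would replace the single price p with a per-miner price vector, which is where the paper's Variational Inequality argument comes in.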

    ROUTER: Fog Enabled Cloud based Intelligent Resource Management Approach for Smart Home IoT Devices

    There is a growing requirement for Internet of Things (IoT) infrastructure to ensure low response time to provision latency-sensitive real-time applications such as health monitoring, disaster management, and smart homes. Fog computing offers a means to meet such requirements, via a virtualized intermediate layer that provides data, computation, storage, and networking services between Cloud datacenters and end users. A key element within such Fog computing environments is resource management. While there are existing resource managers in Fog computing, they focus on only a subset of the parameters important to Fog resource management, encompassing system response time, network bandwidth, energy consumption and latency. To date, no existing Fog resource manager considers these parameters simultaneously for decision making, which in the context of smart homes will become increasingly key. In this paper, we propose a novel resource management technique (ROUTER) for fog-enabled Cloud computing environments, which leverages Particle Swarm Optimization to optimize these parameters simultaneously. The approach is validated within an IoT-based smart home automation scenario, and evaluated within the iFogSim toolkit driven by empirical models from a small-scale smart home experiment. Results demonstrate that our approach achieves a reduction of 12% in network bandwidth, 10% in response time, 14% in latency and 12.35% in energy consumption.
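
    Since this abstract hinges on Particle Swarm Optimization over several parameters at once, a minimal PSO sketch follows. The objective function is a placeholder standing in for ROUTER's combination of response time, bandwidth, latency and energy models (which the paper obtains from iFogSim and the smart-home experiment); only the swarm mechanics are meant to be representative.

        import numpy as np

        rng = np.random.default_rng(0)

        def cost(x):
            # Placeholder objective over a resource-allocation vector x; in ROUTER
            # this would be built from simulated response time, bandwidth,
            # latency and energy terms.
            return np.sum((x - 0.3) ** 2) + 0.1 * np.abs(x).sum()

        def pso(dim=4, particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
            pos = rng.uniform(0, 1, (particles, dim))
            vel = np.zeros_like(pos)
            pbest, pbest_val = pos.copy(), np.apply_along_axis(cost, 1, pos)
            gbest = pbest[pbest_val.argmin()]
            for _ in range(iters):
                r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
                vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
                pos = np.clip(pos + vel, 0.0, 1.0)
                vals = np.apply_along_axis(cost, 1, pos)
                improved = vals < pbest_val
                pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
                gbest = pbest[pbest_val.argmin()]
            return gbest, pbest_val.min()

        best_alloc, best_cost = pso()
        print("best allocation:", best_alloc.round(3), "cost:", round(best_cost, 4))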

    epcAware: a game-based, energy, performance and cost efficient resource management technique for multi-access edge computing

    The Internet of Things (IoT) is producing an extraordinary volume of data daily, and it is possible that the data may become useless while on its way to the cloud for analysis, due to longer distances and delays. Fog/edge computing is a new model for analyzing and acting on time-sensitive data (real-time applications) at the network edge, adjacent to where it is produced. The model sends only selected data to the cloud for analysis and long-term storage. Furthermore, cloud services provided by large companies such as Google can also be localized to minimize the response time and increase service agility. This can be accomplished by deploying small-scale datacenters (referred to as cloudlets) where essential, closer to customers (IoT devices) and connected to a centralised cloud through networks, which together form a multi-access edge cloud (MEC). The MEC setup involves three different parties, i.e. service providers (IaaS), application providers (SaaS), and network providers (NaaS), which might have different goals, making resource management a difficult job. In the literature, various resource management techniques have been suggested in the context of what kind of services they should host and how the available resources should be allocated to customers' applications, particularly if mobility is involved. However, the existing literature considers the resource management problem with respect to a single party. In this paper, we consider resource management with respect to all three parties, i.e. IaaS, SaaS, and NaaS, and suggest a game-theoretic resource management technique that minimises infrastructure energy consumption and costs while ensuring application performance. Our empirical evaluation, using real workload traces from Google's cluster, suggests that our approach could reduce energy consumption by up to 11.95% and user costs by approximately 17.86%, with negligible loss in performance. Moreover, the IaaS can reduce its energy bills by up to 20.27% and the NaaS can increase its cost savings by up to 18.52% as compared to other methods.
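
    To make the three-party setting concrete, here is a small best-response iteration between toy IaaS, SaaS and NaaS players. The quadratic cost functions and the coupling between decisions are invented for illustration; epcAware's actual utilities are built from energy, performance and network-cost models driven by the Google cluster traces, so this only sketches the game-theoretic mechanics, not the proposed technique itself.

        import numpy as np

        # Toy stand-ins for the three parties' objectives (energy, performance, network cost).
        def cost_iaas(x, y, z): return (x - 0.5 * y) ** 2 + 0.1 * x
        def cost_saas(x, y, z): return (y - 0.5 * (x + z)) ** 2 + 0.1 * y
        def cost_naas(x, y, z): return (z - 0.5 * y) ** 2 + 0.1 * z

        GRID = np.linspace(0.0, 1.0, 101)

        def best_response(f, args_for):
            # Each party minimises its own cost given the others' current decisions.
            return GRID[np.argmin([f(*args_for(g)) for g in GRID])]

        x = y = z = 0.5
        for _ in range(50):                      # iterate until the decisions settle
            x = best_response(cost_iaas, lambda g: (g, y, z))
            y = best_response(cost_saas, lambda g: (x, g, z))
            z = best_response(cost_naas, lambda g: (x, y, g))
        print("equilibrium decisions:", round(x, 3), round(y, 3), round(z, 3))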

    How to Place Your Apps in the Fog -- State of the Art and Open Challenges

    Fog computing aims at extending the Cloud towards the IoT so as to achieve improved QoS and to empower latency-sensitive and bandwidth-hungry applications. The Fog calls for novel models and algorithms to distribute multi-service applications in such a way that data processing occurs wherever it is best placed, based on both functional and non-functional requirements. This survey reviews the existing methodologies to solve the application placement problem in the Fog, while pursuing three main objectives. First, it offers a comprehensive overview of the currently employed algorithms, the availability of open-source prototypes, and the size of test use cases. Second, it classifies the literature based on the application and Fog infrastructure characteristics that are captured by the available models, with a focus on the considered constraints and the optimised metrics. Finally, it identifies some open challenges in application placement in the Fog.
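
    For readers new to the problem this survey addresses, here is a toy formulation of Fog application placement as a feasibility check: each service must land on a node with enough CPU left and an acceptable latency. All names and numbers are hypothetical, and the placement models in the surveyed literature add bandwidth, cost and QoS constraints plus smarter search than the brute force used here.

        from itertools import product

        services = {"ingest": {"cpu": 2, "max_latency": 10},
                    "analytics": {"cpu": 4, "max_latency": 50}}
        nodes = {"edge-gw": {"cpu": 4, "latency": 5},
                 "fog-node": {"cpu": 8, "latency": 20},
                 "cloud": {"cpu": 64, "latency": 80}}

        def feasible(placement):
            # Check latency bounds per service and aggregate CPU capacity per node.
            used = {n: 0 for n in nodes}
            for svc, node in placement.items():
                if nodes[node]["latency"] > services[svc]["max_latency"]:
                    return False
                used[node] += services[svc]["cpu"]
            return all(used[n] <= nodes[n]["cpu"] for n in nodes)

        # Exhaustive search over all service-to-node mappings (fine at toy scale).
        candidates = (dict(zip(services, combo)) for combo in product(nodes, repeat=len(services)))
        print([p for p in candidates if feasible(p)])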

    Demand Response Management in Smart Grid Networks: a Two-Stage Game-Theoretic Learning-Based Approach

    In this diploma thesis, the combined problem of power company selection and Demand Response Management in a Smart Grid Network consisting of multiple power companies and multiple customers is studied by adopting a distributed learning and game-theoretic technique. Each power company is characterized by its reputation and competitiveness. The customers, who act as learning automata, select the most appropriate power company to be served by, in terms of price and fulfillment of their electricity needs, via a distributed learning-based mechanism. Given the customers' power company selection, the Demand Response Management problem is formulated as a two-stage game-theoretic optimization framework, where in the first stage the customers' optimal electricity consumption is determined and in the second stage the power companies' optimal pricing is calculated. The output of the Demand Response Management problem feeds the learning system in order to build knowledge and converge to the optimal power company selection. A two-stage Power Company learning selection and Demand Response Management (PC-DRM) iterative algorithm is proposed in order to realize the distributed learning-based power company selection and the two-stage distributed Demand Response Management framework. The performance of the proposed approach is evaluated via modeling and simulation, and its superiority over other state-of-the-art approaches is illustrated.
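
    The learning-automata side of this approach can be sketched with a standard linear reward-inaction update, shown below with a made-up reward signal in place of the feedback the thesis derives from the Demand Response Management stage; the company count, learning rate and "quality" values are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        n_companies, learning_rate = 3, 0.1
        probs = np.full(n_companies, 1.0 / n_companies)   # initial selection probabilities
        quality = np.array([0.4, 0.7, 0.9])               # hidden, normalised reward per company

        for _ in range(2000):
            choice = rng.choice(n_companies, p=probs)
            reward = quality[choice]                      # would come from the DRM outcome
            # Linear reward-inaction: reinforce the chosen company in proportion to the reward.
            probs = probs - learning_rate * reward * probs
            probs[choice] += learning_rate * reward
            probs /= probs.sum()                          # guard against round-off drift
        print("selection probabilities:", probs.round(3))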

    User Oriented Resource Management with Virtualization: A Hierarchical Game Approach

    The explosive advancements in mobile Internet and the Internet of Things challenge network capacity and architecture. The ossification of wireless networks hinders the further evolution towards the fifth generation of mobile communication systems. Ultra-dense small cell networks are considered a feasible way to meet high-capacity demands. Meanwhile, ultra-dense small cell network virtualization also offers an insightful perspective for this evolution because of