
    Mobility-aware application scheduling in fog computing

    Fog computing provides a distributed infrastructure at the edges of the network, resulting in low-latency access and faster responses to application requests than centralized clouds. With this new level of computing capacity introduced between users and the data center-based clouds, new forms of resource allocation and management can be developed to take advantage of the Fog infrastructure. A wide range of applications with different requirements run on end-user devices, and with the popularity of cloud computing many of them rely on remote processing or storage. As clouds are primarily delivered through centralized data centers, such remote processing or storage usually takes place at a single location that hosts user applications and data. The distributed capacity provided by Fog computing allows execution and storage to be performed at different locations. The combination of distributed capacity, the range and types of user applications, and the mobility of smart devices requires resource management and scheduling strategies that take all of these factors into account. We analyze the scheduling problem in Fog computing, focusing on how user mobility can influence application performance and how three different scheduling policies, namely concurrent, FCFS, and delay-priority, can be used to improve execution based on application characteristics.
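    A minimal sketch of how the three policies named above could be expressed, assuming a simple model where each function returns an execution plan of (request, start time) pairs for a single fog node; the App class, its fields, and the service model are illustrative assumptions, not the paper's formulation:

```python
# Illustrative comparison of concurrent, FCFS, and delay-priority scheduling
# on one fog node. All names and fields are hypothetical.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class App:
    name: str
    service_time: float      # time the request occupies the node
    delay_sensitivity: int   # lower value = more delay-sensitive

def schedule_concurrent(apps: List[App]) -> List[Tuple[App, float]]:
    """All requests start immediately and share the node's capacity."""
    return [(a, 0.0) for a in apps]

def schedule_fcfs(apps: List[App]) -> List[Tuple[App, float]]:
    """Requests run one after another in arrival order."""
    start, plan = 0.0, []
    for a in apps:
        plan.append((a, start))
        start += a.service_time
    return plan

def schedule_delay_priority(apps: List[App]) -> List[Tuple[App, float]]:
    """Like FCFS, but the most delay-sensitive requests go first."""
    return schedule_fcfs(sorted(apps, key=lambda a: a.delay_sensitivity))
```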

    Security Issues in Service Model of Fog Computing Environment

    Fog computing is an innovative way to extend the cloud platform by providing computing resources at the edge of the network. A fog platform offers the same data, management, storage and application capabilities as the cloud, but differs in where it is deployed: its nodes are placed at many different locations close to users. Fog nodes can process large amounts of data, operate in the field, run under full load, and be mounted on a variety of hardware devices, which makes the fog framework well suited to latency-critical applications. Although fog computing is similar to cloud computing, its variability creates new security and privacy challenges for fog nodes that go beyond those commonly seen in the cloud. This paper aims to understand the impact of these security problems and how to overcome them, and to provide future security guidance for those responsible for building, upgrading and maintaining fog systems.

    PWRR Algorithm for Video Streaming Process Using Fog Computing

    Video streaming is the most popular medium used by people on the internet today and consumes much of its traffic: nearly 70% of internet usage goes to video streaming. Interactive media nevertheless faces constraints such as increased bandwidth usage and latency, and live video streaming in particular requires real-time transmission. Fog computing, an intermediary layer between the cloud and the end user, has been introduced to alleviate these problems by providing highly responsive, real-time service and computational resources close to the client at the network boundary. This paper proposes a priority weighted round robin (PWRR) algorithm for scheduling streaming operations in the fog architecture, giving preemptive priority to live video requests so that they are delivered with very low response time and real-time communication. Experiments with PWRR in the proposed architecture for video streaming over fog computing show reduced latency and good quality for live video requests under changing bandwidth, while all other client requests are met at the same time.
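    A hedged sketch of how a priority weighted round-robin dispatcher of this kind might look, assuming just two request classes (live video and everything else) with example weights; the class names, weights and queue structure are assumptions, not the paper's PWRR specification:

```python
# Sketch of a priority weighted round-robin (PWRR) dispatcher: live-video
# requests sit in a higher-weight queue that is visited first each round,
# so they are served sooner and more often. Weights are placeholders.
from collections import deque

class PWRRScheduler:
    def __init__(self, weights=None):
        self.weights = weights or {"live": 3, "other": 1}
        self.queues = {name: deque() for name in self.weights}

    def submit(self, request, is_live: bool) -> None:
        self.queues["live" if is_live else "other"].append(request)

    def next_round(self) -> list:
        """Drain up to `weight` requests from each queue, highest weight first."""
        batch = []
        for name in sorted(self.weights, key=self.weights.get, reverse=True):
            queue = self.queues[name]
            for _ in range(self.weights[name]):
                if not queue:
                    break
                batch.append(queue.popleft())
        return batch
```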

    PASHE: Privacy Aware Scheduling in a Heterogeneous Fog Environment

    Fog computing extends the functionality of the traditional cloud data center (cdc) using micro data centers (mdcs) located at the edge of the network. These mdcs provide both computation and storage to applications. Their proximity to users makes them a viable option for executing jobs with tight deadlines and latency constraints. Moreover, these mdcs may have diverse execution capacities, i.e. heterogeneous architectures, which implies that tasks may have variable execution costs on different mdcs. We propose PASHE (Privacy Aware Scheduling in a Heterogeneous Fog Environment), an algorithm that schedules privacy-constrained real-time jobs on heterogeneous mdcs and the cdc. Three categories of tasks are considered: private, semi-private and public. Private tasks with tight deadlines are executed on the user's local mdc. Semi-private tasks with tight deadlines are executed on “preferred” remote mdcs. Public tasks with loose deadlines are sent to the cdc for execution. We also take user mobility across different mdcs into account: if the mobility pattern of users is predictable, PASHE reserves computation resources on remote mdcs for job execution. Simulation results show that PASHE outperforms other scheduling algorithms in a fog computing environment when mdc heterogeneity, user mobility and application security are taken into account.
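    The placement rule described in the abstract can be summarised in a few lines; the sketch below assumes a fixed deadline threshold and placeholder target names (local_mdc, preferred_mdc, cdc), since the paper's exact parameters are not given here:

```python
# Routing rule for the three task categories described above. The deadline
# threshold and target identifiers are placeholders, not values from PASHE.
from enum import Enum

class Privacy(Enum):
    PRIVATE = 1
    SEMI_PRIVATE = 2
    PUBLIC = 3

def place_task(privacy: Privacy, deadline_ms: float, tight_ms: float = 50.0) -> str:
    tight = deadline_ms <= tight_ms
    if privacy is Privacy.PRIVATE and tight:
        return "local_mdc"       # privacy-critical, latency-critical: stay local
    if privacy is Privacy.SEMI_PRIVATE and tight:
        return "preferred_mdc"   # trusted remote micro data center
    return "cdc"                 # public or loose-deadline work goes to the cloud
```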

    Scheduling the Execution of Tasks at the Edge

    The Internet of Things provides a huge infrastructure where numerous devices produce, collect and process data. These data are the basis for offering analytics to support novel applications. Processing huge volumes of data is demanding, so the power of the Cloud is already utilized. However, latency, privacy and the other drawbacks of this centralized approach became the motivation for the emergence of edge computing. In edge computing, data can be processed at the edge of the network, at the IoT nodes, to deliver immediate results. Due to the limited resources of IoT nodes, it is not possible to execute a large number of demanding tasks locally to support applications. In this paper, we propose a scheme for selecting the most significant tasks to be executed at the edge while the remaining tasks are transferred to the Cloud. Our distributed scheme focuses on mobile IoT nodes and provides a decision-making mechanism and an optimization module for selecting the tasks that will be executed locally. We take multiple task characteristics into consideration and optimize the final decision. With our mechanism, IoT nodes can adapt to possibly unknown environments by evolving their decision making. We evaluate the proposed scheme through a large number of simulations and report numerical results.
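    One way to picture the selection step is a greedy split under a local capacity budget: rank tasks by significance, keep what fits on the node, and offload the rest. The scoring, capacity model and field names below are assumptions for illustration, not the paper's optimization module:

```python
# Greedy sketch: execute the most significant tasks locally up to the node's
# capacity, offload everything else to the Cloud. Names are hypothetical.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Task:
    name: str
    demand: float        # local resources the task would consume
    significance: float  # e.g. urgency combined with value of a local result

def split_tasks(tasks: List[Task], capacity: float) -> Tuple[List[Task], List[Task]]:
    local, offload, used = [], [], 0.0
    for t in sorted(tasks, key=lambda t: t.significance, reverse=True):
        if used + t.demand <= capacity:
            local.append(t)
            used += t.demand
        else:
            offload.append(t)
    return local, offload
```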

    Computation Offloading and Scheduling in Edge-Fog Cloud Computing

    Resource allocation and task scheduling in the Cloud environment face many challenges, such as time delay, energy consumption, and security. Executing the computation tasks of mobile applications on mobile devices (MDs) also requires a lot of resources, so these tasks can be offloaded to the Cloud. However, the Cloud is far from MDs and suffers from high delay and power consumption. Edge computing, which processes data near the Internet of Things (IoT) devices, reduces the delay to some extent, but at the cost of distance from the Cloud's resources. Fog computing (FC), positioned between the sensors and the Cloud, increases speed and reduces energy consumption, which makes it suitable for IoT applications. In this article, we review resource allocation and task scheduling methods in Cloud, Edge and Fog environments, covering traditional, heuristic, and meta-heuristic approaches. We also categorize the research related to task offloading in Mobile Cloud Computing (MCC), Mobile Edge Computing (MEC), and Mobile Fog Computing (MFC). Our categorization criteria include the issue addressed, the proposed strategy, the objectives, the framework, and the test environment.
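    The categorization criteria listed at the end of the abstract can be captured as a simple record per surveyed work; the field names follow the abstract, while the example values are placeholders only:

```python
# One record per surveyed paper, using the abstract's categorization criteria.
# Example values are placeholders, not results from the survey.
from dataclasses import dataclass

@dataclass
class SurveyEntry:
    issue: str             # e.g. "task offloading in MEC"
    strategy: str          # traditional, heuristic, or meta-heuristic
    objectives: str        # e.g. "minimize delay and energy consumption"
    framework: str         # proposed framework or system model
    test_environment: str  # e.g. "simulation" or "real testbed"

example = SurveyEntry(
    issue="task offloading in MFC",
    strategy="meta-heuristic",
    objectives="reduce latency and energy consumption",
    framework="placeholder framework",
    test_environment="simulation",
)
```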