A Review on Energy Consumption Optimization Techniques in IoT Based Smart Building Environments
In recent years, due to the unnecessary wastage of electrical energy in
residential buildings, energy optimization and user comfort have gained
vital importance. In the literature, various techniques have been
proposed addressing the energy optimization problem. The goal of each technique
was to maintain a balance between user comfort and energy requirements such
that the user can achieve the desired comfort level with the minimum amount of
energy consumption. Researchers have addressed the issue with the help of
different optimization algorithms and variations in the parameters to reduce
energy consumption. To the best of our knowledge, this problem is not solved
yet due to its challenging nature. The gap in the literature stems from
advancements in technology, drawbacks of existing optimization algorithms, and
the introduction of new ones. Further, many newly proposed optimization
algorithms have produced better accuracy on benchmark instances but have not
yet been applied to the optimization of energy consumption in smart homes. In
this paper, we have carried out a
detailed literature review of the techniques used for the optimization of
energy consumption and scheduling in smart homes. The detailed discussion has
been carried out on different factors contributing towards thermal comfort,
visual comfort, and air quality comfort. We have also reviewed the fog and edge
computing techniques used in smart homes.
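The comfort-versus-energy balance described above is commonly posed as a weighted objective that an optimization algorithm then minimizes. The following sketch is purely illustrative; the setpoint, weight, and candidate values are assumptions, not figures from any reviewed paper.

```python
# Hypothetical comfort-vs-energy trade-off objective. All names, weights,
# and values here are illustrative assumptions.

def comfort_error(setpoint: float, actual: float) -> float:
    """Squared deviation from the user's desired setpoint (e.g. temperature)."""
    return (setpoint - actual) ** 2

def objective(actual_temp: float, power_kw: float,
              setpoint: float = 22.0, alpha: float = 0.7) -> float:
    """Weighted sum: alpha favors comfort, (1 - alpha) penalizes energy use."""
    return alpha * comfort_error(setpoint, actual_temp) + (1 - alpha) * power_kw

# An optimizer would search over control actions to minimize this objective;
# here we simply compare three candidate (resulting temperature, power) pairs.
candidates = [(20.0, 0.5), (22.0, 1.2), (24.0, 2.0)]
best = min(candidates, key=lambda c: objective(*c))
```

In the surveyed approaches the search over candidates would be performed by an optimization algorithm (genetic algorithm, particle swarm, etc.) rather than by enumeration.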
Defuzzification Method for NP-Hard Problem in Cloud
Cloud computing deals with provisioning resources efficiently in accordance with users' needs. Job scheduling is the selection of an ideal resource for a job to be executed, with reference to waiting time, cost, or turnaround time. Cloud job scheduling is an NP-hard problem that contains n jobs and m machines; each job is processed on each of these m machines so as to minimize the makespan. Security is one of the topmost concerns in the cloud. In order to calculate the fitness value, the fuzzy inference system makes use of the membership function to determine the degree to which the input parameters belonging to each fuzzy set are relevant. Here, fuzzy logic is employed for the purpose of scheduling energy as well as security in cloud computing.
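The membership-function idea in this abstract can be sketched as follows. The triangular shape, the set boundaries, and the way memberships are combined into a fitness value are all assumptions for illustration; a full system would also apply defuzzification (e.g. the centroid method) to recover a crisp output.

```python
# Illustrative fuzzy membership and fitness sketch. Set boundaries and the
# averaging rule are assumptions, not taken from the paper.

def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership: degree rises from a to peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fitness(waiting_time: float, cost: float) -> float:
    """Toy fitness: average membership of each input in its 'good' fuzzy set."""
    good_wait = triangular(waiting_time, 0.0, 10.0, 30.0)  # short waits preferred
    good_cost = triangular(cost, 0.0, 5.0, 20.0)           # low cost preferred
    return (good_wait + good_cost) / 2.0
```

A scheduler would compare candidate job-to-machine assignments by this fitness and pick the highest-scoring one.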
Fuzzy logic-based algorithm resource scheduling for improving the reliability of cloud computing
Cloud computing is an important infrastructure for distributed systems with the main objective of reducing the use
of resources. In a cloud environment, users may face thousands of resources to run each task. However, allocation
of resources to tasks by the user is an impossible endeavor. Accurate scheduling of system resources results in
their optimal use as well as an increase in the reliability of cloud computing. This study designed a system based
on fuzzy logic, followed by the introduction of an efficient and precise algorithm for scheduling resources to
improve the reliability of cloud computing. Waiting and turnaround times of the proposed method were
compared to those of previous works. In the proposed method, the waiting time is equal to 26.99 and the turnaround
time is equal to 82.99. According to the results, the proposed method outperforms other methods in terms of
waiting time and turnaround time as well as accuracy.
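The waiting-time and turnaround-time figures reported in such work are typically computed from a schedule as follows: waiting time is how long a task sits queued before starting, and turnaround time is waiting time plus the task's own execution (burst) time. The burst times below are made-up inputs, not data from the paper.

```python
# Sketch of the standard scheduling metrics; burst times are invented.

def schedule_metrics(burst_times):
    """Average waiting and turnaround times for tasks run in the given order."""
    waiting, turnaround, elapsed = [], [], 0
    for burst in burst_times:
        waiting.append(elapsed)        # time spent queued before starting
        elapsed += burst
        turnaround.append(elapsed)     # completion time = waiting + burst
    n = len(burst_times)
    return sum(waiting) / n, sum(turnaround) / n

# First-come-first-served order on three toy bursts.
avg_wait, avg_turnaround = schedule_metrics([24, 3, 3])
```

A fuzzy scheduler would aim to choose an ordering (or resource assignment) that lowers both averages relative to a baseline such as FCFS.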
A Review on Computational Intelligence Techniques in Cloud and Edge Computing
Cloud computing (CC) is a centralized computing paradigm that accumulates resources centrally and provides these resources to users through the Internet. Although CC holds a large number of resources, it may not be acceptable to real-time mobile applications, as it is usually far away from users geographically. On the other hand, edge computing (EC), which distributes resources to the network edge, enjoys increasing popularity in applications with low-latency and high-reliability requirements. EC provides resources in a decentralized manner, which can respond to users' requirements faster than normal CC, but with limited computing capacities. As both CC and EC are resource-sensitive, several big issues arise, such as how to conduct job scheduling, resource allocation, and task offloading, which significantly influence the performance of the whole system. To tackle these issues, many optimization problems have been formulated. These optimization problems usually have complex properties, such as non-convexity and NP-hardness, which may not be addressed by traditional convex optimization-based solutions. Computational intelligence (CI), consisting of a set of nature-inspired computational approaches, has recently exhibited great potential in addressing these optimization problems in CC and EC. This article provides an overview of research problems in CC and EC and recent progress in addressing them with the help of CI techniques. Informative discussions and future research trends are also presented, with the aim of offering insights to the readers and motivating new research directions.
An Enhanced Model for Job Sequencing and Dispatch in Identical Parallel Machines
This paper develops an efficient scheduling model that is robust and minimizes the total completion time for jobs on identical parallel machines. The new model employs a Genetic-Fuzzy technique for job sequencing and dispatch in identical parallel machines. It uses a genetic algorithm technique to develop a job scheduler that performs the job sequencing and optimization, while a fuzzy logic technique was used to develop a job dispatcher that dispatches jobs to the identical parallel machines. The methodology used for the design is the Object Oriented Analysis and Design Methodology (OOADM), and the system was implemented using C# and the .NET framework. The model was tested with fifteen identical parallel machines used for printing. The parameters used in analyzing this model include the job scheduling length, average execution time, load balancing, and machine utilization. The results generated from the developed model were compared with the results of other job scheduling models, such as the First Come First Serve (FCFS) scheduling approach and the Genetic Algorithm (GA) scheduling approach. The results of the new model show better load balancing and higher machine utilization among the individual machines when compared with the First Come First Serve (FCFS) and Genetic Algorithm (GA) scheduling models.
Keywords: Parallel Machines, Genetic Model, Job Scheduler, Fuzzy Logic Technique, Load Balancing, Machines Utilization
DOI: 10.7176/CEIS/11-2-05
Publication date: March 31st 202
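The paper's own dispatcher is Genetic-Fuzzy; as a simpler, hedged illustration of the same identical-parallel-machine dispatch problem, the sketch below uses the classic Longest Processing Time (LPT) greedy rule: assign each job, longest first, to the currently least-loaded machine, which tends to balance load and keep the makespan (maximum machine load) low. The job times are invented inputs.

```python
# LPT greedy dispatch on identical parallel machines (illustrative stand-in
# for the paper's Genetic-Fuzzy dispatcher; job times are invented).
import heapq

def lpt_dispatch(job_times, n_machines):
    """Return (per-machine job lists, makespan) after LPT assignment."""
    loads = [(0, m) for m in range(n_machines)]   # (current load, machine id)
    heapq.heapify(loads)
    assignment = [[] for _ in range(n_machines)]
    for job in sorted(job_times, reverse=True):   # longest job first
        load, m = heapq.heappop(loads)            # least-loaded machine
        assignment[m].append(job)
        heapq.heappush(loads, (load + job, m))
    return assignment, max(load for load, _ in loads)

jobs = [7, 5, 4, 3, 3, 2]
plan, makespan = lpt_dispatch(jobs, 3)
```

A GA-based scheduler explores many job orderings rather than committing to one greedy order, but both are evaluated by the same makespan and load-balancing measures.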
Optimized Deep Learning Schemes for Secured Resource Allocation and Task Scheduling in Cloud Computing - A Survey
Scheduling involves allocating shared resources over time so that tasks can be completed within a predetermined time frame. In Task Scheduling (TS) and Resource Allocation (RA), the term is applied independently to tasks and to resources. Scheduling is widely used in Cloud Computing (CC), computer science, and operational management. Effective scheduling ensures that systems operate efficiently, decisions are made effectively, resources are used efficiently, costs are kept to a minimum, and productivity is increased. High energy consumption, low CPU utilization, time consumption, and low robustness are the most frequent problems in TS and RA in CC. In this survey, RA and TS based on deep learning (DL) and machine learning (ML) are discussed. Additionally, we look into the methods employed by DL-based RA and TS in CC, and the benefits and drawbacks of each are explored. The work's primary contribution is an analysis and assessment of DL-based RA and TS methodologies that pinpoint problems with cloud computing.