
    Non-intrusive load monitoring and its challenges in a NILM system framework

    With increasing energy demand and rising electricity prices, researchers have taken a growing interest in residential load monitoring. To feed back each individual appliance's energy consumption instead of only the whole-house consumption, non-intrusive load monitoring (NILM) is a good choice for residents who want to respond to time-of-use pricing and save electricity. In this paper, we discuss the system framework of NILM and analyse the challenges in every module. In addition, we study and compare the public datasets and accuracy metrics of non-intrusive load monitoring techniques.
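    The event-detection module of a NILM framework can be illustrated with a minimal sketch (hypothetical readings and threshold, not taken from the paper): appliance on/off events appear as step changes in the aggregate power signal.

    ```python
    def detect_events(power, threshold=30.0):
        """Flag indices where aggregate power jumps by more than `threshold` watts.

        Each flagged index is a candidate appliance on/off event; a full NILM
        pipeline would then match the step size against known appliance signatures.
        """
        events = []
        for i in range(1, len(power)):
            delta = power[i] - power[i - 1]
            if abs(delta) > threshold:
                events.append((i, delta))  # (sample index, signed power step)
        return events

    # Aggregate power: a ~60 W appliance turns on at index 3 and off at index 6.
    readings = [100.0, 102.0, 101.0, 161.0, 160.0, 162.0, 101.0, 100.0]
    print(detect_events(readings))  # [(3, 60.0), (6, -61.0)]
    ```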

    Grid Routing: An Energy-Efficient Routing Protocol for WSNs with Single Mobile Sink

    In a traditional wireless sensor network with static sinks, sensor nodes close to the sink drain their batteries faster than other nodes because of the increased data traffic towards the sink. These heavily loaded nodes tend to become hotspots, so such networks may collapse prematurely once the sink becomes unreachable for remote nodes. To mitigate this problem, sink mobility has been proposed: it provides load-balanced data delivery and uniform energy dissipation by shifting the hotspots. However, keeping the network updated with the mobile sink's latest location introduces high communication overhead. In this paper, we propose Grid Routing, an energy-efficient mobile-sink routing protocol that aims to decrease the overhead of advertising the sink's position and to balance local energy dissipation in a non-uniform network. Simulation results indicate that Grid Routing performs better than existing work.

    A survey on rainfall forecasting using artificial neural network

    Rainfall has a great impact on agriculture and on people's daily travel, so accurate precipitation prediction is well worth studying. Traditional methods such as numerical weather prediction (NWP) models or statistical models cannot deliver satisfactory rainfall forecasts because of the nonlinear and dynamic characteristics of precipitation. An artificial neural network (ANN), however, can capture complicated nonlinear relationships between variables, which makes it suitable for precipitation prediction. This paper introduces background knowledge of ANNs and several neural-network algorithms applied to precipitation prediction in recent years. It has been shown that neural networks can greatly improve the accuracy and efficiency of prediction.
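    The core ANN idea can be sketched with a tiny one-input network trained by gradient descent (toy data mapping a normalized humidity index to a rainfall amount; this illustrates the general technique, not any specific model from the survey).

    ```python
    import math
    import random

    def train_rainfall_net(samples, hidden=3, lr=0.1, epochs=3000):
        """Train a tiny 1-input network (tanh hidden layer, linear output)
        by per-sample gradient descent. Returns the learned parameters."""
        random.seed(0)
        w1 = [random.uniform(-1.0, 1.0) for _ in range(hidden)]
        b1 = [0.0] * hidden
        w2 = [random.uniform(-1.0, 1.0) for _ in range(hidden)]
        b2 = 0.0
        for _ in range(epochs):
            for x, y in samples:
                h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
                pred = sum(w2[j] * h[j] for j in range(hidden)) + b2
                err = pred - y  # gradient of 0.5 * (pred - y)**2 w.r.t. pred
                for j in range(hidden):
                    grad_pre = err * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
                    w2[j] -= lr * err * h[j]
                    w1[j] -= lr * grad_pre * x
                    b1[j] -= lr * grad_pre
                b2 -= lr * err
        return w1, b1, w2, b2

    def predict(params, x):
        w1, b1, w2, b2 = params
        h = [math.tanh(wj * x + bj) for wj, bj in zip(w1, b1)]
        return sum(wj * hj for wj, hj in zip(w2, h)) + b2

    # Toy nonlinear relation (y = x**2) between humidity index and rainfall.
    data = [(0.0, 0.0), (0.25, 0.0625), (0.5, 0.25), (0.75, 0.5625), (1.0, 1.0)]
    params = train_rainfall_net(data)
    ```

    The nonlinear tanh units are what let the network fit a curved relationship that a plain linear (statistical) model would miss, which is the advantage the survey attributes to ANNs.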

    CACA-UAN: a context-aware communication approach to efficient and reliable underwater acoustic sensor networks

    Underwater Acoustic Sensor Networks (UANs) have recently emerged as a promising technology for both military and civil applications, where communication between devices is crucial yet challenging due to the unique characteristics of the underwater acoustic environment, such as high latency and low bandwidth. In this paper, context awareness is applied to the design of an underwater communication approach, called the Context-Aware Communication Approach for a UAN (CACA-UAN), which aims to improve the overall performance of underwater communication. According to the results, the proposed CACA-UAN can increase the efficiency and reliability of the underwater communication system.

    Outlier Detection of Time Series with A Novel Hybrid Method in Cloud Computing

    With the development of science and technology, Cloud Computing has attracted attention in many different fields. Meanwhile, outlier detection for data mining in Cloud Computing plays an increasingly significant role across research domains, and massive research effort has been devoted to it, including distance-based, density-based and clustering-based outlier detection. However, the existing methods require high computation time. Therefore, an improved outlier detection algorithm with higher performance is presented. The proposed method, an improved spectral clustering algorithm (SKM++), is well suited to handling outliers. Pruning the data reduces computational complexity, and the distance-based Manhattan Distance (distm) is combined with it to obtain an outlier score. Finally, the method confirms outliers by extreme analysis. We validate the presented method by experiments on real data collected by sensors and by comparison against existing approaches; the experimental results show that our proposed method outperforms the existing ones.
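    The distance-based scoring component of such a pipeline can be sketched as follows (a generic k-nearest-neighbour outlier score using Manhattan distance on invented data; the paper's SKM++ clustering and pruning steps are not reproduced here).

    ```python
    def manhattan(a, b):
        """Manhattan (L1) distance between two equal-length points."""
        return sum(abs(x - y) for x, y in zip(a, b))

    def outlier_scores(points, k=2):
        """Score each point by the mean Manhattan distance to its k nearest
        neighbours; higher scores mark points far from any dense region."""
        scores = []
        for i, p in enumerate(points):
            dists = sorted(manhattan(p, q) for j, q in enumerate(points) if j != i)
            scores.append(sum(dists[:k]) / k)
        return scores

    # Three tightly clustered sensor readings and one far-away outlier.
    data = [(1.0, 1.0), (1.1, 0.9), (0.9, 1.1), (8.0, 8.0)]
    scores = outlier_scores(data)
    print(scores.index(max(scores)))  # index 3 is the outlier
    ```

    In the paper's setting, clustering first and pruning clearly inlying points would shrink the candidate set before this O(n²) scoring step, which is where the claimed reduction in computation time comes from.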

    A method for electric load data verification and repair in home environment

    Home energy management (HEM) and the smart home have become popular; HEM collects and analyses electric load data to make power use safe, reliable, economical, efficient and environmentally friendly. Without correct data, correct decisions and plans cannot be made, so data quality is of great importance. This paper focuses on the verification and repair of electric load data in the home environment. Given the irregularity of modern lifestyles, this paper proposes a system with an 'N + 1' framework to handle this properly. The system collects information from every appliance and from the power bus so that the readings verify each other; it can thus resolve the stochastic uncertainty problem and check whether the data is correct, ensuring data quality. During data upload, many factors, such as smart meter malfunctions and communication failures, can corrupt the data. To repair the wrong data, we propose a method called LBboosting, which integrates two curve-fitting methods. As the results show, the method performs better than up-to-date methods.
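    The 'N + 1' cross-verification idea, checking the bus (whole-house) reading against the sum of the N per-appliance readings, can be sketched as below (hypothetical tolerance and readings, not the paper's exact procedure).

    ```python
    def verify_reading(appliance_watts, bus_watts, tolerance=5.0):
        """Cross-check the bus meter against the sum of individual appliance meters.

        Returns (is_consistent, residual). A residual beyond `tolerance` suggests
        a faulty meter reading or a communication error that should be repaired.
        """
        residual = bus_watts - sum(appliance_watts)
        return abs(residual) <= tolerance, residual

    # Three appliance meters (60 W + 150 W + 800 W = 1010 W) vs. the bus meter.
    print(verify_reading([60.0, 150.0, 800.0], 1012.0))  # (True, 2.0)
    print(verify_reading([60.0, 150.0, 800.0], 1350.0))  # (False, 340.0)
    ```

    When the check fails, a repair step (the paper's LBboosting curve fitting) would reconstruct the suspect reading from the remaining consistent ones.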

    An optimized Speculative Execution Strategy Based on Local Data Prediction in Heterogeneous Hadoop Environment

    Hadoop is a well-known parallel computing framework for processing large-scale data, but a "straggling task" in the Hadoop framework can seriously affect it. Speculative execution (SE) is an effective way to deal with straggling tasks: it monitors the real-time rate of running tasks and backs up the "straggler" on another node, increasing the chance that the backup finishes before the original. Existing SE strategies suffer from problems such as misjudging straggling tasks and selecting backup nodes improperly, which make SE inefficient. In this paper, we propose an optimized SE strategy based on local data prediction: it collects task execution information in real time, uses local regression to predict the remaining time of the current task, selects an appropriate backup node according to the actual requirements, and applies a cost-benefit model to maximize the effectiveness of SE. The strategy is implemented in Hadoop-2.6.0, and experiments show that it not only improves the accuracy of selecting straggler candidates but also performs better in a heterogeneous Hadoop environment.
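    The remaining-time prediction step can be illustrated with a least-squares line fitted over recent (time, progress) samples and extrapolated to 100% progress (a plain linear fit on invented numbers, standing in for the paper's local regression model).

    ```python
    def predict_remaining(samples):
        """Fit progress = rate * t + b over (time, progress) samples and
        extrapolate the time at which progress reaches 1.0 (task completion)."""
        n = len(samples)
        mt = sum(t for t, _ in samples) / n
        mp = sum(p for _, p in samples) / n
        num = sum((t - mt) * (p - mp) for t, p in samples)
        den = sum((t - mt) ** 2 for t, _ in samples)
        rate = num / den                 # progress per second
        b = mp - rate * mt
        finish_time = (1.0 - b) / rate   # t where predicted progress hits 1.0
        return finish_time - samples[-1][0]

    # A task observed every 10 s, progressing 5 percentage points per interval.
    obs = [(0, 0.10), (10, 0.15), (20, 0.20), (30, 0.25)]
    print(predict_remaining(obs))  # roughly 150 seconds to completion
    ```

    A scheduler would compare this predicted remaining time against the expected duration of a fresh backup copy; only when the backup is predicted to finish sooner is speculative execution worth its extra resource cost.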

    An adaptive approach to better load balancing in a consumer-centric cloud environment

    Pay-as-you-consume, a new cloud computing paradigm, has become increasingly popular as a large number of cloud services open up to consumers. It offers consumers great convenience: users no longer need to buy their own hardware resources, but they must deal effectively with data from the cloud. Improving the performance of the cloud platform as a consumer-centric cloud computing model therefore becomes a critical issue. Existing heterogeneous distributed computing systems provide efficient parallelism and highly fault-tolerant, reliable services, owing to their management of large-scale clusters. Although the latest cloud computing clusters meet the need for faster job execution, making more effective use of computing resources remains a challenge. Previously proposed methods concentrate on improving the execution time of incoming jobs, e.g., shortening the MapReduce (MR) time. In this paper, an adaptive scheme is offered to achieve time and space efficiency in a heterogeneous cloud environment. A dynamic speculative execution strategy with real-time management of cluster resources is presented to optimize the execution time of the Map phase, and a prediction model is used for fast prediction of task execution time. Combining the prediction model with a multi-objective optimization algorithm yields an adaptive solution that optimizes space-time performance. Experimental results show that the proposed scheme can allocate tasks evenly and improve work efficiency in a heterogeneous cluster.