    Efficient consensus algorithm for the accurate faulty node tracking with faster convergence rate in a distributed sensor network

    This article was published in the EURASIP Journal on Wireless Communications and Networking (© 2016, published by Springer International Publishing); the definitive version is available at http://dx.doi.org/10.1186/s13638-016-0698-x and at http://jwcn.eurasipjournals.springeropen.com/articles/10.1186/s13638-016-0698-x. One of the challenging issues in a distributed computing system is reaching a decision in the presence of many faulty nodes. Faulty nodes may record wrong information, report misleading results, or run on depleted battery power. Consensus algorithms allow the correct nodes to reach a decision despite the faulty ones: every correct node proposes a value, and if all correct nodes propose the same value, all of them decide on it. Every correct node must agree on the same value, while faulty nodes need not reach the decision the correct nodes agreed on. The binary consensus algorithm and the average consensus algorithm are the most widely used consensus algorithms in distributed systems. We apply both algorithms in a distributed sensor network containing faulty nodes and evaluate their convergence rate and error rate. © 2016, The Author(s).
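    As an illustration of the averaging update such algorithms perform, the Python sketch below runs synchronous average consensus on a small network containing one stuck-at faulty node; the topology, step size, and fault model are assumptions made for this example, not details taken from the article.

    import numpy as np

    def average_consensus(x0, neighbors, faulty=frozenset(), eps=0.2, steps=100):
        """Synchronous averaging: each correct node repeatedly moves toward
        the values of its neighbors. A faulty node never updates, so its
        stuck value biases where (and how fast) the network converges."""
        x = np.array(x0, dtype=float)
        for _ in range(steps):
            x_next = x.copy()
            for i in range(len(x)):
                if i in faulty:
                    continue  # faulty node keeps broadcasting a stuck value
                # standard consensus step: x_i += eps * sum_j (x_j - x_i)
                x_next[i] = x[i] + eps * sum(x[j] - x[i] for j in neighbors[i])
            x = x_next
        return x

    # 4-node ring; node 3 is faulty and stuck at 10.0
    nbrs = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
    print(average_consensus([1.0, 2.0, 3.0, 10.0], nbrs, faulty={3}))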

    Time synchronization in wireless sensor networks

    Time synchronization is a basic requirement for many applications in wireless sensor networks (WSNs), e.g., event detection, speed estimation, environment monitoring, data aggregation, target tracking, scheduling, and sensor node cooperation. It also helps save energy, because synchronized nodes can be put into sleep mode. All of the above applications require that the sensor nodes share a common time reference. However, most existing time synchronization protocols deteriorate, or even break down, when the WSN is attacked by malicious intruders. The recently developed maximum and minimum consensus based time synchronization protocol (MMTS) is a promising alternative because it does not depend on any reference node or network topology, but it is vulnerable to message manipulation attacks. This thesis focuses on defending the MMTS protocol in wireless sensor networks against message manipulation attacks. We first investigate the impact of message manipulation attacks on MMTS, and then propose a Secured Maximum and Minimum Consensus based Time Synchronization (SMMTS) protocol to detect and invalidate such attacks.
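    The max/min consensus idea behind MMTS can be sketched as follows: each node tracks the largest and smallest clock reading it has heard and uses their midpoint as the common reference. The Python sketch below is a simplification that omits clock-skew compensation and the attack detection SMMTS adds; the network and clock values are illustrative.

    def mmts_round(state, neighbors):
        """One synchronous round: each node merges its neighbors'
        (max, min) clock estimates into its own."""
        new_state = {}
        for i, (mx, mn) in state.items():
            for j in neighbors[i]:
                mx = max(mx, state[j][0])
                mn = min(mn, state[j][1])
            new_state[i] = (mx, mn)
        return new_state

    clocks = {0: 10.0, 1: 12.5, 2: 9.8, 3: 11.1}   # local readings
    nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # line topology
    state = {i: (c, c) for i, c in clocks.items()}
    for _ in range(len(clocks)):                   # >= diameter rounds
        state = mmts_round(state, nbrs)
    reference = {i: (mx + mn) / 2 for i, (mx, mn) in state.items()}
    print(reference)  # every node agrees on (12.5 + 9.8) / 2 = 11.15

    The vulnerability the thesis addresses is visible even in this toy version: a single forged message carrying an extreme value propagates into every node's maximum or minimum and corrupts the common reference.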

    In-situ Data Analytics In Cyber-Physical Systems

    A Cyber-Physical System (CPS) is an engineered system in which sensing, networking, and computing are tightly coupled with the control of physical entities. To enable security, scalability, and resiliency, new data analytics methodologies are required for computing, monitoring, and optimization in CPS. This work investigates the data analytics challenges in CPS through two case studies: a smart grid and a seismic imaging system. For the smart grid, this work provides a complete system management solution based on novel in-situ data analytics designs. We first propose methodologies for two important power system monitoring tasks: grid topology change detection and power-line outage detection. To address the low measurement redundancy in topology identification, particularly in the low-level distribution network, we develop a maximum a posteriori mechanism that embeds prior information on breaker status to enhance identification accuracy. In power-line outage detection, existing approaches suffer from high computational complexity and the security issues raised by centralized implementation. Instead, this work presents a distributed data analytics framework that carries out in-network processing with low computational complexity, requiring only simple matrix-vector multiplications. To complete the system functionality, we also propose a new power grid restoration strategy that uses data analytics for topology reconfiguration and resource planning after faults or changes. For the seismic imaging system, we develop several innovative in-situ seismic imaging schemes in which each sensor node computes the tomography from its partial information through gossip with its local neighbors. Since the seismic data are generated in a distributed fashion to begin with, this in-situ computing methodology is much more efficient than the conventional approach of collecting the data first and processing it afterwards. The underlying mechanisms avoid bandwidth bottlenecks because all data are processed where they reside and only limited decision information is communicated. Furthermore, the proposed algorithms deliver insights more quickly than the state of the art in seismic imaging, making them promising for real-time in-situ data analytics, which is in high demand in disaster monitoring applications. Through extensive experiments, we demonstrate that the proposed methods achieve near-optimal, high-quality seismic tomography, retain low communication cost, and provide real-time seismic data analytics.
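    As a minimal sketch of the gossip-style in-network processing such schemes build on, the following Python example performs randomized pairwise averaging, in which every node converges to the global mean while exchanging only local messages; the function and its parameters are illustrative assumptions, not the tomography algorithm itself.

    import random

    def gossip_average(values, neighbors, rounds=200, seed=0):
        """Randomized pairwise gossip: at each step a random node and one
        of its neighbors replace both of their values with the pair's
        average, driving all nodes toward the global mean."""
        rng = random.Random(seed)
        x = dict(values)
        nodes = list(x)
        for _ in range(rounds):
            i = rng.choice(nodes)
            j = rng.choice(neighbors[i])
            x[i] = x[j] = (x[i] + x[j]) / 2.0
        return x

    vals = {0: 4.0, 1: 8.0, 2: 1.0, 3: 3.0}       # local partial results
    nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    print(gossip_average(vals, nbrs))             # all values near the mean 4.0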

    High-level Information Fusion for Constrained SMC Methods and Applications

    Information fusion is a field that studies processes for utilizing data from various input sources and techniques for exploiting these data to produce estimates and knowledge about objects and situations. Human computation, on the other hand, is a new and evolving research area that uses human intelligence to solve computational problems beyond the scope of existing artificial intelligence algorithms. In earlier systems, the human role was mostly restricted to analysing a finished fusion product; in current systems, humans are an integral element of a distributed framework in which many tasks can be accomplished by either humans or machines. Moreover, some information can be provided only by humans, because the observational capabilities and opportunities of traditional electronic (hard) sensors are limited. A source-reliability-adaptive distributed non-linear estimation method applicable to a number of distributed state estimation problems is proposed. The method requires only local data exchange among neighbouring sensor nodes and therefore provides enhanced reliability, scalability, and ease of deployment. In particular, by taking into account the estimation reliability of each sensor node at any point in time, it yields more robust distributed estimation. To perform Multi-Model Particle Filtering (MMPF) in an adaptive distributed manner, a Gaussian approximation of the particle cloud obtained at each sensor node, together with a weighted Consensus Propagation (CP)-based distributed data aggregation scheme, is deployed to dynamically re-weight the particle clouds. The filter is a soft-data-constrained variant of the multi-model particle filter, capable of processing both soft human-generated data and conventional hard sensory data. If permanent noise occurs in the estimate provided by a sensor node, due to either a faulty sensing device or misleading soft data, that node's contribution to the weighted consensus process is immediately reduced to limit its effect on the estimates of the neighbouring nodes and the entire network. The robustness of the proposed source-reliability-adaptive distributed estimation method is demonstrated through simulation results for agile target tracking scenarios, where agility refers to cases in which the observed target dynamics deviate from the given probabilistic characterization. The same concept is further applied to a soft-data-constrained multiple-model Probability Hypothesis Density (PHD) filter that can track multiple agile targets with non-linear dynamics, a challenging problem. Here, a Sequential Monte Carlo-Probability Hypothesis Density (SMC-PHD) filter deploys a Random Set (RS) theoretic formulation together with Sequential Monte Carlo approximation, a variant of Bayes filtering. In general, the performance of Bayesian filtering methods can be enhanced by incorporating extra information as constraints on the filtering process. Following this principle, the new approach uses a constrained variant of the SMC-PHD filter, in which a fuzzy logic approach transforms the inherently vague human-generated data into a set of constraints; these constraints are then enforced on the filtering process by applying them as coefficients to the particle weights. Because the human-generated Soft Data (SD) report on target agility level, the proposed constrained-filtering approach can handle multiple agile target tracking scenarios.
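    The reliability-adaptive weighting can be illustrated with a toy fusion step in Python: nodes whose estimates persistently disagree with the rest of the network receive smaller consensus weights. The sketch below is a hypothetical, simplified stand-in for the weighted CP-based aggregation described above, not the thesis's actual scheme.

    import numpy as np

    def reliability_weights(estimates, variances, iters=10):
        """Down-weight unreliable nodes: weight ~ 1 / (reported variance +
        squared deviation from the current reliability-weighted mean),
        iterated to a fixed point."""
        est = np.asarray(estimates, dtype=float)
        var = np.asarray(variances, dtype=float)
        w = 1.0 / var
        for _ in range(iters):
            mean = np.sum(w * est) / np.sum(w)
            w = 1.0 / (var + (est - mean) ** 2)
        mean = np.sum(w * est) / np.sum(w)
        return w / np.sum(w), mean

    # node 3 reports a misleading estimate and is automatically down-weighted
    w, fused = reliability_weights([2.1, 1.9, 2.0, 7.5], [0.1, 0.1, 0.1, 0.1])
    print(np.round(w, 3), round(fused, 2))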

    Statistical Estimation Framework for State Awareness in Microgrids Based on IoT Data Streams

    This paper presents an event-triggered statistical estimation strategy and a data collection architecture for situational awareness (SA) in microgrids. An estimation agent structure based on the event-triggered Kalman filter is proposed and implemented for the state estimation layer of the SA using the Long Range Wide Area Network (LoRaWAN) protocol. A setup has been developed that provides large-scale data collection from smart meters in order to realize an adequate level of SA in microgrids. The ThingsBoard Internet of Things (IoT) platform is used for SA visualization with a customized dashboard. It is shown that, using the developed estimation strategy, an adequate level of SA can be achieved with minimal installation and communication cost while maintaining an accurate average state estimate of the microgrid.
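    A minimal Python sketch of the event-triggered idea for a scalar state, in the spirit of the estimation agents described above: the meter transmits a measurement only when the innovation exceeds a threshold, so the estimator saves communication between events. The model, threshold, and function name are assumptions made for this illustration.

    import numpy as np

    def event_triggered_kf(zs, a=1.0, c=1.0, q=0.01, r=0.1, delta=0.5):
        """Scalar Kalman filter with an event trigger: a measurement is
        used (i.e., would be transmitted) only when |z - c*x_pred| > delta;
        otherwise the estimator just propagates its prediction."""
        x, p = 0.0, 1.0
        sent, estimates = 0, []
        for z in zs:
            x, p = a * x, a * p * a + q           # time update (predict)
            if abs(z - c * x) > delta:            # event trigger at the meter
                k = p * c / (c * p * c + r)       # Kalman gain
                x = x + k * (z - c * x)           # measurement update
                p = (1 - k * c) * p
                sent += 1
            estimates.append(x)
        return estimates, sent

    rng = np.random.default_rng(0)
    true_state = np.cumsum(rng.normal(0.0, 0.1, 100)) + 5.0  # slowly drifting state
    meas = true_state + rng.normal(0.0, 0.3, 100)
    est, sent = event_triggered_kf(meas)
    print(f"transmitted {sent}/100 samples")  # typically well under 100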