Split Distributed Computing in Wireless Sensor Networks
We designed a novel method to improve the performance of distributed computing in wireless sensor networks. The method is intended to increase the speed of distributed computations and to decrease the number of messages required for a network to achieve the desired result. In our analysis, we chose the average consensus algorithm; in this case, the desired result is that every node reaches the average of all the initial values in a reduced number of iterations. Our method is based on the idea that fragmenting a network into small geographical structures, which execute the distributed calculations in parallel, significantly improves the performance
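The baseline algorithm discussed above can be illustrated as follows. This is a minimal sketch of synchronous average consensus on an undirected graph; the graph, the mixing parameter, and the fixed iteration count are illustrative assumptions, not the paper's exact setup, and the fragmentation scheme itself is not reproduced here.

```python
# Minimal average-consensus sketch: each node repeatedly moves its value
# towards its neighbours' values; with a suitable mixing parameter, all
# values converge to the average of the initial values.

def average_consensus(adjacency, values, alpha=0.3, iterations=100):
    """Run synchronous average consensus on an undirected graph.

    adjacency: dict node -> list of neighbour nodes
    values:    dict node -> initial scalar value
    alpha:     mixing parameter (chosen small enough for convergence)
    """
    x = dict(values)
    for _ in range(iterations):
        # Synchronous update: every node uses the previous iteration's values.
        x = {
            i: xi + alpha * sum(x[j] - xi for j in adjacency[i])
            for i, xi in x.items()
        }
    return x

# Example: a 4-node ring; every node should approach the average 2.5.
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
result = average_consensus(ring, {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0})
```

Because the updates are symmetric over each edge, the sum of all values is conserved, so the common limit is exactly the initial average.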
Connectivity-Based Self-Localization in WSNs
Efficient localization methods are among the major challenges in wireless sensor networks today. In this paper, we present a so-called connectivity-based approach, i.e. one based on local connectivity information, to tackle this problem. The method first fragments the network into larger groups labeled as packs. Based on the mutual connectivity relations with their surrounding packs, we identify border nodes as well as the central node. As this first approach requires some a-priori knowledge of the network topology, we also present a novel segment-based fragmentation method that estimates the central pack of the network and detects so-called corner packs without any a-priori knowledge. Based on these detected points, the network is fragmented into a set of even larger elements, so-called segments, built on top of the packs; these supply even more localization information, as they all reach the central node
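One simple connectivity-only primitive of the kind such methods build on is selecting a central node from hop counts alone. The sketch below picks the node with minimum hop-count eccentricity via BFS; it is a simplified stand-in and does not reproduce the paper's pack/segment construction.

```python
from collections import deque

def bfs_hops(adjacency, source):
    """Hop distances from source to every reachable node (plain BFS)."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adjacency[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def central_node(adjacency):
    # Eccentricity of a node = its largest hop distance to any other node;
    # the node with minimum eccentricity is a natural "centre" candidate.
    ecc = {u: max(bfs_hops(adjacency, u).values()) for u in adjacency}
    return min(ecc, key=ecc.get)

# Example: a 5-node path 0-1-2-3-4; node 2 is the hop-count centre.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
```

Note that only connectivity (who can hear whom) is used; no coordinates or ranging measurements are required.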
The Distributed Convergence Classifier Using the Finite Difference
The paper presents a novel distributed classifier of convergence, which detects whether a distributed converging algorithm is converging or diverging. Since this classifier is intended primarily for wireless sensor networks, its design makes provision for the character of these networks. The classifier is based on comparing the forward finite differences from two consecutive iterations. Convergence or divergence is classified only in terms of changes in the inner states of a particular node, and therefore no redundant messages are required for its proper functionality
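The comparison mechanism can be sketched as follows. This is an illustrative reading of the idea, shrinking consecutive forward differences suggest convergence and growing ones divergence, not the paper's exact decision rule.

```python
# Local convergence classifier sketch: compare the magnitudes of the forward
# finite differences of a node's inner state over two consecutive iterations.
# Only the node's own state history is needed, so no extra messages are sent.

def classify(x_prev2, x_prev1, x_curr):
    d_old = x_prev1 - x_prev2   # forward difference at iteration k-2
    d_new = x_curr - x_prev1    # forward difference at iteration k-1
    if abs(d_new) < abs(d_old):
        return "converging"
    if abs(d_new) > abs(d_old):
        return "diverging"
    return "undecided"
```

A geometrically shrinking state sequence such as 1.0, 0.5, 0.25 is classified as converging, while a growing one such as 1.0, 2.0, 4.0 is classified as diverging.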
Impact of Message Losses on Push-Sum Protocol in Chosen Topologies
In this paper, we examine the natural robustness of the push-sum protocol to message losses in a tree, a star, a mesh, a ring, and a link topology. We experimentally verify the impact of this failure on the character of the estimates, the deviation of the final estimate from the real value, and the change in the convergence rate
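The protocol and failure model can be sketched as follows: each node halves its (sum, weight) pair, keeps one half, and sends the other half to a random neighbour; a lost message removes that mass from the system, which is why losses can bias the final estimates. The topology, round count, and loss model below are illustrative assumptions.

```python
import random

def push_sum(adjacency, values, rounds=200, loss_prob=0.0, rng=None):
    """Push-sum averaging; each node's estimate is sum/weight."""
    rng = rng or random.Random(0)
    s = dict(values)                      # running sums
    w = {i: 1.0 for i in adjacency}       # running weights
    for _ in range(rounds):
        inbox_s = {i: 0.0 for i in adjacency}
        inbox_w = {i: 0.0 for i in adjacency}
        for i in adjacency:
            s[i] *= 0.5                   # keep one half ...
            w[i] *= 0.5
            target = rng.choice(adjacency[i])
            if rng.random() >= loss_prob: # ... deliver the other half,
                inbox_s[target] += s[i]   # unless the message is lost
                inbox_w[target] += w[i]
        for i in adjacency:
            s[i] += inbox_s[i]
            w[i] += inbox_w[i]
    return {i: s[i] / w[i] for i in adjacency}

# Example: fully connected 4-node mesh, no losses; estimates approach 2.5.
mesh = {i: [j for j in range(4) if j != i] for i in range(4)}
estimates = push_sum(mesh, {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0})
```

With `loss_prob = 0` the total mass is conserved and every estimate converges to the true average; with a positive loss probability, dropped halves are removed from the system and the estimates deviate from it.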
The Analysis Of The Push-sum Protocol In Various Distributed Systems
In this paper, we focus on an analysis of the push-sum protocol in various topologies. We analyze the behavior of distributed systems forming a tree, a star, a ring, and a fully connected mesh topology. We also examine the influence of the stochastic features of the push-sum protocol on its properties and on the convergence rates in the particular topologies
Evaluation of Natural Robustness of Best Constant Weights to Random Communication Breakdowns
One of the most crucial aspects of algorithm design for wireless sensor networks is failure tolerance. A high natural robustness and an effectively bounded execution time are factors that can significantly optimize the overall energy consumption; therefore, a great emphasis is laid on these aspects in many applications from the area of wireless sensor networks. This paper addresses the robustness of the optimized Best Constant weights of Average Consensus with a stopping criterion (i.e. the algorithm is executed in a finite time), and of their five variations with a lower mixing parameter (i.e. slower variants), to random communication breakdowns modeled as a stochastic event of a Bernoulli distribution. We choose three metrics, namely the deviation of the least precise final estimates from the average, the convergence rate expressed as the number of iterations for the consensus, and the deceleration of each initial setup, in order to evaluate the robustness of various initial setups of Best Constant weights under a varying failure probability and over 30 random geometric graphs of either a strong or a weak connectivity. Our contribution is to find the most robust initial setup of Best Constant weights according to numerical experiments executed in Matlab. Finally, the experimentally obtained results are discussed, compared to the results from the error-free executions, and our conclusions are compared with the conclusions from related papers
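For context, the Best Constant weight for average consensus (in the sense of Xiao and Boyd) uses the weight matrix W = I − αL with the mixing parameter α* = 2 / (λ_max(L) + λ_2(L)), where λ_max and λ_2 are the largest and the smallest positive eigenvalues of the graph Laplacian L. The sketch below computes this value for an illustrative graph; the "slower variants" mentioned above correspond to scaling α below α*.

```python
import numpy as np

def best_constant_alpha(L):
    """Optimal constant mixing parameter for W = I - alpha*L
    on a connected graph with Laplacian L."""
    eigs = np.sort(np.linalg.eigvalsh(L))
    # eigs[-1] = largest eigenvalue, eigs[1] = smallest positive eigenvalue
    return 2.0 / (eigs[-1] + eigs[1])

# Laplacian of a 4-node ring: degree 2 on the diagonal, -1 per edge.
L = np.array([[ 2, -1,  0, -1],
              [-1,  2, -1,  0],
              [ 0, -1,  2, -1],
              [-1,  0, -1,  2]], dtype=float)
alpha = best_constant_alpha(L)   # eigenvalues are 0, 2, 2, 4 -> alpha = 1/3
```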
A MESSAGE FAILURE ANALYSIS OF SYSTEMS EXECUTING AVERAGE CONSENSUS ALGORITHM
A communication failure is an aspect that may affect a whole system so significantly that it can no longer provide its functionality. In this paper, we implement the average consensus algorithm in 30 distributed systems and examine the effect of a message delivery failure modeled by a Bernoulli distribution. We vary the probability of a failure occurrence and examine the effect of these changes on the number of iterations necessary for a distributed system to achieve the consensus and on the deviation of the final values from the expected ones
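The experiment described above can be sketched as follows: each directed message independently fails with probability p, which makes the updates asymmetric, so the final common value can deviate from the true average. The weights, stopping rule, and topology are illustrative assumptions, not the paper's exact configuration.

```python
import random

def consensus_with_losses(adjacency, values, p, alpha=0.25,
                          tol=1e-6, max_iter=10000, rng=None):
    """Average consensus where each message is dropped with probability p.

    Returns the final values and the number of iterations until the spread
    of the values falls below tol (the stopping criterion).
    """
    rng = rng or random.Random(1)
    x = dict(values)
    for k in range(1, max_iter + 1):
        # Each neighbour's message is delivered independently with prob. 1-p.
        received = {
            i: [x[j] for j in adjacency[i] if rng.random() >= p]
            for i in adjacency
        }
        x = {
            i: xi + alpha * sum(xj - xi for xj in received[i])
            for i, xi in x.items()
        }
        if max(x.values()) - min(x.values()) < tol:
            return x, k
    return x, max_iter

# Example: 4-node ring, no losses; consensus is reached at the average 2.5.
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
final, iters = consensus_with_losses(ring, {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}, p=0.0)
```

With p = 0 the pairwise updates are symmetric, the sum of the values is conserved, and the nodes agree on the exact average; increasing p slows the convergence and shifts the agreed value away from it.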