4,989 research outputs found
A Search Strategy of Level-Based Flooding for the Internet of Things
This paper deals with the query problem in the Internet of Things (IoT).
Flooding is an important query strategy. However, original flooding is prone to
cause heavy network loads. To address this problem, we propose a variant of
flooding, called Level-Based Flooding (LBF). With LBF, the whole network is
divided into several levels according to the distances (i.e., hops) between the
sensor nodes and the sink node. The sink node knows the level information of
each node. Query packets are broadcast in the network according to the levels
of nodes. Upon receiving a query packet, sensor nodes decide how to process it
according to the percentage of neighbors that have processed it. When the
target node receives the query packet, it sends its data back to the sink node
via random walk. We show by extensive simulations that the performance of LBF
in terms of cost and latency is much better than that of original flooding, and
LBF can be used in IoT networks of different scales.
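The level assignment and forwarding rule described above can be sketched as follows. This is a minimal simulation, not the paper's implementation: the graph representation, the neighbor-percentage threshold, and the round-based broadcast order are illustrative assumptions, and the random-walk data return is omitted.

```python
from collections import deque

def assign_levels(adj, sink):
    """BFS from the sink: a node's level is its hop distance to the sink."""
    levels = {sink: 0}
    queue = deque([sink])
    while queue:
        node = queue.popleft()
        for nb in adj[node]:
            if nb not in levels:
                levels[nb] = levels[node] + 1
                queue.append(nb)
    return levels

def lbf_broadcast(adj, sink, threshold=0.5):
    """Flood a query outward by increasing level.  A node rebroadcasts only
    if, when it first receives the packet, fewer than `threshold` of its
    neighbors have already processed it; otherwise it stays silent."""
    levels = assign_levels(adj, sink)
    processed = {sink}        # nodes that have handled the packet
    rebroadcasters = [sink]   # nodes that actually retransmit this round
    transmissions = 0
    while rebroadcasters:
        nxt = []
        for node in rebroadcasters:
            transmissions += 1
            for nb in adj[node]:
                # only push the packet outward, to higher levels
                if nb in processed or levels[nb] <= levels[node]:
                    continue
                frac = sum(x in processed for x in adj[nb]) / len(adj[nb])
                processed.add(nb)
                if frac < threshold:
                    nxt.append(nb)
        rebroadcasters = nxt
    return transmissions, processed
```

With `threshold=1.0` this degenerates to ordinary flooding (every first-time receiver rebroadcasts); lowering the threshold trades coverage for fewer transmissions, which is the cost/latency trade-off the abstract evaluates.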
Introducing a Novel Minimum Accuracy Concept for Predictive Mobility Management Schemes
In this paper, an analytical model for the minimum required accuracy of predictive methods is derived in terms of both handover (HO) delay and HO signaling cost. After that, the total HO delay and signaling costs are derived for the worst-case scenario (when the predictive process performs the same as the conventional one), and simulations are conducted in a cellular environment to reveal the importance of the proposed minimum accuracy framework. In addition, three different predictors are implemented and compared: Markov chains, an Artificial Neural Network (ANN), and an Improved ANN (IANN). The results indicate that under certain circumstances the predictors can occasionally fall below the required accuracy level. Therefore, the proposed concept of minimum accuracy plays a vital role in determining this corresponding threshold.
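The break-even idea behind a minimum accuracy can be illustrated with a toy expected-cost model (my own simplification, not the paper's analytical derivation): if a correct prediction incurs cost `cost_hit` and a wrong one incurs `cost_miss`, the minimum accuracy is the point where the predictive scheme's expected cost equals the conventional scheme's cost.

```python
def minimum_accuracy(cost_conventional, cost_hit, cost_miss):
    """Smallest prediction accuracy p at which the expected cost of the
    predictive scheme, p*cost_hit + (1-p)*cost_miss, does not exceed the
    conventional scheme's cost.  Assumes cost_hit < cost_miss.

    Toy model for illustration only, not the paper's HO delay/signaling model.
    """
    if cost_miss <= cost_conventional:
        return 0.0  # prediction is never worse, so any accuracy suffices
    # Solve p*cost_hit + (1-p)*cost_miss <= cost_conventional for p.
    return (cost_miss - cost_conventional) / (cost_miss - cost_hit)
```

For example, with a conventional cost of 10, a hit cost of 4, and a miss cost of 20, the predictor must be right at least 62.5% of the time to pay off; below that threshold it underperforms the conventional scheme, which is the situation the abstract warns about.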
On the equivalence between the cell-based smoothed finite element method and the virtual element method
We revisit the cell-based smoothed finite element method (SFEM) for
quadrilateral elements and extend it to arbitrary polygons and polyhedrons in
2D and 3D, respectively. We highlight the similarity between the SFEM and the
virtual element method (VEM). Based on the VEM, we propose a new stabilization
approach to the SFEM when applied to arbitrary polygons and polyhedrons. The
accuracy and the convergence properties of the SFEM are studied with a few
benchmark problems in 2D and 3D linear elasticity. Later, the SFEM is combined
with the scaled boundary finite element method to solve problems involving a
singularity, within the framework of linear elastic fracture mechanics in 2D.
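For context, the key ingredient of the cell-based SFEM is the smoothed strain: the compatible strain is averaged over each smoothing cell and, via the divergence theorem, evaluated as a boundary integral. In the standard notation (assumed here, not taken from this abstract), with $\Omega_C$ a smoothing cell of area $A_C$, boundary $\Gamma_C$, outward normal $\mathbf{n}$, and displacement field $\mathbf{u}_h$:

```latex
\tilde{\boldsymbol{\varepsilon}}_h(\Omega_C)
  = \frac{1}{A_C} \int_{\Omega_C} \boldsymbol{\varepsilon}_h(\mathbf{x}) \,\mathrm{d}\Omega
  = \frac{1}{A_C} \oint_{\Gamma_C} \tfrac{1}{2}
    \left( \mathbf{n} \otimes \mathbf{u}_h + \mathbf{u}_h \otimes \mathbf{n} \right) \mathrm{d}\Gamma
```

Because only boundary integrals of the displacement are needed, the construction extends naturally from quadrilaterals to arbitrary polygons and polyhedra, which is what invites the comparison with the virtual element method.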
Evolving collective behavior in an artificial ecology
Collective behavior refers to coordinated group motion, common to many animals. The dynamics of a group can be seen as a distributed model, each “animal” applying the same rule set. This study investigates the use of evolved sensory controllers to produce schooling behavior. A set of artificial creatures “live” in an artificial world with hazards and food. Each creature has a simple artificial neural network brain that controls movement in different situations. A chromosome encodes the network structure and weights, which may be combined using artificial evolution with another chromosome, if a creature should choose to mate. Prey and predators coevolve without an explicit fitness function for schooling, producing sophisticated, nondeterministic behavior. The work highlights the role of a species' physiology in understanding behavior and the role of the environment in encouraging the development of sensory systems.
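The chromosome-level operations such a system relies on can be sketched as follows. This is a generic genetic-algorithm sketch, not the paper's encoding: the flat weight list, one-point crossover, and the mutation rate and noise scale are all illustrative assumptions.

```python
import random

def crossover(parent_a, parent_b):
    """One-point crossover of two flat weight chromosomes of equal length."""
    point = random.randrange(1, len(parent_a))
    return parent_a[:point] + parent_b[point:]

def mutate(chromosome, rate=0.05, scale=0.5):
    """Perturb each weight with probability `rate` by Gaussian noise."""
    return [w + random.gauss(0.0, scale) if random.random() < rate else w
            for w in chromosome]

# Mating, as in the abstract, happens only when two creatures choose to mate;
# here we simply produce one offspring chromosome from two parent chromosomes.
child = mutate(crossover([0.1] * 8, [0.9] * 8))
```

Note there is no explicit fitness function in the abstract's setup: selection is implicit in which creatures survive the hazards long enough to mate, so schooling emerges rather than being rewarded directly.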
Faster unfolding of communities: speeding up the Louvain algorithm
Many complex networks exhibit a modular structure of densely connected groups
of nodes. Usually, such a modular structure is uncovered by the optimization of
some quality function. Although flawed, modularity remains one of the most
popular quality functions. The Louvain algorithm was originally developed for
optimizing modularity, but has been applied to a variety of other methods. As
such, speeding up the Louvain algorithm enables the analysis of larger graphs
in a shorter time for all of these methods. Here we suggest moving nodes to a
random neighbor community instead of the best neighbor community. Although
incredibly simple, it reduces the theoretical runtime complexity from $O(m)$
to $O(n \log \langle k \rangle)$ in networks with a
clear community structure. In benchmark networks, it speeds up the algorithm
roughly 2-3 times, while in some real networks it even reaches 10 times faster
runtimes. This improvement is due to two factors: (1) a random neighbor is
likely to be in a "good" community; and (2) random neighbors are likely to be
hubs, helping the convergence. Finally, the performance gain only slightly
diminishes the quality, especially for modularity, thus providing a good
quality-performance ratio. However, these gains are less pronounced, or even
disappear, for some other measures such as significance or surprise.
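The random-neighbor variant of the Louvain local-move step can be sketched as follows. This is a simplified sketch, not the authors' code: only the move rule is shown (not the full multi-level algorithm), and `gain` is a caller-supplied function standing in for the modularity-gain computation.

```python
import random

def random_neighbor_move(adj, community, node, gain):
    """Louvain local move, random-neighbor style: instead of scanning all
    neighboring communities for the best modularity gain, pick the community
    of ONE random neighbor and move there only if the gain is positive.

    adj:       node -> list of neighbor nodes
    community: node -> community label (mutated in place on a move)
    gain:      (node, target_community) -> float, the quality gain of moving
    """
    neighbor = random.choice(list(adj[node]))
    target = community[neighbor]
    if target != community[node] and gain(node, target) > 0:
        community[node] = target
        return True
    return False
```

The two factors cited in the abstract explain why this works well: a uniformly random neighbor tends to be a hub, and a hub's community is likely a "good" destination, so one cheap draw usually suffices where the original algorithm scanned every neighboring community.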