Stability conditions for a decentralised medium access algorithm: single- and multi-hop networks
We consider a decentralised multi-access algorithm, motivated primarily by
the control of transmissions in a wireless network. For a finite single-hop
network with arbitrary interference constraints we prove stochastic stability
under the natural conditions. For infinite and finite single-hop networks, we
obtain broad rate-stability conditions. We also consider symmetric finite
multi-hop networks and show that the natural condition is sufficient for
stochastic stability.
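The kind of stability question the abstract studies can be illustrated with a toy simulation; everything below (two mutually interfering links, attempt probability, arrival rate) is an illustrative assumption, not the paper's model:

```python
import random

# Toy sketch: two links that interfere with each other share one channel.
# Each slot, a link with a nonempty queue attempts transmission with
# probability p; the attempt succeeds only if the other link stays silent
# (the interference constraint). Arrivals are Bernoulli with rate lam.
random.seed(0)
p, lam = 0.5, 0.2          # illustrative attempt probability and arrival rate
q = [0, 0]                 # queue lengths at the two links
for _ in range(100_000):
    arrivals = [random.random() < lam for _ in range(2)]
    attempts = [q[i] > 0 and random.random() < p for i in range(2)]
    for i in range(2):
        if attempts[i] and not attempts[1 - i]:
            q[i] -= 1      # successful, interference-free transmission
        if arrivals[i]:
            q[i] += 1
print(q)
```

When both queues are backlogged, each link succeeds with probability p(1 - p) = 0.25 per slot, so with lam = 0.2 the arrival rate sits inside the service capacity and the queues remain bounded; raising lam above 0.25 would make them grow without bound.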
Reconstructing dynamical networks via feature ranking
Empirical data on real complex systems are becoming increasingly available.
Parallel to this is the need for new methods of reconstructing (inferring) the
topology of networks from time-resolved observations of their node-dynamics.
The methods based on physical insights often rely on strong assumptions about
the properties and dynamics of the scrutinized network. Here, we use the
insights from machine learning to design a new method of network reconstruction
that essentially makes no such assumptions. Specifically, we interpret the
available trajectories (data) as features, and use two independent feature
ranking approaches -- Random forest and RReliefF -- to rank the importance of
each node for predicting the value of each other node, which yields the
reconstructed adjacency matrix. We show that our method is fairly robust to
coupling strength, system size, trajectory length and noise. We also find that
the reconstruction quality strongly depends on the dynamical regime.
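The feature-ranking idea can be sketched with a random-forest regressor on a toy simulated network; the dynamics, network size, and coefficients below are invented for illustration and are not the paper's setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Toy sketch: simulate a 4-node directed ring with known adjacency A, where
# A[j, k] = 1 means node k drives node j. Then, for each target node, rank
# all nodes by their importance for predicting the target's next value.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]])
T, n = 2000, 4
X = np.zeros((T, n))
X[0] = rng.normal(size=n)
for t in range(T - 1):     # linear stochastic dynamics (illustrative choice)
    X[t + 1] = 0.5 * X[t] + 0.4 * (A @ X[t]) + 0.1 * rng.normal(size=n)

scores = np.zeros((n, n))
for j in range(n):         # one regression per target node
    rf = RandomForestRegressor(n_estimators=50, random_state=0)
    rf.fit(X[:-1], X[1:, j])
    scores[j] = rf.feature_importances_

# Discard the (trivially strong) self-prediction and read off, for each
# node, its strongest-ranked predictor as the reconstructed in-neighbor.
np.fill_diagonal(scores, 0)
parents = scores.argmax(axis=1)
print(parents)
```

On this toy system the top-ranked predictor of each node recovers its true in-neighbor in the ring; the paper's method additionally uses RReliefF as a second, independent ranking and builds the full reconstructed adjacency matrix from the scores.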
MIHash: Online Hashing with Mutual Information
Learning-based hashing methods are widely used for nearest neighbor
retrieval, and recently, online hashing methods have demonstrated good
performance-complexity trade-offs by learning hash functions from streaming
data. In this paper, we first address a key challenge for online hashing: the
binary codes for indexed data must be recomputed to keep pace with updates to
the hash functions. We propose an efficient quality measure for hash functions,
based on an information-theoretic quantity, mutual information, and use it
successfully as a criterion to eliminate unnecessary hash table updates. Next,
we also show how to optimize the mutual information objective using stochastic
gradient descent. We thus develop a novel hashing method, MIHash, that can be
used in both online and batch settings. Experiments on image retrieval
benchmarks (including a 2.5M image dataset) confirm the effectiveness of our
formulation, both in reducing hash table recomputations and in learning
high-quality hash functions.

Comment: International Conference on Computer Vision (ICCV), 2017
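A mutual-information quality measure of this flavor can be sketched as the information that a query's Hamming-distance histogram carries about neighbor membership; the neighbor model, code length, and sample sizes below are invented for illustration and this is not the paper's exact objective:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def mi_quality(query_code, codes, is_neighbor):
    """I(neighbor label; Hamming distance to the query), estimated from
    histograms. Higher is better: the codes separate neighbors from
    non-neighbors more cleanly."""
    d = (codes != query_code).sum(axis=1)          # Hamming distances
    bins = np.arange(codes.shape[1] + 2)           # one bin per distance value
    h_all, _ = np.histogram(d, bins=bins)
    h_pos, _ = np.histogram(d[is_neighbor], bins=bins)
    h_neg, _ = np.histogram(d[~is_neighbor], bins=bins)
    p = is_neighbor.mean()
    return (entropy(h_all / h_all.sum())
            - p * entropy(h_pos / h_pos.sum())
            - (1 - p) * entropy(h_neg / h_neg.sum()))

rng = np.random.default_rng(0)
b = 16                                             # illustrative code length
query = rng.integers(0, 2, b)
# "Good" codes: neighbors differ from the query in few bits.
pos = np.array([np.where(rng.random(b) < 0.1, 1 - query, query)
                for _ in range(50)])
neg = rng.integers(0, 2, (200, b))
labels = np.array([True] * 50 + [False] * 200)
good = mi_quality(query, np.vstack([pos, neg]), labels)
# "Bad" codes: all codes random, so distance carries no label information.
bad = mi_quality(query, rng.integers(0, 2, (250, b)), labels)
print(round(good, 3), round(bad, 3))
```

The measure is high when neighbor and non-neighbor distance histograms are well separated and near zero when they coincide, which is what makes it usable both as an update-skipping criterion and (via a differentiable relaxation) as a learning objective.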