Infinite Multiple Membership Relational Modeling for Complex Networks
Learning latent structure in complex networks has become an important problem
fueled by many types of networked data originating from practically all fields
of science. In this paper, we propose a new non-parametric Bayesian
multiple-membership latent feature model for networks. In contrast to existing
multiple-membership models, which scale quadratically in the number of vertices,
the proposed model scales linearly in the number of links, admitting
multiple-membership analysis of large-scale networks. We demonstrate a
connection between the single membership relational model and multiple
membership models, and show on "real" size benchmark network data that
accounting for multiple memberships improves the learning of latent structure,
as measured by link prediction, while explicitly accounting for multiple
memberships results in a more compact representation of the latent structure of
networks.
Comment: 8 pages, 4 figures
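The multiple-membership idea above can be illustrated with a generic latent feature relational model: each vertex holds a binary vector of features (several may be active at once), and the probability of a link depends on the interaction of the two endpoints' features. This is a minimal sketch, not the paper's exact model; the logistic link function and the weight matrix W are illustrative assumptions.

```python
import numpy as np

def link_probabilities(Z, W):
    """Link probabilities under a generic latent feature relational model.

    Z : (N, K) binary matrix; Z[i, k] = 1 if vertex i holds feature k
        (rows may have several 1s, i.e. multiple memberships).
    W : (K, K) real matrix of feature-feature interaction weights
        (an illustrative assumption, not the paper's parameterization).
    Returns the (N, N) matrix sigma(z_i^T W z_j) of link probabilities.
    """
    logits = Z @ W @ Z.T
    return 1.0 / (1.0 + np.exp(-logits))

rng = np.random.default_rng(0)
Z = rng.integers(0, 2, size=(5, 3))   # 5 vertices, 3 latent features
W = rng.normal(size=(3, 3))
P = link_probabilities(Z, W)
assert P.shape == (5, 5) and np.all((P > 0) & (P < 1))
```

With a single active feature per row, Z reduces to a one-hot assignment and the model collapses to a single-membership blockmodel, which is the connection the abstract alludes to.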
Non-parametric Bayesian modeling of complex networks
Modeling structure in complex networks using Bayesian non-parametrics makes
it possible to specify flexible model structures and infer the adequate model
complexity from the observed data. This paper provides a gentle introduction to
non-parametric Bayesian modeling of complex networks: Using an infinite mixture
model as running example we go through the steps of deriving the model as an
infinite limit of a finite parametric model, inferring the model parameters by
Markov chain Monte Carlo, and checking the model's fit and predictive
performance. We explain how advanced non-parametric models for complex networks
can be derived and point out relevant literature.
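The infinite mixture model used as the running example rests on a prior over partitions with an unbounded number of clusters, obtained as the infinite limit of a finite mixture. A standard representation of that prior is the Chinese restaurant process, sketched below; the parameter names are illustrative.

```python
import random

def sample_crp(n, alpha, seed=0):
    """Draw a partition of n items from the Chinese restaurant process.

    An existing cluster k is chosen with probability proportional to its
    current size; a new cluster is opened with probability proportional
    to the concentration parameter alpha. The number of clusters is not
    fixed in advance, which is what lets the data determine the model
    complexity.
    """
    rng = random.Random(seed)
    counts = []                        # current cluster sizes
    labels = []                        # cluster label per item
    for _ in range(n):
        weights = counts + [alpha]     # old clusters + new-cluster mass
        k = rng.choices(range(len(weights)), weights=weights)[0]
        if k == len(counts):
            counts.append(1)           # open a new cluster
        else:
            counts[k] += 1
        labels.append(k)
    return labels

z = sample_crp(100, alpha=2.0)
assert len(z) == 100 and z[0] == 0
```

In the tutorial setting described above, draws like `z` play the role of vertex-to-cluster assignments, and Markov chain Monte Carlo resamples them conditioned on the observed links.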
Mixed membership stochastic blockmodels
Observations consisting of measurements on relationships for pairs of objects
arise in many settings, such as protein interaction and gene regulatory
networks, collections of author-recipient email, and social networks. Analyzing
such data with probabilistic models can be delicate because the simple
exchangeability assumptions underlying many boilerplate models no longer hold.
In this paper, we describe a latent variable model of such data called the
mixed membership stochastic blockmodel. This model extends blockmodels for
relational data to ones which capture mixed membership latent relational
structure, thus providing an object-specific low-dimensional representation. We
develop a general variational inference algorithm for fast approximate
posterior inference. We explore applications to social and protein interaction
networks.
Comment: 46 pages, 14 figures, 3 tables
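The generative process of the mixed membership stochastic blockmodel described above can be sketched directly: each node draws a Dirichlet membership vector over K roles, and each directed pair picks a sender role and a receiver role whose block rate governs the edge. The concrete values of K, alpha, and the block matrix B below are illustrative.

```python
import numpy as np

def sample_mmsb(N, K, alpha, B, seed=0):
    """Sample a directed adjacency matrix from the mixed membership
    stochastic blockmodel.

    Each node i draws a membership vector pi_i ~ Dirichlet(alpha).
    For each ordered pair (i, j), node i picks a sender role
    z ~ Categorical(pi_i), node j picks a receiver role
    w ~ Categorical(pi_j), and the edge is Bernoulli with rate B[z, w].
    """
    rng = np.random.default_rng(seed)
    pi = rng.dirichlet(alpha * np.ones(K), size=N)   # (N, K) memberships
    A = np.zeros((N, N), dtype=int)
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            z = rng.choice(K, p=pi[i])               # sender role
            w = rng.choice(K, p=pi[j])               # receiver role
            A[i, j] = rng.random() < B[z, w]
    return A, pi

B = np.array([[0.9, 0.05],
              [0.05, 0.8]])                          # assortative blocks
A, pi = sample_mmsb(N=20, K=2, alpha=0.5, B=B)
assert A.shape == (20, 20) and np.all(np.diag(A) == 0)
```

The membership vectors `pi` are the object-specific low-dimensional representations the abstract refers to; variational inference approximates their posterior given an observed adjacency matrix.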