Multilinear tensor regression for longitudinal relational data
A fundamental aspect of relational data, such as data from a social network, is
the possibility of dependence among the relations. In particular, the relations
between members of one pair of nodes may have an effect on the relations
between members of another pair. This article develops a type of regression
model to estimate such effects in the context of longitudinal and multivariate
relational data, or other data that can be represented in the form of a tensor.
The model is based on a general multilinear tensor regression model, a special
case of which is a tensor autoregression model in which the tensor of relations
at one time point is parsimoniously regressed on relations from previous time
points. This is done via a separable, or Kronecker-structured, regression
parameter along with a separable covariance model. In the context of an
analysis of longitudinal multivariate relational data, it is shown how the
multilinear tensor regression model can represent patterns that often appear in
relational and network data, such as reciprocity and transitivity.

Comment: Published at http://dx.doi.org/10.1214/15-AOAS839 in the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
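The separable, Kronecker-structured regression described above amounts to multiplying the lagged relational tensor by a coefficient matrix along each mode, i.e. a Tucker-style multilinear product. A minimal numpy sketch of one autoregression step, not the authors' implementation; the function name `tucker_product` and all dimensions here are illustrative assumptions:

```python
import numpy as np

def tucker_product(X, mats):
    """Multiply tensor X by a matrix along each mode (the multilinear product)."""
    Y = X
    for mode, B in enumerate(mats):
        # contract B's columns with Y's `mode` axis, then restore the axis order
        Y = np.moveaxis(np.tensordot(B, Y, axes=(1, mode)), 0, mode)
    return Y

rng = np.random.default_rng(0)
n1, n2, n3 = 4, 5, 3
X_prev = rng.normal(size=(n1, n2, n3))                     # relations at time t-1
B = [0.1 * rng.normal(size=(n, n)) for n in (n1, n2, n3)]  # mode-specific coefficients
E = 0.01 * rng.normal(size=(n1, n2, n3))                   # noise (iid here for simplicity)
Y_t = tucker_product(X_prev, B) + E                        # one tensor-autoregression step
```

A quick sanity check of the product: with identity coefficient matrices it returns the input tensor unchanged.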
Modeling homophily and stochastic equivalence in symmetric relational data
This article discusses a latent variable model for inference and prediction
of symmetric relational data.
The model, based on the idea of the eigenvalue decomposition, represents the
relationship between two nodes as the weighted inner-product of node-specific
vectors of latent characteristics. This "eigenmodel" generalizes other
popular latent variable models, such as latent class and distance models: It is
shown mathematically that any latent class or distance model has a
representation as an eigenmodel, but not vice versa. The practical implications
of this are examined in the context of three real datasets, for which the
eigenmodel has out-of-sample predictive performance as good as or better than
that of the other two models.

Comment: 12 pages, 4 figures, 1 table
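The eigenmodel's weighted inner-product structure can be illustrated in a few lines of numpy: the relation between nodes i and j is u_i' Λ u_j, and a rank-r fit can be read off the eigendecomposition of the observed symmetric matrix. This is a hedged sketch with made-up dimensions, not the paper's estimation procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 30, 2                                 # nodes, latent dimension (illustrative)
U = rng.normal(size=(n, r))                  # node-specific latent characteristics
lam = np.array([2.0, -1.0])                  # eigenvalue weights; mixed signs let the
                                             # model capture more than pure homophily
Y = U @ np.diag(lam) @ U.T                   # symmetric matrix of weighted inner products

# A rank-r "eigenmodel" fit from the observed matrix: keep the r
# largest-magnitude eigenvalues of the symmetric eigendecomposition.
w, V = np.linalg.eigh(Y)
idx = np.argsort(np.abs(w))[::-1][:r]
Y_hat = V[:, idx] @ np.diag(w[idx]) @ V[:, idx].T
```

Here the observed matrix is exactly rank 2, so the fit recovers it; on real relational data the truncated eigendecomposition serves as a low-rank approximation.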
Adaptive Higher-order Spectral Estimators
Many applications involve estimation of a signal matrix from a noisy data
matrix. In such cases, it has been observed that estimators that shrink or
truncate the singular values of the data matrix perform well when the signal
matrix has approximately low rank. In this article, we generalize this approach
to the estimation of a tensor of parameters from noisy tensor data. We develop
new classes of estimators that shrink or threshold the mode-specific singular
values from the higher-order singular value decomposition. These classes of
estimators are indexed by tuning parameters, which we adaptively choose from
the data by minimizing Stein's unbiased risk estimate. In particular, this
procedure provides a way to estimate the multilinear rank of the underlying
signal tensor. Using simulation studies under a variety of conditions, we show
that our estimators perform well when the mean tensor has approximately low
multilinear rank, and perform competitively when the signal tensor does not
have approximately low multilinear rank. We illustrate the use of these methods
in an application to multivariate relational data.

Comment: 29 pages, 3 figures
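One concrete instance of mode-specific shrinkage is hard truncation of the higher-order SVD to a chosen multilinear rank; the paper's estimators are more general and choose their tuning parameters by minimizing Stein's unbiased risk estimate, whereas this sketch simply fixes the rank. All names and dimensions are illustrative:

```python
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along `mode` (the mode-k unfolding)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd_truncate(Y, ranks):
    """Project Y onto the leading r_k left singular vectors of each mode unfolding."""
    X = Y
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(Y, mode), full_matrices=False)
        P = U[:, :r] @ U[:, :r].T                        # mode-k projection matrix
        X = np.moveaxis(np.tensordot(P, X, axes=(1, mode)), 0, mode)
    return X

rng = np.random.default_rng(2)
core = rng.normal(size=(2, 2, 2))                        # multilinear rank (2, 2, 2) signal
A = [rng.normal(size=(n, 2)) for n in (6, 7, 8)]
signal = np.einsum('abc,ia,jb,kc->ijk', core, A[0], A[1], A[2])
Y = signal + 0.01 * rng.normal(size=signal.shape)        # noisy observed tensor
est = hosvd_truncate(Y, (2, 2, 2))                       # denoised estimate
```

Because the estimate is confined to low-dimensional mode subspaces, most of the noise is projected out when the signal tensor truly has low multilinear rank.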