Riesz measures and Wishart laws associated to quadratic maps
We introduce a natural definition of Riesz measures and Wishart laws
associated to an Ω-positive (virtual) quadratic map, where Ω is a regular
open convex cone. We give a general formula for moments of the Wishart laws.
Moreover, if the quadratic map has an equivariance property under the action
of a linear group acting transitively on the cone Ω, then the associated
Riesz measure and Wishart law are described explicitly by making use of the
theory of relatively invariant distributions on homogeneous cones.
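In the classical special case of the cone of positive definite matrices, the moment formula specializes to the familiar first moment E[W] = n·Σ of the standard Wishart distribution. A quick numerical sanity check of that classical case (the symbols n, Σ and all parameter values below are our own choices, not from the paper):

```python
import numpy as np
from scipy.stats import wishart

# Classical Wishart law on the cone of 2x2 positive definite matrices:
# for W ~ Wishart(n, Sigma), the first moment is E[W] = n * Sigma.
n = 5.0
Sigma = np.array([[2.0, 1.0],
                  [1.0, 3.0]])

samples = wishart.rvs(df=n, scale=Sigma, size=20000, random_state=0)
empirical_mean = samples.mean(axis=0)

print(np.round(empirical_mean, 2))               # should be close to n * Sigma
print(np.allclose(empirical_mean, n * Sigma, atol=0.5))
```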
Wishart laws and variance function on homogeneous cones
We present a systematic study of Riesz measures and their natural exponential
families of Wishart laws on a homogeneous cone. We compute explicitly the
inverse of the mean map and the variance function of a Wishart exponential
family.
Comment: 24 pages, Probab. Math. Statist. (2019)
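As a one-dimensional sanity check (our notation, not the paper's): on the simplest homogeneous cone Ω = (0, ∞), the Wishart family reduces to the gamma family, and the cumulant function, inverse mean map, and variance function can all be written in closed form:

```latex
% Simplest homogeneous cone \Omega = (0,\infty): the 1x1 Wishart (gamma) family
% with Riesz measure r_p(dx) = x^{p-1}\,dx / \Gamma(p), p > 0.
\psi(\theta) = \log \int_0^\infty e^{\theta x}\,\frac{x^{p-1}}{\Gamma(p)}\,dx
             = -p \log(-\theta), \qquad \theta < 0,
\\[4pt]
m(\theta) = \psi'(\theta) = -\frac{p}{\theta}, \qquad
\theta(m) = -\frac{p}{m} \quad \text{(inverse of the mean map)},
\\[4pt]
V(m) = \psi''\bigl(\theta(m)\bigr) = \frac{p}{\theta(m)^{2}} = \frac{m^{2}}{p}.
```

The quadratic variance function V(m) = m²/p is the rank-one instance of the general variance functions computed in the paper.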
Joint Group Invariant Functions on Data-Parameter Domain Induce Universal Neural Networks
The symmetry and geometry of the input data are considered to be encoded in
the internal representation inside the neural network, but the specific
encoding rule has received little investigation. In this study, we present a
systematic method to induce a generalized neural network and its right inverse
operator, called the ridgelet transform, from a joint group invariant function
on the data-parameter domain. Since the ridgelet transform is an inverse, (1)
it can describe the arrangement of parameters for the network to represent a
target function, which is understood as the encoding rule, and (2) it implies
the universality of the network. Based on group representation theory, we
present a new, simple proof of universality using Schur's lemma, covering in
a unified manner a wide class of networks, for example the original
ridgelet transform, formal deep networks, and the dual voice transform. Since
traditional universality theorems were demonstrated based on functional
analysis, this study sheds light on the group theoretic aspect of the
approximation theory, connecting geometric deep learning to abstract harmonic
analysis.
Comment: NeurReps 2023
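The universality claim can be illustrated numerically. The sketch below is a generic random-feature demonstration, not the paper's ridgelet construction: a shallow ReLU network with randomly drawn inner parameters and least-squares outer weights approximates a smooth target (all names and values are our own choices):

```python
import numpy as np

# Shallow network f(x) = sum_j c_j * relu(a_j * x - b_j) with random inner
# parameters (a_j, b_j) and outer weights c_j fitted by least squares.
rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 400)
target = np.sin(x)                          # smooth target to approximate

J = 200                                     # number of hidden units
a = rng.normal(0.0, 2.0, size=J)            # inner weights (our choice)
b = rng.uniform(-6.0, 6.0, size=J)          # biases (our choice)
features = np.maximum(a[None, :] * x[:, None] - b[None, :], 0.0)  # ReLU

# Fit outer weights c so that features @ c ~= target.
c, *_ = np.linalg.lstsq(features, target, rcond=None)
err = np.max(np.abs(features @ c - target))
print(err)                                  # small approximation error
```

With enough random features the approximation error becomes small, which is the discretized form of the integral (ridgelet) representation of the target function.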
Model selection in the space of Gaussian models invariant by symmetry
We consider multivariate centred Gaussian models for the random variable
Z = (Z_1, ..., Z_p), invariant under the action of a subgroup of the group of
permutations on {1, ..., p}. Using the representation theory of the
symmetric group on the field of reals, we derive the distribution of the
maximum likelihood estimate of the covariance parameter Σ and also the
analytic expression of the normalizing constant of the Diaconis-Ylvisaker
conjugate prior for the precision parameter K = Σ^{-1}. We can thus
perform Bayesian model selection in the class of complete Gaussian models
invariant by the action of a subgroup of the symmetric group, which we could
also call complete RCOP models. We illustrate our results with a toy example of
dimension 4 and several examples for selection within cyclic groups,
including a high dimensional example with p = 100.
Comment: 34 pages, 4 figures, 5 tables
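For cyclic-group invariance, the constrained MLE of the covariance has a concrete form: it is the average of the sample covariance over the group orbit, i.e. the projection onto circulant matrices (a standard fact for permutation-invariant Gaussian models; the code and names below are an illustrative sketch, not the paper's implementation):

```python
import numpy as np

# Centred Gaussian model invariant under the cyclic group C_p acting by
# coordinate shifts: the MLE of the covariance is the average of the sample
# covariance S over all conjugations P_k S P_k^T, which is circulant.
rng = np.random.default_rng(0)
p, n = 4, 500
X = rng.normal(size=(n, p))
S = X.T @ X / n                              # sample covariance (centred model)

def cyclic_mle(S):
    """Average S over all cyclic shifts P_k S P_k^T (projection onto circulants)."""
    p = S.shape[0]
    return sum(np.roll(np.roll(S, k, axis=0), k, axis=1) for k in range(p)) / p

Sigma_hat = cyclic_mle(S)
# The result is circulant: each row is a cyclic shift of the previous one.
print(np.allclose(Sigma_hat[1], np.roll(Sigma_hat[0], 1)))   # True
```

Averaging works because the invariant log-likelihood depends on the data only through the orbit-averaged sample covariance, so the constrained maximizer coincides with that average whenever it is positive definite.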