Effective criteria for specific identifiability of tensors and forms
In applications where the tensor rank decomposition arises, one often relies
on its identifiability properties for interpreting the individual rank-1
terms appearing in the decomposition. Several criteria for identifiability have
been proposed in the literature; however, few results exist on how frequently
they are satisfied. We propose to call a criterion effective if it is satisfied
on a dense, open subset of the smallest semi-algebraic set enclosing the set of
rank-r tensors. We analyze the effectiveness of Kruskal's criterion when it
is combined with reshaping. It is proved that this criterion is effective for
both real and complex tensors in its entire range of applicability, which is
usually much smaller than the smallest typical rank. Our proof explains when
reshaping-based algorithms for computing tensor rank decompositions may be
expected to recover the decomposition. Specializing the analysis to symmetric
tensors or forms reveals that the reshaped Kruskal criterion may even be
effective up to the smallest typical rank for some third, fourth and sixth
order symmetric tensors of small dimension as well as for binary forms of
degree at least three. We extended this result to symmetric tensors by analyzing the Hilbert function, resulting in a
criterion for symmetric identifiability that is effective up to symmetric rank
, which is optimal.Comment: 31 pages, 2 Macaulay2 code
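Kruskal's criterion can be stated concretely: a rank-r decomposition of a third-order tensor with factor matrices A, B, C is unique if the k-ranks satisfy k_A + k_B + k_C >= 2r + 2. A minimal sketch of this check, assuming generic factor matrices (all names and tolerances are illustrative, not the paper's code):

```python
# Illustrative check of Kruskal's sufficient condition for identifiability
# of a third-order tensor rank decomposition with factors A, B, C.
from itertools import combinations

import numpy as np


def k_rank(A, tol=1e-10):
    """Largest k such that EVERY set of k columns of A is linearly independent."""
    n_cols = A.shape[1]
    for k in range(n_cols, 0, -1):
        if all(np.linalg.matrix_rank(A[:, list(cols)], tol=tol) == k
               for cols in combinations(range(n_cols), k)):
            return k
    return 0


def kruskal_identifiable(A, B, C):
    """Kruskal's condition: k_A + k_B + k_C >= 2r + 2, where r is the
    number of rank-1 terms (the common number of columns)."""
    r = A.shape[1]
    return k_rank(A) + k_rank(B) + k_rank(C) >= 2 * r + 2
```

For generic (e.g. random Gaussian) factor matrices each k-rank equals min(rows, columns), so the condition holds well within Kruskal's range of applicability; the abstract's point is that this range is usually much smaller than the smallest typical rank unless reshaping is used.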
Estimating the Number of Components in a Mixture of Multilayer Perceptrons
The BIC criterion is widely used by the neural-network community for model
selection tasks, although its convergence properties are not always
theoretically established. In this paper we focus on estimating the number
of components in a mixture of multilayer perceptrons and on proving the
convergence of the BIC criterion in this framework. The penalized
marginal-likelihood for mixture models and hidden Markov models introduced by
Keribin (2000) and Gassiat (2002), respectively, is extended to mixtures of
multilayer perceptrons, for which a penalized-likelihood criterion is proposed.
We prove its convergence under hypotheses which involve essentially the
bracketing entropy of the class of generalized score functions, and illustrate
it with some numerical examples.
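The selection rule itself is the standard one: fit candidate models with increasing numbers of components and keep the one minimizing BIC = -2 log L + p log n, where p counts free parameters and n is the sample size. A minimal sketch under that generic formula (the fit tuples below are hypothetical placeholders, not results from the paper):

```python
# Illustrative BIC-based choice of the number of mixture components.
# BIC = -2 * log-likelihood + (number of free parameters) * log(n_samples).
import math


def bic(log_likelihood, n_params, n_samples):
    """Bayesian Information Criterion; smaller is better."""
    return -2.0 * log_likelihood + n_params * math.log(n_samples)


def select_by_bic(fits, n_samples):
    """fits: iterable of (n_components, log_likelihood, n_params) tuples,
    one per fitted candidate model. Returns the n_components minimizing BIC."""
    best = min(fits, key=lambda f: bic(f[1], f[2], n_samples))
    return best[0]
```

The penalty p log n is what the paper's penalized-likelihood criterion generalizes to mixtures of multilayer perceptrons; the convergence proof concerns this penalized criterion, not the raw likelihood, which would always favor the largest model.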