Mixture of experts models for multilevel data: modelling framework and approximation theory
Multilevel data are prevalent in many real-world applications. However, it
remains an open research problem to identify and justify a class of models that
flexibly capture a wide range of multilevel data. Motivated by the versatility
of the mixture of experts (MoE) models in fitting regression data, in this
article we extend the MoE and study a class of mixed MoE (MMoE) models for
multilevel data. Under some regularity conditions, we prove that the MMoE is
dense in the space of continuous mixed effects models in the sense of weak
convergence. As a result, the MMoE has the potential to accurately capture
almost all characteristics inherent in multilevel data, including the marginal
distributions, dependence structures, regression links, random intercepts and
random slopes. In the particular case where the multilevel data are hierarchical,
we further show that a nested version of the MMoE universally approximates a
broad range of dependence structures of the random effects among different
factor levels
- …
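The abstract does not give the model equations, but the mixture of experts construction it builds on is standard: covariate-dependent softmax gating weights mix a set of expert regression densities. A minimal sketch of a Gaussian-expert MoE conditional density, with all parameter names (`gate_coefs`, `betas`, `sigmas`) hypothetical and not drawn from the paper:

```python
import numpy as np

def gating_weights(x, gate_coefs):
    """Softmax gating: probability of selecting each expert given covariates x.

    gate_coefs has shape (K, d) for K experts and d covariates.
    """
    logits = gate_coefs @ x
    logits -= logits.max()            # subtract max for numerical stability
    w = np.exp(logits)
    return w / w.sum()

def moe_density(y, x, gate_coefs, betas, sigmas):
    """Conditional density f(y | x) of a K-expert Gaussian MoE.

    Each expert k is a linear regression y ~ N(betas[k] @ x, sigmas[k]^2);
    the gating network mixes the expert densities.
    """
    w = gating_weights(x, gate_coefs)
    means = betas @ x                 # per-expert regression means, shape (K,)
    norm = np.exp(-0.5 * ((y - means) / sigmas) ** 2) / (sigmas * np.sqrt(2 * np.pi))
    return float(w @ norm)
```

The mixed MoE of the paper would additionally let the gating and/or expert parameters depend on random effects shared within a factor level; the sketch above covers only the fixed-effects MoE layer.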