Mixture models arise in a wide variety of statistical applications. However, statistical inference in mixture models, especially non-parametric mixture models, can be challenging. A general, or non-parametric, mixture model effectively has an infinite-dimensional parameter space. In frequentist statistics, the maximum likelihood estimator with an infinite-dimensional parameter may not be consistent or efficient, in the sense that the Cramér-Rao bound is not attained even asymptotically. In Bayesian statistics, a prior on an infinite-dimensional space can be difficult to specify and may remain highly informative even with large amounts of data.
In this thesis, we mainly consider mixture and mixed-effects models in which the mixing distribution is non-parametric. Following the dimensionality reduction idea of [Marriott, 2002], we propose a reparameterization-approximation framework based on a complete orthonormal basis of a Hilbert space. The parameters of the reparameterized models are interpreted as the generalized moments of the mixing distribution. We consider different orthonormal bases, including families of orthogonal polynomials and the eigenfunctions of positive self-adjoint integral operators. We also study the approximation errors of the truncated reparameterized models in some special cases.
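To make the idea concrete, here is a schematic under illustrative notation (the basis functions $\varphi_j$, coefficients $a_j$, and moments $m_j$ are our labels, not necessarily those used in the thesis body): suppose the component density admits the expansion
\[
f(x \mid \theta) \;=\; \sum_{j \ge 0} a_j(x)\,\varphi_j(\theta)
\]
in a complete orthonormal basis $\{\varphi_j\}$ of the Hilbert space. The mixture density under a mixing distribution $Q$ then becomes
\[
f_Q(x) \;=\; \int f(x \mid \theta)\, dQ(\theta) \;=\; \sum_{j \ge 0} a_j(x)\, m_j(Q),
\qquad
m_j(Q) \;=\; \int \varphi_j(\theta)\, dQ(\theta),
\]
so that $Q$ enters the model only through its generalized moments $m_j(Q)$, and truncating the sum at $j = K$ yields a finite-dimensional approximation.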
The generalized moments in the truncated reparameterized models have a natural parameter space, called the generalized moment space. We study the geometry of this space and establish two important properties: the positive representation and the gradient characterization. The positive representation reveals when the mixing distribution is identifiable from its generalized moments and provides an upper bound on the number of support points of the mixing distribution. The gradient characterization, in turn, provides the foundation for the class of gradient-based algorithms whose feasible set is the generalized moment space.
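Under the same illustrative notation, the generalized moment space of order $K$ can be written as
\[
\mathcal{M}_K \;=\; \bigl\{ \bigl(m_1(Q), \ldots, m_K(Q)\bigr) \;:\; Q \text{ a probability measure on } \Theta \bigr\},
\]
and the two properties concern this set: the positive representation says, roughly, that every point of $\mathcal{M}_K$ is attained by a discrete mixing distribution with a bounded number of support points, while the gradient characterization describes the feasible directions over $\mathcal{M}_K$ that gradient-based algorithms exploit.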
Next, we fit a non-parametric mixture model using a set of generalized moment conditions derived from the proposed reparameterization-approximation procedure. We propose a new estimation method, the generalized method of moments (GMM) for mixture models, which minimizes a quadratic objective function over the generalized moment space. The proposed estimators can be computed efficiently with gradient-based algorithms. We establish the convergence rate of their mean squared error as the sample size goes to infinity. Moreover, we design the quadratic objective function so that the proposed estimators are robust to outliers. Compared with existing estimation methods for mixture models, the GMM for mixture models is computationally simpler and more robust to outliers.
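A minimal sketch of the estimator's form, assuming (in our notation, not necessarily the thesis's) that $\hat m$ collects empirical counterparts of the generalized moments and $W$ is a positive semi-definite weight matrix:
\[
\hat m_{\mathrm{GMM}} \;=\; \operatorname*{arg\,min}_{m \in \mathcal{M}_K} \;(m - \hat m)^{\top} W\, (m - \hat m),
\]
that is, a quadratic objective minimized over the generalized moment space; the particular choices of $\hat m$ and $W$ made in the thesis are what yield the stated robustness to outliers.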
Lastly, we consider testing hypotheses about the regression parameter in a mixed-effects model with univariate random effects. Through the proposed procedures, we obtain a series of estimating equations parameterized by the regression parameter and the generalized moments of the random-effects distribution. These parameters are estimated within the framework of the generalized method of moments. When the number of generalized moments diverges with the sample size while the dimension of the regression parameter is fixed, we derive the convergence rate of the generalized method of moments estimators for mixed-effects models with univariate random effects. Since the regularity conditions of [Wilks, 1938] fail in our setting, it is challenging to construct an asymptotically χ² test statistic. We propose ensemble inference, in which an asymptotically χ² test statistic is constructed from a series of estimators obtained from generalized estimating equations with different working correlation matrices.
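Schematically, and only as an assumed form of the construction (the exact statistic is defined in the relevant chapter): if the working correlation matrices $R_1, \ldots, R_B$ yield estimators $\hat\beta_1, \ldots, \hat\beta_B$ of the regression parameter, an ensemble statistic of Wald type,
\[
T \;=\; \bigl(\hat\beta_1 - \beta_0, \ldots, \hat\beta_B - \beta_0\bigr)^{\top}\, \widehat\Sigma^{-1}\, \bigl(\hat\beta_1 - \beta_0, \ldots, \hat\beta_B - \beta_0\bigr),
\]
with $\widehat\Sigma$ an estimate of the joint covariance of the stacked estimators, is asymptotically χ² under the null hypothesis $\beta = \beta_0$.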