    Enhancing the selection of a model-based clustering with external qualitative variables

    In cluster analysis, it can be useful to interpret the partition built from the data in the light of external categorical variables which were not directly involved in clustering the data. An approach is proposed, in the model-based clustering context, to select a model and a number of clusters which both fit the data well and take advantage of the potential illustrative ability of the external variables. This approach makes use of the integrated joint likelihood of the data and the partitions at hand, namely the model-based partition and the partitions associated with the external variables. It is noteworthy that each mixture model is fitted by maximum likelihood to the data alone; the external variables are used only to select a relevant mixture model. Numerical experiments illustrate the promising behaviour of the derived criterion.
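
    A minimal sketch of the general idea follows: fit mixture models to the data only, then inspect, for each candidate number of clusters, both a fit criterion and the agreement between the resulting partition and an external categorical variable. This uses plain BIC and the adjusted Rand index as a rough stand-in for the paper's integrated joint likelihood criterion; the data and the variable `external` are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.metrics import adjusted_rand_score

    rng = np.random.default_rng(0)
    # Toy data: three Gaussian clusters in two dimensions.
    X = np.vstack([rng.normal(0, 1, (100, 2)),
                   rng.normal(4, 1, (100, 2)),
                   rng.normal(8, 1, (100, 2))])
    # Hypothetical external qualitative variable, NOT used when fitting the mixtures.
    external = np.repeat([0, 0, 1], 100)

    for k in range(1, 6):
        gm = GaussianMixture(n_components=k, random_state=0).fit(X)  # fitted to X only
        ari = adjusted_rand_score(external, gm.predict(X))
        print(f"k={k}  BIC={gm.bic(X):9.1f}  agreement with external variable (ARI)={ari:.2f}")
    ```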

    Flexible Mixture Modeling with the Polynomial Gaussian Cluster-Weighted Model

    In the mixture modeling framework, this paper presents the polynomial Gaussian cluster-weighted model (CWM). It extends the linear Gaussian CWM for bivariate data in a twofold way. First, it allows for possible nonlinear dependencies in the mixture components by considering a polynomial regression. Second, it is not restricted to model-based clustering only, being contextualized in the more general model-based classification framework. Maximum likelihood parameter estimates are derived using the EM algorithm, and model selection is carried out using the Bayesian information criterion (BIC) and the integrated completed likelihood (ICL). The paper also investigates the conditions under which the posterior probabilities of component membership from a polynomial Gaussian CWM coincide with those of other well-established mixture models related to it. With respect to these models, the polynomial Gaussian CWM is shown to give excellent clustering and classification results when applied to the artificial and real data considered in the paper.
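
    As a hedged illustration of the model class, the sketch below runs a basic EM fit for a polynomial Gaussian CWM on bivariate data (x, y), where each component g models X ~ N(mu_g, tau_g^2) and Y | X = x ~ N(poly_g(x), sigma_g^2), and reports a BIC value. The function name, the degree-2 setup, and the toy data are assumptions for illustration, not the paper's implementation.

    ```python
    import numpy as np

    def fit_polynomial_cwm(x, y, G=2, degree=2, n_iter=200, seed=0):
        """EM sketch for a polynomial Gaussian cluster-weighted model (illustrative)."""
        rng = np.random.default_rng(seed)
        n = len(x)
        P = np.vander(x, degree + 1, increasing=True)     # polynomial design matrix
        z = rng.dirichlet(np.ones(G), size=n)             # random initial responsibilities
        for _ in range(n_iter):
            # M-step: mixing weights, marginal of X, weighted polynomial regression of Y on X
            pi = z.mean(axis=0)
            dens = np.empty((n, G))
            for g in range(G):
                w = z[:, g]
                mu = np.average(x, weights=w)
                tau2 = np.average((x - mu) ** 2, weights=w)
                beta = np.linalg.solve(P.T @ (P * w[:, None]), P.T @ (w * y))  # weighted LS
                resid = y - P @ beta
                sigma2 = np.average(resid ** 2, weights=w)
                dens[:, g] = (pi[g]
                              * np.exp(-0.5 * (x - mu) ** 2 / tau2) / np.sqrt(2 * np.pi * tau2)
                              * np.exp(-0.5 * resid ** 2 / sigma2) / np.sqrt(2 * np.pi * sigma2))
            # E-step: update responsibilities
            z = dens / dens.sum(axis=1, keepdims=True)
        loglik = np.log(dens.sum(axis=1)).sum()
        n_par = (G - 1) + G * (degree + 4)   # weights + per-component (mu, tau2, beta, sigma2)
        bic = -2 * loglik + n_par * np.log(n)
        return z.argmax(axis=1), bic

    # Toy demo: one component with a quadratic trend, one with a linear trend.
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.uniform(-2, 2, 150), rng.uniform(3, 7, 150)])
    y = np.concatenate([1 + 0.5 * x[:150] ** 2, 2 - x[150:]]) + rng.normal(0, 0.3, 300)
    labels, bic = fit_polynomial_cwm(x, y, G=2, degree=2)
    print("BIC:", round(bic, 1))
    ```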

    Mixtures of Shifted Asymmetric Laplace Distributions

    A mixture of shifted asymmetric Laplace distributions is introduced and used for clustering and classification. A variant of the EM algorithm is developed for parameter estimation by exploiting the relationship with the general inverse Gaussian distribution. This approach is mathematically elegant and relatively computationally straightforward. Our novel mixture modelling approach is demonstrated on both simulated and real data to illustrate clustering and classification applications. In these analyses, our mixture of shifted asymmetric Laplace distributions performs favourably when compared to the popular Gaussian approach. This work, which marks an important step in the non-Gaussian model-based clustering and classification direction, concludes with discussion as well as suggestions for future work
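
    As a small, hedged illustration (not the paper's EM variant), the sketch below simulates a two-component mixture of shifted asymmetric Laplace distributions via the normal variance-mean mixture representation X = mu + W*alpha + sqrt(W) * Sigma^(1/2) Z with W ~ Exp(1), then checks how a plain Gaussian mixture recovers the skewed clusters. All parameter values are assumptions chosen for illustration.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.metrics import adjusted_rand_score

    rng = np.random.default_rng(1)

    def sample_sal(n, mu, alpha, L):
        """Draw n points from a shifted asymmetric Laplace distribution with
        location mu, skewness alpha, and scale matrix Sigma = L @ L.T."""
        w = rng.exponential(1.0, size=n)                  # latent exponential mixing variable
        z = rng.standard_normal((n, len(mu)))
        return mu + w[:, None] * alpha + np.sqrt(w)[:, None] * (z @ L.T)

    L = np.linalg.cholesky(np.array([[1.0, 0.3], [0.3, 1.0]]))
    X = np.vstack([sample_sal(200, np.array([0.0, 0.0]), np.array([2.0, 1.0]), L),
                   sample_sal(200, np.array([6.0, 6.0]), np.array([-1.0, 2.0]), L)])
    truth = np.repeat([0, 1], 200)

    gm = GaussianMixture(n_components=2, random_state=0).fit(X)
    print("Gaussian mixture ARI on SAL data:",
          round(adjusted_rand_score(truth, gm.predict(X)), 2))
    ```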