2 research outputs found

    Self-Paced Multi-Task Clustering

    Multi-task clustering (MTC) has attracted considerable research attention in machine learning due to its ability to exploit the relationships among different tasks. Despite the success of traditional MTC models, they either get stuck in local optima easily or are sensitive to outliers and noisy data. To alleviate these problems, we propose a novel self-paced multi-task clustering (SPMTC) paradigm. In detail, SPMTC progressively selects data examples to train a series of MTC models of increasing complexity, which greatly reduces the risk of being trapped in poor local optima. Furthermore, to reduce the negative influence of outliers and noisy data, we design a soft version of SPMTC to further improve clustering performance. The resulting SPMTC framework can be solved efficiently by an alternating optimization method. The proposed model is guaranteed to converge, and experiments on real data sets demonstrate its promising results compared with state-of-the-art multi-task clustering methods.
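
    As a rough illustration of the self-paced selection described in this abstract (a minimal sketch, not the authors' implementation), the snippet below alternates between weighting examples by their current loss and raising a pace parameter so that harder examples enter training later; the toy losses, pace schedule, and the particular soft-weight form are assumptions made for illustration only.

        import numpy as np

        def self_paced_weights(losses, lam, soft=False, gamma=1.0):
            """Per-example weights for one self-paced step.

            Hard version: keep only examples whose loss is below the pace threshold lam.
            Soft version: linearly down-weight harder examples instead of dropping them,
            which reduces the influence of outliers and noisy points.
            """
            losses = np.asarray(losses, dtype=float)
            if not soft:
                return (losses < lam).astype(float)
            return np.clip((lam + gamma - losses) / gamma, 0.0, 1.0)

        # Toy alternating loop: re-select examples, then grow lam so the model
        # moves from "easy" to "hard" examples as training proceeds.
        rng = np.random.default_rng(0)
        losses = rng.gamma(shape=2.0, scale=1.0, size=10)  # stand-in per-example losses
        lam = 1.0
        for step in range(3):
            w = self_paced_weights(losses, lam, soft=True)
            # ... here one would refit the clustering model using weights w ...
            lam *= 1.5  # increase the pace so more complex examples are admitted
            print(f"step {step}: lam={lam:.2f}, mean weight={w.mean():.2f}")

    In the actual SPMTC framework this selection step would alternate with updating the multi-task clustering model itself, which is the alternating optimization mentioned above.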

    Self-Paced Deep Regression Forests for Facial Age Estimation

    Facial age estimation is an important and challenging problem in computer vision. Existing approaches usually employ deep neural networks (DNNs) to fit the mapping from facial features to age, even though some samples are noisy and confusing. We argue that it is more desirable to distinguish noisy and confusing facial images from regular ones and to alleviate the interference they cause. To this end, we propose self-paced deep regression forests (SP-DRFs), a gradual-learning DNN framework for age estimation. As the model is learned gradually, from simple to more complex samples, it tends to emphasize reliable samples and avoid bad local minima. Moreover, the proposed capped-likelihood function helps exclude noisy samples during training, making our SP-DRFs significantly more robust. We demonstrate the efficacy of SP-DRFs on the Morph II and FG-NET datasets, where our model achieves state-of-the-art performance.
    Comment: 7 pages, 5 figures, 2 tables
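
    To make the capped-likelihood idea concrete (a minimal sketch under assumptions, not the paper's exact formulation), the snippet below caps each sample's negative log-likelihood at a threshold: samples above the cap contribute only a constant, so they generate no gradient and are effectively excluded from training. The cap value, toy losses, and function name are hypothetical.

        import numpy as np

        def capped_negative_log_likelihood(nll, cap):
            """Sum of per-sample NLLs, with each sample's contribution capped at `cap`.

            Samples whose NLL exceeds the cap are flagged as excluded: their capped
            contribution is constant, so they do not influence parameter updates.
            """
            nll = np.asarray(nll, dtype=float)
            capped = np.minimum(nll, cap)
            excluded = nll > cap
            return capped.sum(), excluded

        nll = np.array([0.3, 0.8, 5.2, 0.5, 7.9])  # stand-in per-image losses
        total, excluded = capped_negative_log_likelihood(nll, cap=2.0)
        print(total, excluded)  # the two high-loss (noisy/confusing) samples are capped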