Multi-task Sparse Structure Learning With Gaussian Copula Models
Multi-task learning (MTL) aims to improve generalization performance by learning multiple related tasks simultaneously. While the underlying task relationship structure is sometimes known, it often needs to be estimated from the data at hand. In this paper, we present a novel family of models for MTL, applicable to regression and classification problems, capable of learning the structure of task relationships. In particular, we consider a joint estimation problem of the task relationship structure and the individual task parameters, which is solved using alternating minimization. The task relationships revealed by structure learning are founded on recent advances in Gaussian graphical models endowed with sparse estimators of the precision (inverse covariance) matrix. An extension to flexible Gaussian copula models that relaxes the Gaussian marginal assumption is also proposed. We illustrate the effectiveness of the proposed model on a variety of synthetic and benchmark data sets for regression and classification. We also consider the problem of combining Earth System Model (ESM) outputs for better projections of future climate, with a focus on projections of temperature by combining ESMs in South and North America, and show that the proposed model outperforms several existing methods for the problem.
Funding: NSF [IIS-1029711, IIS-0916750, IIS-0953274, CNS-1314560, IIS-1422557, CCF-1451986, IIS-1447566]; NASA [NNX12AQ39A]; IBM; Yahoo; Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq), Brazil.
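The alternating-minimization idea in this abstract can be sketched in a few lines. The toy below is an illustration, not the paper's algorithm: the structure step uses a crude soft-thresholded inverse covariance as a stand-in for a proper sparse (graphical-lasso-style) precision estimator, and all names (`mtl_alternating`, `rho`, `lam`) are assumptions.

```python
import numpy as np

def soft_threshold(A, lam):
    """Elementwise soft-thresholding, the prox operator of the l1 norm."""
    return np.sign(A) * np.maximum(np.abs(A) - lam, 0.0)

def mtl_alternating(Xs, ys, n_iter=10, rho=1.0, lam=0.1):
    """Toy sketch of joint estimation: alternate between (a) solving each
    task's regression weights given a task-precision matrix Omega, and
    (b) re-estimating a sparse Omega from the current weight matrix W."""
    d, T = Xs[0].shape[1], len(Xs)
    W = np.zeros((d, T))       # one weight column per task
    Omega = np.eye(T)          # task-relationship precision matrix
    for _ in range(n_iter):
        # (a) task-parameter step: ridge-like solve, tasks coupled via Omega
        for t in range(T):
            X, y = Xs[t], ys[t]
            # minimize 0.5||y - X w_t||^2 + (rho/2) tr(W Omega W^T) over w_t
            A = X.T @ X + rho * Omega[t, t] * np.eye(d)
            b = X.T @ y - rho * (W @ Omega[:, t] - Omega[t, t] * W[:, t])
            W[:, t] = np.linalg.solve(A, b)
        # (b) structure step: crude sparse precision estimate from W
        S = np.cov(W.T) + 1e-3 * np.eye(T)
        P = np.linalg.inv(S)
        Omega = soft_threshold(P, lam)
        np.fill_diagonal(Omega, np.diag(P))  # keep diagonal unshrunk
    return W, Omega
```

A graphical-lasso solver would replace step (b) in a faithful implementation; the Gaussian copula extension would additionally transform each task's marginals before estimating S.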
Learning Sparse Sharing Architectures for Multiple Tasks
Most existing deep multi-task learning models are based on parameter sharing, such as hard sharing, hierarchical sharing, and soft sharing. Choosing a suitable sharing mechanism depends on the relations among the tasks, which is not easy since the underlying shared factors among these tasks are difficult to understand. In this paper, we propose a novel parameter sharing mechanism, named \emph{Sparse Sharing}. Given multiple tasks, our approach automatically finds a sparse sharing structure. We start with an over-parameterized base network, from which each task extracts a subnetwork. The subnetworks of multiple tasks partially overlap and are trained in parallel. We show that both hard sharing and hierarchical sharing can be formulated as particular instances of the sparse sharing framework. We conduct extensive experiments on three sequence labeling tasks. Compared with single-task models and three typical multi-task learning baselines, our proposed approach achieves consistent improvements while requiring fewer parameters.
Comment: Accepted by AAAI 202
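The subnetwork-extraction idea above can be sketched with per-task binary masks over one shared weight matrix. This is a minimal single-layer illustration, not the paper's training procedure; the class and method names (`SparseSharingLayer`, `register_task`, `keep_frac`) are assumptions.

```python
import numpy as np

def extract_mask(weights, keep_frac):
    """Binary mask keeping the top keep_frac fraction of weights by
    magnitude -- a stand-in for subnetwork extraction."""
    k = max(1, int(keep_frac * weights.size))
    thresh = np.partition(np.abs(weights).ravel(), -k)[-k]
    return (np.abs(weights) >= thresh).astype(weights.dtype)

class SparseSharingLayer:
    """One shared over-parameterized weight matrix; each task sees and
    trains only the entries its mask selects. Masks of different tasks
    may overlap, so parameters are partially shared."""
    def __init__(self, d_in, d_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(d_in, d_out))
        self.masks = {}

    def register_task(self, task, keep_frac=0.5):
        self.masks[task] = extract_mask(self.W, keep_frac)

    def forward(self, task, x):
        return x @ (self.W * self.masks[task])

    def sgd_step(self, task, x, y, lr=0.01):
        # gradient of 0.5*||x (W*m) - y||^2 wrt W, masked so only the
        # task's subnetwork is updated
        m = self.masks[task]
        err = x @ (self.W * m) - y
        self.W -= lr * (x.T @ err) * m
```

Hard sharing corresponds to all-ones masks for every task; hierarchical sharing corresponds to nested masks, which is why both are special cases of this framework.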
Localized Sparse Incomplete Multi-view Clustering
Incomplete multi-view clustering, which aims to solve the clustering problem on incomplete multi-view data with partially missing views, has received increasing attention in recent years. Although numerous methods have been developed, most either cannot flexibly handle incomplete multi-view data with arbitrary missing views or do not account for the negative effect of information imbalance among views. Moreover, some methods do not fully explore the local structure of all incomplete views. To tackle these problems, this paper proposes a simple but effective method, named localized sparse incomplete multi-view clustering (LSIMVC). Different from existing methods, LSIMVC learns a sparse and structured consensus latent representation from the incomplete multi-view data by optimizing a sparse regularized, novel graph-embedded multi-view matrix factorization model. Specifically, in this matrix factorization model, an l1-norm-based sparsity constraint is introduced to obtain sparse low-dimensional individual representations and a sparse consensus representation. Moreover, a novel local graph embedding term is introduced to learn the structured consensus representation. Different from existing works, our local graph embedding term aggregates the graph embedding task and the consensus representation learning task into a single concise term. Furthermore, to reduce the imbalance of incomplete multi-view learning, an adaptive weighted learning scheme is introduced into LSIMVC. Finally, an efficient optimization strategy is given to solve the optimization problem of the proposed model. Comprehensive experiments on six incomplete multi-view databases verify that the performance of LSIMVC is superior to state-of-the-art IMC approaches. The code is available at https://github.com/justsmart/LSIMVC.
Comment: Published in IEEE Transactions on Multimedia (TMM).
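The l1-constrained factorization described in this abstract is typically handled with proximal-gradient updates, where soft-thresholding enforces sparsity. The sketch below shows one such update for a single masked view; it is an assumption-laden illustration (names `sparse_mf_step`, `mask`, `lam`, `lr` are mine), and LSIMVC's actual model adds the graph embedding term and per-view adaptive weights on top of this kind of step.

```python
import numpy as np

def soft_threshold(A, lam):
    """Prox operator of lam * ||A||_1, used to enforce the l1 sparsity
    constraint on the low-dimensional representations."""
    return np.sign(A) * np.maximum(np.abs(A) - lam, 0.0)

def sparse_mf_step(X, U, V, mask, lam, lr):
    """One proximal-gradient update of V for
        min_V 0.5 * ||mask * (X - U V)||_F^2 + lam * ||V||_1,
    where `mask` zeroes the missing entries of an incomplete view."""
    R = mask * (U @ V - X)               # residual on observed entries only
    V = V - lr * (U.T @ R)               # gradient step on the smooth part
    return soft_threshold(V, lr * lam)   # prox step enforces sparsity
```

Iterating this step drives unneeded entries of V to exactly zero, which is what makes the learned consensus representation sparse rather than merely small.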