Cell Detection by Functional Inverse Diffusion and Non-negative Group Sparsity. Part II: Proximal Optimization and Performance Evaluation
In this two-part paper, we present a novel framework and methodology to
analyze data from certain image-based biochemical assays, e.g., ELISPOT and
Fluorospot assays. In this second part, we focus on our algorithmic
contributions. We provide an algorithm for functional inverse diffusion that
solves the variational problem we posed in Part I. As part of the derivation of
this algorithm, we present the proximal operator for the non-negative
group-sparsity regularizer, which is a novel result that is of interest in
itself, also in comparison to previous results on the proximal operator of a
sum of functions. We then present a discretized approximated implementation of
our algorithm and evaluate it both in terms of operational cell-detection
metrics and in terms of distributional optimal-transport metrics.

Comment: published, 16 pages
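For the non-negative group-sparsity regularizer mentioned above, a closed-form proximal operator exists in the standard setting: project the group onto the non-negative orthant, then apply group soft-thresholding to the projected vector. The sketch below illustrates this kind of closed form for a single group; the function name is ours, and this is not claimed to be the paper's exact derivation.

```python
import numpy as np

def prox_nonneg_group_sparsity(v, lam):
    """Proximal operator of x -> lam * ||x||_2 + indicator(x >= 0),
    applied to one group (illustrative sketch).

    Composes projection onto the non-negative orthant with group
    soft-thresholding of the projected vector.
    """
    v_pos = np.maximum(v, 0.0)           # project onto x >= 0
    norm = np.linalg.norm(v_pos)
    if norm <= lam:                       # whole group shrunk to zero
        return np.zeros_like(v)
    return (1.0 - lam / norm) * v_pos     # group soft-thresholding
```

Negative coordinates are clipped first, so a group whose positive part has small norm is set to zero as a whole, which is what makes the regularizer induce group sparsity.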
A Convex Feature Learning Formulation for Latent Task Structure Discovery
This paper considers the multi-task learning problem in the setting where
some relevant features may be shared across a few related tasks. Most of the
existing methods assume that the extent to which the given tasks are related,
or share a common feature space, is known a priori. In real-world applications,
however, it is desirable to automatically discover the groups of related tasks
that share a feature space. In this paper we aim at searching the exponentially
large space of all possible groups of tasks that may share a feature space. The
main contribution is a convex formulation that employs a graph-based
regularizer and simultaneously discovers a few groups of related tasks, having
close-by task parameters, as well as the feature space shared within each
group. The regularizer encodes an important structure among the groups of tasks
that leads to an efficient solution algorithm: if there is no feature space
under which a group of tasks has close-by task parameters, then there does not
exist such a feature space for any of its supersets. An efficient active set
algorithm that exploits this simplification and performs a clever search in the
exponentially large space is presented. The algorithm is guaranteed to solve
the proposed formulation (within some precision) in a time polynomial in the
number of groups of related tasks discovered. Empirical results on benchmark
datasets show that the proposed formulation achieves good generalization and
outperforms state-of-the-art multi-task learning algorithms in some cases.

Comment: ICML201
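The superset property stated in the abstract enables apriori-style pruning: once a group of tasks fails the shared-feature-space test, every superset of that group can be skipped without being tested. A toy sketch of this search, with a hypothetical `is_coherent` predicate standing in for the paper's (more involved) per-group check:

```python
from itertools import combinations

def search_coherent_groups(tasks, is_coherent, max_size=None):
    """Enumerate groups of tasks by increasing size, pruning every
    superset of a group that already failed the coherence test.
    `is_coherent` is an illustrative stand-in predicate."""
    max_size = max_size or len(tasks)
    coherent = []
    pruned = []  # minimal failing groups; their supersets are skipped
    for size in range(1, max_size + 1):
        for group in combinations(tasks, size):
            gset = set(group)
            if any(bad <= gset for bad in pruned):
                continue              # a subset already failed
            if is_coherent(gset):
                coherent.append(gset)
            else:
                pruned.append(gset)
    return coherent
```

Because failing groups are recorded as soon as they appear, the exponentially large space is explored only along branches that remain feasible, mirroring the active-set idea described in the abstract.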
Sparse approximation of multilinear problems with applications to kernel-based methods in UQ
We provide a framework for the sparse approximation of multilinear problems
and show that several problems in uncertainty quantification fit within this
framework. In these problems, the value of a multilinear map has to be
approximated using approximations of different accuracy and computational work
of the arguments of this map. We propose and analyze a generalized version of
Smolyak's algorithm, which provides sparse approximation formulas with
convergence rates that mitigate the curse of dimension that appears in
multilinear approximation problems with a large number of arguments. We apply
the general framework to response surface approximation and optimization under
uncertainty for parametric partial differential equations using kernel-based
approximation. The theoretical results are supplemented by numerical
experiments.
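The classical Smolyak construction that the paper generalizes can be sketched for sparse quadrature on [0,1]^d: tensor-product rules of mixed one-dimensional accuracy are combined with alternating-sign coefficients, so that fine resolution in all dimensions simultaneously is avoided. A minimal illustration using nested trapezoidal rules (the level-to-node mapping and function names are our choices, not the paper's):

```python
import numpy as np
from itertools import product
from math import comb

def trapezoid_nodes_weights(level):
    """1D trapezoidal rule on [0, 1] with 2**level + 1 points."""
    n = 2**level + 1
    x = np.linspace(0.0, 1.0, n)
    w = np.full(n, 1.0 / (n - 1))
    w[0] *= 0.5
    w[-1] *= 0.5
    return x, w

def smolyak_quadrature(f, d, L):
    """Smolyak combination formula: sum over multi-indices i >= 1 with
    L - d + 1 <= |i| <= L of (-1)**(L - |i|) * C(d-1, L - |i|) times
    the tensor-product rule at levels i (requires L >= d)."""
    total = 0.0
    for i in product(range(1, L + 1), repeat=d):
        s = sum(i)
        if not (L - d + 1 <= s <= L):
            continue
        coef = (-1) ** (L - s) * comb(d - 1, L - s)
        nodes, weights = zip(*(trapezoid_nodes_weights(ik) for ik in i))
        for point, wgt in zip(product(*nodes), product(*weights)):
            total += coef * np.prod(wgt) * f(np.array(point))
    return total
```

For a smooth integrand such as exp(x_1 + ... + x_d), this combination attains accuracy close to that of the full tensor grid at the finest level while using far fewer points; it is this trade-off that the generalized algorithm in the abstract extends to multilinear maps with arguments of differing accuracy and cost.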