2 research outputs found
Algebraic and Analytic Approaches for Parameter Learning in Mixture Models
We present two different approaches for parameter learning in several mixture
models in one dimension. Our first approach uses complex-analytic methods and
applies to Gaussian mixtures with shared variance, binomial mixtures with
shared success probability, and Poisson mixtures, among others. An example
result is that $\exp(O(N^{1/3}))$ samples suffice to exactly learn a mixture of
$k < N$ Poisson distributions, each with integral rate parameters bounded by $N$.
Our second approach uses algebraic and combinatorial tools and applies to
binomial mixtures with shared trial parameter $N$ and differing success
parameters, as well as to mixtures of geometric distributions. Again, as an
example, for binomial mixtures with $k$ components and success parameters
discretized to resolution $\epsilon$, $O(k^2 (N/\epsilon)^{8/\sqrt{\epsilon}})$
samples suffice to exactly recover the parameters. For some of these
distributions, our results represent the first guarantees for parameter
estimation.
Comment: 22 pages, Accepted at Algorithmic Learning Theory (ALT) 2020
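To make the first approach's setting concrete, here is a minimal Python sketch of a toy special case: a two-component Poisson mixture with equal weights and integral rates. This is our illustration, not the paper's complex-analytic algorithm; the rates lam1, lam2 and the sample size n are assumptions chosen for the demo. The rates are recovered by matching the first two factorial moments and rounding to integers.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative (assumed) parameters: two integral Poisson rates, equal weights.
    lam1, lam2, n = 3, 7, 200_000

    # Draw from the mixture: each sample picks a component uniformly at random.
    rates = rng.choice([lam1, lam2], size=n)
    x = rng.poisson(rates)

    # Factorial moments of an equal-weight Poisson mixture:
    #   E[X]      = (lam1 + lam2) / 2
    #   E[X(X-1)] = (lam1^2 + lam2^2) / 2, since E[X(X-1)] = lam^2 for Poisson(lam).
    m1 = x.mean()
    m2 = (x * (x - 1.0)).mean()

    # Solve s = lam1 + lam2 and q = lam1^2 + lam2^2 for the two rates, using
    # (lam1 - lam2)^2 = 2q - s^2, then round (rates are integral by assumption).
    s = 2.0 * m1
    q = 2.0 * m2
    d = np.sqrt(max(2.0 * q - s * s, 0.0))  # clipped at 0 to absorb sampling noise
    print(sorted(round(v) for v in ((s - d) / 2.0, (s + d) / 2.0)))  # [3, 7]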
Recovery of Sparse Signals from a Mixture of Linear Samples
Mixture of linear regressions is a popular learning theoretic model that is
used widely to represent heterogeneous data. In the simplest form, this model
assumes that the labels are generated from either of two different linear
models and mixed together. Recent works of Yin et al. (2019) and Krishnamurthy
et al. (2019) focus on an experimental design setting of model recovery for this
problem. It is assumed that the feature vectors can be designed and queried to
obtain their labels. When queried, an oracle randomly selects one of the two
different sparse linear models and generates a label accordingly. How many such
oracle queries are needed to recover both of the models simultaneously? This
question can also be thought of as a generalization of the well-known
compressed sensing problem (Candès and Tao, 2005; Donoho, 2006). In this
work, we address this query complexity problem and provide efficient algorithms
that improve on the best previously known results.
Comment: International Conference on Machine Learning (ICML), 2020. (26 pages,
3 figures)
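The query model itself is easy to demonstrate. The Python sketch below is our illustration, not the paper's algorithm: in a noiseless toy instance with two sparse integer-valued models, repeatedly querying each standard basis vector reveals the unordered pair of coordinate values, and querying sums of two basis vectors resolves how those pairs align across coordinates. The dimension d, sparsity k, and per-query repetition counts are assumptions for the demo; the paper's algorithms achieve much better query complexity, including under noise.

    import numpy as np

    rng = np.random.default_rng(1)

    # Assumed toy instance: two k-sparse models with small integer coefficients.
    d, k = 20, 3
    beta = np.zeros((2, d))
    for r in range(2):
        supp = rng.choice(d, size=k, replace=False)
        beta[r, supp] = rng.integers(1, 10, size=k)

    def oracle(x, trials=20):
        """Query the designed feature vector x `trials` times; each time one of
        the two sparse models is chosen uniformly at random (noiseless labels)."""
        picks = rng.integers(0, 2, size=trials)
        return {round(float(beta[m] @ x), 9) for m in picks}

    # Step 1: query each basis vector e_i; the response set is the unordered
    # pair {beta[0, i], beta[1, i]}.
    pairs = []
    for i in range(d):
        e = np.zeros(d); e[i] = 1.0
        vals = sorted(oracle(e))
        pairs.append(vals if len(vals) == 2 else vals * 2)

    # Step 2: align the pairs across coordinates. Fix an anchor coordinate `a`
    # where the models differ; querying e_a + e_i yields the two sums
    # {beta[0, a] + beta[0, i], beta[1, a] + beta[1, i]}, fixing the pairing.
    a = next(i for i in range(d) if pairs[i][0] != pairs[i][1])
    rec = np.zeros((2, d))
    rec[0, a], rec[1, a] = pairs[a]
    for i in range(d):
        if i == a:
            continue
        e = np.zeros(d); e[a] = 1.0; e[i] = 1.0
        sums = oracle(e)
        lo, hi = pairs[i]
        if round(rec[0, a] + lo, 9) in sums and round(rec[1, a] + hi, 9) in sums:
            rec[0, i], rec[1, i] = lo, hi
        else:
            rec[0, i], rec[1, i] = hi, lo

    # Both models are recovered up to relabeling.
    print(np.allclose(sorted(map(tuple, rec)), sorted(map(tuple, beta))))  # True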