Boosting Variational Inference: an Optimization Perspective
Variational inference is a popular technique to approximate a possibly
intractable Bayesian posterior with a more tractable one. Recently, boosting
variational inference has been proposed as a new paradigm to approximate the
posterior by a mixture of densities by greedily adding components to the
mixture. However, as is the case with many other variational inference
algorithms, its theoretical properties have not been studied. In the present
work, we study the convergence properties of this approach from a modern
optimization viewpoint by establishing connections to the classic Frank-Wolfe
algorithm. Our analysis yields novel theoretical insights into the
sufficient conditions for convergence, explicit rates, and algorithmic
simplifications. Since much of the focus in previous work on variational
inference has been on tractability, our work is an especially important and
much-needed attempt to bridge the gap between probabilistic models and their
corresponding theoretical properties.
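The greedy scheme the abstract describes can be illustrated with a minimal Frank-Wolfe sketch. Everything concrete here is an assumption, not taken from the paper: a 1-D bimodal target, a dictionary of unit-variance Gaussian candidates on a grid, a discretized KL objective, and the classic 2/(t+2) step size.

```python
import numpy as np

def gauss(x, mu, sigma=1.0):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

xs = np.linspace(-8.0, 8.0, 2001)
dx = xs[1] - xs[0]
p = 0.5 * gauss(xs, -2.0) + 0.5 * gauss(xs, 2.0)  # hypothetical bimodal target

def kl(a, b):
    a = np.maximum(a, 1e-300)
    b = np.maximum(b, 1e-300)
    return float(np.sum(a * np.log(a / b)) * dx)

q = gauss(xs, 0.0)            # initial single-component approximation
kl_init = kl(q, p)

candidates = np.linspace(-5.0, 5.0, 41)  # grid of candidate component means
for t in range(1, 20):
    # Linear minimization oracle: the gradient of KL(q || p) in q is
    # log(q/p) + 1; the constant drops since all candidates integrate to 1,
    # so pick the component with the most negative inner product.
    grad = np.log(np.maximum(q, 1e-300) / np.maximum(p, 1e-300))
    scores = [np.sum(gauss(xs, mu) * grad) * dx for mu in candidates]
    s = gauss(xs, candidates[int(np.argmin(scores))])
    gamma = 2.0 / (t + 2.0)   # classic Frank-Wolfe step size
    q = (1.0 - gamma) * q + gamma * s  # greedily add the component

kl_final = kl(q, p)
print(kl_final < kl_init)
```

Each iteration adds one mixture component, exactly the "greedily adding components" step; the Frank-Wolfe view supplies the step-size schedule and the convergence rate.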
A difference boosting neural network for automated star-galaxy classification
In this paper we describe the use of a new artificial neural network, called
the difference boosting neural network (DBNN), for automated classification
problems in astronomical data analysis. We illustrate the capabilities of the
network by applying it to star-galaxy classification using recently released,
deep imaging data. We have compared our results with the classifications made by the
widely used Source Extractor (SExtractor) package. We show that while the
performance of the DBNN in star-galaxy classification is comparable to that of
SExtractor, it has the advantage of significantly higher speed and flexibility
during training as well as classification.

Comment: 9 pages, 1 figure, 7 tables, accepted for publication in Astronomy and
Astrophysics