International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book
The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. The ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions.
This book comprises the full conference program. It contains the scientific program both in survey style and in full detail, as well as information on the social program, the venue, special meetings, and more.
On parameter estimation with the Wasserstein distance
Statistical inference can be performed by minimizing, over the parameter space, the Wasserstein distance between model distributions and the empirical distribution of the data. We study asymptotic properties of such minimum Wasserstein distance estimators, complementing results derived by Bassetti, Bodini and Regazzini in 2006. In particular, our results cover the misspecified setting, in which the data-generating process is not assumed to be part of the family of distributions described by the model. Our results are motivated by recent applications of minimum Wasserstein estimators to complex generative models. We discuss some difficulties arising in the approximation of these estimators and illustrate their behavior in several numerical experiments. Two of our examples are taken from the literature on approximate Bayesian computation and have likelihood functions that are not analytically tractable. Two other examples involve misspecified models.

Comment: 29 pages (+18 pages of appendices), 6 figures. To appear in Information and Inference: A Journal of the IMA. A previous version of this paper contained work on approximate Bayesian computation with the Wasserstein distance, which can now be found at arxiv:1905.0374
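The estimation principle described in this abstract, minimizing the Wasserstein distance between the model and the empirical distribution over the parameter space, can be sketched in a few lines. The example below is a hypothetical illustration (a Gaussian location family fitted with SciPy's `wasserstein_distance`), not the authors' implementation; the sample sizes, the model family, and the use of common random numbers to make the sampled objective deterministic are all assumptions made for the sketch:

```python
import numpy as np
from scipy.stats import wasserstein_distance
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Hypothetical setup: observed data drawn from N(2, 1);
# the model family is the location family N(theta, 1).
data = rng.normal(loc=2.0, scale=1.0, size=500)

# Fixed base noise, reused for every theta (common random numbers),
# so the approximate objective below is deterministic in theta.
base = rng.normal(size=500)

def mwe_objective(theta):
    # Approximate 1-Wasserstein distance between the empirical
    # distribution of the data and a sample from the model N(theta, 1).
    return wasserstein_distance(data, theta + base)

# Minimum Wasserstein distance estimator: minimize over the parameter space.
result = minimize_scalar(mwe_objective, bounds=(-10.0, 10.0), method="bounded")
theta_hat = result.x  # close to the true location 2.0
```

Because the model sample is re-used across values of `theta`, the objective is a piecewise-smooth function of the parameter and a standard scalar minimizer suffices; for the complex generative models mentioned in the abstract, the objective is typically noisy and only approximable by simulation, which is one of the difficulties the paper discusses.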
Quantum meets optimization and machine learning
With the advent of the quantum era, the role that quantum computers will play in optimization and machine learning has become a natural and salient question. The development of novel quantum computing techniques is essential to showcasing the quantum advantage in these fields. At the same time, new findings in classical optimization and machine learning algorithms also have the potential to stimulate quantum computing research. In this dissertation, we explore the connections between quantum computing, optimization, and machine learning, paving the way for advances in all three fields.

First, on the quantum side, we present efficient quantum algorithms for fundamental problems in sampling, optimization, and quantum physics. Our results highlight the practical advantages of quantum computing in these fields. In addition, we introduce new approaches in quantum complexity theory for characterizing the quantum hardness of optimization and machine learning problems.

Second, on the optimization side, we improve the efficiency of state-of-the-art classical algorithms for semidefinite programming (SDP), Fourier sensing, dynamic distance estimation, and related problems. These classical results are closely intertwined with quantum computing: some serve as stepping stones to new quantum algorithms, while others are motivated by quantum applications or inspired by quantum techniques.

Third, on the machine learning side, we develop fast classical and quantum algorithms for training over-parameterized neural networks with provable guarantees of convergence and generalization. Furthermore, we contribute to the security of machine learning by theoretically investigating potential approaches to (classically) protecting private data in collaborative machine learning and to (quantumly) protecting the copyright of machine learning models.
Fourth, we investigate the concentration and discrepancy properties of hyperbolic polynomials and higher-order random walks, which could potentially be applied to quantum computing, optimization, and machine learning.

Computer Science