190 research outputs found
Prescriptive PCA: Dimensionality Reduction for Two-stage Stochastic Optimization
In this paper, we consider the alignment between an upstream dimensionality
reduction task of learning a low-dimensional representation of a set of
high-dimensional data and a downstream optimization task of solving a
stochastic program parameterized by said representation. In this case, standard
dimensionality reduction methods (e.g., principal component analysis) may not
perform well, as they aim to maximize the amount of information retained in the
representation and do not generally reflect the importance of such information
in the downstream optimization problem. To address this problem, we develop a
prescriptive dimensionality reduction framework that aims to minimize the
degree of suboptimality in the optimization phase. For the case where the
downstream stochastic optimization problem has an expected value objective, we
show that prescriptive dimensionality reduction can be performed via solving a
distributionally-robust optimization problem, which admits a semidefinite
programming relaxation. Computational experiments based on a warehouse
transshipment problem and a vehicle repositioning problem show that our
approach significantly outperforms principal component analysis on real and
synthetic data sets.
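As a minimal illustration of the standard PCA baseline that the prescriptive approach is compared against (a sketch of plain PCA via SVD, not of the authors' prescriptive method), the following reduces synthetic high-dimensional data to a low-dimensional representation; the data-generating parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic high-dimensional data with an underlying low-dimensional structure
n, d, k = 200, 10, 2
latent = rng.normal(size=(n, k))          # hidden low-dimensional factors
mixing = rng.normal(size=(k, d))          # map to high-dimensional space
X = latent @ mixing + 0.05 * rng.normal(size=(n, d))

# Standard PCA via SVD of the centered data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                         # k-dimensional representation
X_hat = Z @ Vt[:k] + X.mean(axis=0)       # reconstruction from k components

# Fraction of total variance retained by the top-k components: this is the
# information-retention objective that PCA maximizes, irrespective of any
# downstream optimization task
retained = (S[:k] ** 2).sum() / (S ** 2).sum()
```

The point of the paper is that maximizing `retained` need not minimize downstream suboptimality; the prescriptive framework replaces this variance criterion with a task-aware one.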
Optimized Dimensionality Reduction for Moment-based Distributionally Robust Optimization
Moment-based distributionally robust optimization (DRO) provides an
optimization framework to integrate statistical information with traditional
optimization approaches. Under this framework, one assumes that the underlying
joint distribution of random parameters lies in a distributional ambiguity set
constructed by moment information and makes decisions against the worst-case
distribution within the set. Although most moment-based DRO problems can be
reformulated as semidefinite programming (SDP) problems that can be solved in
polynomial time, solving high-dimensional SDPs is still time-consuming. Unlike
existing approximation approaches that first reduce the dimensionality of
random parameters and then solve the approximated SDPs, we propose an optimized
dimensionality reduction (ODR) approach. We first show that the ranks of the
matrices in the SDP reformulations are small, by which we are then motivated to
integrate the dimensionality reduction of random parameters with the subsequent
optimization problems. Such integration yields two outer approximations and one
inner approximation of the original problem, all of which are low-dimensional
SDPs
that can be solved efficiently. More importantly, these approximations can
theoretically achieve the optimal value of the original high-dimensional SDPs.
As these approximations are nonconvex SDPs, we develop modified Alternating
Direction Method of Multipliers (ADMM) algorithms to solve them efficiently. We
demonstrate the effectiveness of our proposed ODR approach and algorithm in
solving two practical problems. Numerical results show significant advantages
of our approach in computational time and solution quality over three benchmark
approaches. Our approach obtains an optimal or near-optimal (mostly within
0.1%) solution and reduces the computational time by up to three orders of
magnitude.
The impact of agricultural extension and roads on poverty and consumption growth in fifteen Ethiopian villages:
"This paper investigates whether public investments that led to improvements in road quality and increased access to agricultural extension services led to faster consumption growth and lower rates of poverty in rural Ethiopia. Estimating an instrumental variables model using Generalized Methods of Moments and controlling for household fixed effects, we find evidence of positive impacts with meaningful magnitudes. Receiving at least one extension visit reduces headcount poverty by 9.8 percentage points and increases consumption growth by 7.1 percent. Access to all-weather roads reduces poverty by 6.9 percentage points and increases consumption growth by 16.3 percent. These results are robust to changes in model specification and estimation methods." from authors' abstractPublic investment, roads, agricultural extension, income growth, Poverty,
International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book
The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. The ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions.
This book comprises the full conference program. It contains the scientific program, both in survey form and in full detail, as well as information on the social program, the venue, special meetings, and more.
OPTIMIZATION AND LEARNING UNDER UNCERTAINTY - A UNIFIED ROBUSTNESS PERSPECTIVE
Ph.D., Doctor of Philosophy
Measurement of a Multidimensional Index of Globalization and its Impact on Income Inequality
globalization, income inequality, indices, principal component
Distributionally Robust Optimization: A Review
The concepts of risk-aversion, chance-constrained optimization, and robust
optimization have developed significantly over the last decade. The statistical
learning community has also witnessed rapid theoretical and applied growth by
relying on these concepts. A modeling framework, called distributionally robust
optimization (DRO), has recently received significant attention in both the
operations research and statistical learning communities. This paper surveys
the main concepts and contributions to DRO, and its relationships with robust
optimization, risk-aversion, chance-constrained optimization, and function
regularization.
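A classical example of the moment-based DRO results this kind of survey covers is the Scarf-type bound: over all distributions with a given mean and standard deviation, the worst-case expected shortfall E[(X - a)+] has a closed form. The sketch below (illustrative; the function name and test values are not from the review) computes the bound and checks it dominates the exact value under a normal distribution:

```python
import math

def worst_case_expected_shortfall(mu, sigma, a):
    """Scarf-type bound: the supremum of E[(X - a)+] over all
    distributions with mean mu and standard deviation sigma is
    (sqrt(sigma^2 + (a - mu)^2) - (a - mu)) / 2."""
    return (math.sqrt(sigma ** 2 + (a - mu) ** 2) - (a - mu)) / 2

mu, sigma, a = 0.0, 1.0, 0.5
bound = worst_case_expected_shortfall(mu, sigma, a)

# Exact expected shortfall when X ~ Normal(mu, sigma^2):
# E[(X - a)+] = (mu - a) * Phi(z) + sigma * phi(z), with z = (mu - a) / sigma
z = (mu - a) / sigma
phi = math.exp(-z ** 2 / 2) / math.sqrt(2 * math.pi)   # standard normal pdf
Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))           # standard normal cdf
normal_value = (mu - a) * Phi + sigma * phi

# The distribution-free bound must dominate any fixed distribution's value
assert bound >= normal_value
```

A DRO decision maker who knows only the first two moments hedges against the distribution attaining `bound`, rather than trusting a fitted normal model.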