Variable-sized uncertainty and inverse problems in robust optimization
In robust optimization, the general aim is to find a solution that performs well over a set of possible parameter outcomes, the so-called uncertainty set. In this paper, we assume that the size of the uncertainty set is not fixed, and instead aim to find a set of robust solutions that covers all possible uncertainty set outcomes. We refer to these problems as robust optimization with variable-sized uncertainty. We discuss how to construct smallest possible sets of min–max robust solutions and give bounds on their size. A special case of this perspective is to analyze for which uncertainty sets a nominal solution ceases to be a robust solution, which amounts to an inverse robust optimization problem. We consider this problem with a min–max regret objective and present mixed-integer linear programming formulations that can be applied to construct suitable uncertainty sets. Results on both variable-sized uncertainty and inverse problems are further supported with experimental data.
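To make the min–max idea concrete, here is a toy sketch (not from the paper; all candidate names, costs, and deviations are illustrative): candidate solutions are compared by their worst-case cost over a box uncertainty set whose radius r varies, and different radii can select different min–max robust solutions, which is why a whole set of robust solutions is needed to cover all uncertainty sizes.

```python
# Toy min-max robust selection over a box uncertainty set of variable radius r.
# Candidates, nominal costs, and deviations are illustrative, not from the paper.

def worst_case_cost(nominal, deviation, r):
    # Box uncertainty: each cost coefficient may rise by up to r * deviation.
    return sum(c + r * d for c, d in zip(nominal, deviation))

candidates = {
    "cheap_but_fragile": ([1.0, 1.0], [2.0, 2.0]),  # low nominal, high deviation
    "stable":            ([2.0, 2.0], [0.5, 0.5]),  # higher nominal, low deviation
}

def minmax_robust(r):
    # Min-max robust choice: smallest worst-case cost at uncertainty size r.
    return min(candidates, key=lambda k: worst_case_cost(*candidates[k], r))

for r in (0.0, 0.5, 1.0):
    print(r, minmax_robust(r))
```

As r grows, the low-nominal-cost but high-deviation candidate eventually loses to the more stable one; the union of selections over all r loosely corresponds to the cover of robust solutions the abstract describes.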
Research and Education in Computational Science and Engineering
Over the past two decades the field of computational science and engineering
(CSE) has penetrated both basic and applied research in academia, industry, and
laboratories to advance discovery, optimize systems, support decision-makers,
and educate the scientific and engineering workforce. Informed by centuries of
theory and experiment, CSE performs computational experiments to answer
questions that neither theory nor experiment alone is equipped to answer. CSE
provides scientists and engineers of all persuasions with algorithmic
inventions and software systems that transcend disciplines and scales. Carried
on a wave of digital technology, CSE brings the power of parallelism to bear on
troves of data. Mathematics-based advanced computing has become a prevalent
means of discovery and innovation in essentially all areas of science,
engineering, technology, and society; and the CSE community is at the core of
this transformation. However, a combination of disruptive
developments---including the architectural complexity of extreme-scale
computing, the data revolution that engulfs the planet, and the specialization
required to follow the applications to new frontiers---is redefining the scope
and reach of the CSE endeavor. This report describes the rapid expansion of CSE
and the challenges to sustaining its bold advances. The report also presents
strategies and directions for CSE research and education for the next decade.

Comment: Major revision, to appear in SIAM Review.
Distributed Robustness Analysis of Interconnected Uncertain Systems Using Chordal Decomposition
Large-scale interconnected uncertain systems commonly have large state and
uncertainty dimensions. Aside from the heavy computational cost of centralized
robust stability analysis techniques, privacy requirements in the network can
introduce further issues. In this paper, we use integral quadratic constraint
(IQC) analysis to study large-scale interconnected uncertain systems, and we
avoid these issues through a decomposition scheme based on the interconnection
structure of the system. This scheme relies on the so-called chordal
decomposition and does not add any conservatism to the analysis. The
decomposed problem can be solved using distributed computational
algorithms without the need for a centralized computational unit. We further
discuss the merits of the proposed analysis approach using a numerical
experiment.

Comment: 3 figures. Submitted to the 19th IFAC World Congress.
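As a loose illustration of the chordal machinery (not the paper's IQC formulation): a chordal interconnection graph admits a perfect elimination ordering, and its maximal cliques are the small coupled blocks a decomposed analysis can be split over. The graph and helper functions below are hypothetical.

```python
# Hypothetical sketch: clique decomposition of a chordal interconnection graph.
# Maximum-cardinality search (MCS) visits vertices by number of already-visited
# neighbors; for a chordal graph, the reverse of the visit order is a perfect
# elimination ordering (PEO).

def mcs_order(adj):
    """Maximum-cardinality search visit order (ties broken by smallest id)."""
    order, weight, remaining = [], {v: 0 for v in adj}, set(adj)
    while remaining:
        v = max(remaining, key=lambda u: (weight[u], -u))
        order.append(v)
        remaining.remove(v)
        for u in adj[v]:
            if u in remaining:
                weight[u] += 1
    return order

def cliques_from_peo(adj, peo):
    """Maximal cliques of a chordal graph: each vertex plus its later PEO neighbors."""
    pos = {v: i for i, v in enumerate(peo)}
    cands = [frozenset({v} | {u for u in adj[v] if pos[u] > pos[v]}) for v in peo]
    return {c for c in cands if not any(c < d for d in cands)}

# A chain of four subsystems: 0-1, 1-2, 2-3 (a tree, hence chordal).
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
peo = list(reversed(mcs_order(adj)))
cliques = cliques_from_peo(adj, peo)
print(sorted(sorted(c) for c in cliques))  # [[0, 1], [1, 2], [2, 3]]
```

Each clique couples only neighboring subsystems, which is what lets the decomposed analysis problem be solved by distributed algorithms without a central unit.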
Fast Gibbs sampling for high-dimensional Bayesian inversion
Solving ill-posed inverse problems by Bayesian inference has recently
attracted considerable attention. Compared to deterministic approaches, the
probabilistic representation of the solution by the posterior distribution can
be exploited to explore and quantify its uncertainties. In applications where
the inverse solution is subject to further analysis procedures, this can be a
significant advantage. Alongside theoretical progress, various new
computational techniques allow sampling from very high-dimensional posterior
distributions: In [Lucka2012], a Markov chain Monte Carlo (MCMC) posterior
sampler was developed for linear inverse problems with ℓ_1-type priors. In
this article, we extend this single component Gibbs-type sampler to a wide
range of priors used in Bayesian inversion, such as general ℓ_p^q priors
with additional hard constraints. Besides a fast computation of the
conditional, single component densities in an explicit, parameterized form, a
fast, robust and exact sampling from these one-dimensional densities is key to
obtain an efficient algorithm. We demonstrate that a generalization of slice
sampling can utilize their specific structure for this task and illustrate the
performance of the resulting slice-within-Gibbs samplers by different computed
examples. These new samplers allow us to perform sample-based Bayesian
inference in high-dimensional scenarios with certain priors for the first time,
including the inversion of computed tomography (CT) data with the popular
isotropic total variation (TV) prior.

Comment: submitted to "Inverse Problems".
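A minimal sketch of the slice-within-Gibbs idea on a toy two-dimensional density (the target, step width, and seed are illustrative assumptions, not the paper's high-dimensional setting): each coordinate is updated in turn by one-dimensional slice sampling of its full conditional, using stepping-out and shrinkage.

```python
import math
import random

def slice_sample_1d(logf, x0, w=1.0, max_steps=50):
    """One slice-sampling update of a scalar (stepping-out, then shrinkage)."""
    logy = logf(x0) + math.log(random.random())  # slice height under logf(x0)
    u = random.random()
    L, R = x0 - w * u, x0 + w * (1 - u)
    for _ in range(max_steps):                   # step out left
        if logf(L) <= logy:
            break
        L -= w
    for _ in range(max_steps):                   # step out right
        if logf(R) <= logy:
            break
        R += w
    while True:                                  # shrink toward x0 until accepted
        x1 = random.uniform(L, R)
        if logf(x1) >= logy:
            return x1
        if x1 < x0:
            L = x1
        else:
            R = x1

def logp(x, y):
    # Toy unnormalized log-density, symmetric about the origin.
    return -0.5 * (x * x + y * y + x * x * y * y)

def slice_within_gibbs(n, x=0.0, y=0.0):
    random.seed(0)
    samples = []
    for _ in range(n):
        x = slice_sample_1d(lambda t: logp(t, y), x)  # sample x | y
        y = slice_sample_1d(lambda t: logp(x, t), y)  # sample y | x
        samples.append((x, y))
    return samples

samples = slice_within_gibbs(2000)
mean_x = sum(s[0] for s in samples) / len(samples)
print(round(mean_x, 2))  # symmetric target: sample mean should be near 0
```

The abstract's point is that when the one-dimensional conditionals have a known parameterized form, such exact scalar updates make the overall Gibbs sweep fast and robust even in high dimension.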
Optimal Clustering under Uncertainty
Classical clustering algorithms typically either lack an underlying
probability framework to make them predictive or focus on parameter estimation
rather than defining and minimizing a notion of error. Recent work addresses
these issues by developing a probabilistic framework based on the theory of
random labeled point processes and characterizing a Bayes clusterer that
minimizes the number of misclustered points. The Bayes clusterer is analogous
to the Bayes classifier. Whereas determining a Bayes classifier requires full
knowledge of the feature-label distribution, deriving a Bayes clusterer
requires full knowledge of the point process. When uncertain of the point
process, one would like to find a robust clusterer that is optimal over the
uncertainty, just as one may find optimal robust classifiers with uncertain
feature-label distributions. Herein, we derive an optimal robust clusterer by
first finding an effective random point process that incorporates all
randomness within its own probabilistic structure and from which a Bayes
clusterer can be derived that provides an optimal robust clusterer relative to
the uncertainty. This is analogous to the use of effective class-conditional
distributions in robust classification. After evaluating the performance of
robust clusterers on synthetic Gaussian mixture models, we apply the
framework to granular imaging, where we make use of the asymptotic
granulometric moment theory for granular images to relate robust clustering
theory to the application.

Comment: 19 pages, 5 eps figures, 1 table.
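A drastically simplified sketch of the effective-process idea (all models, weights, and data points are hypothetical, and brute-force enumeration stands in for the paper's Bayes clusterer): uncertainty over the cluster means is averaged into a single effective likelihood, and the labeling maximizing it is chosen.

```python
import math
from itertools import product

def npdf(x, mu, sigma=1.0):
    # Univariate Gaussian density.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Two candidate settings for the pair of cluster means (model uncertainty),
# each with a prior weight; the "effective" process averages over them.
models = [((-2.0, 2.0), 0.5), ((-1.0, 3.0), 0.5)]

def effective_likelihood(points, labels):
    total = 0.0
    for mus, prior in models:
        lik = prior
        for x, lab in zip(points, labels):
            lik *= npdf(x, mus[lab])
        total += lik
    return total

def robust_bayes_cluster(points):
    # Enumerate all binary labelings and keep the effective-likelihood maximizer.
    labelings = product((0, 1), repeat=len(points))
    return max(labelings, key=lambda lab: effective_likelihood(points, lab))

points = [-2.1, -1.8, 1.9, 2.3]
print(robust_bayes_cluster(points))  # expected grouping: (0, 0, 1, 1)
```

Folding the model uncertainty into one effective likelihood before clustering mirrors, in spirit, how the paper derives a Bayes clusterer from an effective random point process rather than from any single candidate model.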