Algorithm Engineering in Robust Optimization
Robust optimization is a young and emerging field of research having received
a considerable increase of interest over the last decade. In this paper, we
argue that the algorithm engineering methodology fits very well to the
field of robust optimization and yields a rewarding new perspective on both the
current state of research and open research directions.
To this end we go through the algorithm engineering cycle of design and
analysis of concepts, development and implementation of algorithms, and
theoretical and experimental evaluation. We show that many ideas of algorithm
engineering have already been applied in publications on robust optimization.
Most work on robust optimization is devoted to the analysis of concepts and the
development of algorithms, some papers deal with the evaluation of a particular
concept in case studies, and work on comparing concepts is just beginning. What
is still a drawback in many papers on robustness is the missing feedback loop
that carries the results of the experiments back into the design.
Equitable Efficiency in Multiple Criteria Optimization
Equitable efficiency in multiple criteria optimization was introduced mathematically in the mid-1990s. The concept strengthens the notion of Pareto efficiency by imposing additional conditions on the preference structure defining the Pareto preference. It is especially designed for multiple criteria problems with commensurate criteria, where different criteria values can be compared directly. In this dissertation we study some theoretical and practical aspects of equitably efficient solutions. The literature on equitable efficiency is not very extensive and provides only a limited number of ways of generating such solutions. After introducing the relevant notation, we develop scalarization-based methods of generating equitably efficient solutions; the scalarizations developed do not assume any special structure of the problem. We prove an existence result for linear multiple criteria problems. Next, we show how equitably efficient solutions arise in the context of a particular type of linear complementarity problem and of matrix games. The set of equitably efficient solutions is, in general, a subset of the set of efficient solutions; the multiple criteria counterpart of the linear complementarity problem dealt with in our dissertation has identical efficient and equitably efficient solution sets. Finally, we demonstrate the relevance of equitable efficiency by applying it to problems in regression analysis and asset allocation.
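A standard scalarization from the equitable-efficiency literature is the ordered weighted average (OWA) with strictly decreasing positive weights: because the weights are applied to the sorted outcomes, the criterion is anonymous and rewards equitable transfers between commensurate criteria. The sketch below is an illustration of that idea, not code from the dissertation; the function name and the toy outcome vectors are assumptions.

```python
import numpy as np

def owa_scalarization(outcomes, weights):
    """Ordered weighted average: apply the weights to the outcomes sorted in
    nonincreasing order. With strictly decreasing positive weights, minimizers
    of this scalarization over the feasible set are equitably efficient
    (a known result in the equitable-efficiency literature)."""
    y = np.sort(np.asarray(outcomes, dtype=float))[::-1]  # largest outcome first
    w = np.asarray(weights, dtype=float)
    assert np.all(w > 0) and np.all(np.diff(w) < 0), \
        "need strictly decreasing positive weights"
    return float(w @ y)

# Two outcome vectors with the same total: (2, 2) is an equitable
# improvement over (3, 1), so it receives the lower (better) score.
w = [2.0, 1.0]
print(owa_scalarization([3.0, 1.0], w))  # 7.0
print(owa_scalarization([2.0, 2.0], w))  # 6.0
```

Because the outcomes are sorted before weighting, permuting a criteria vector does not change its score, which is exactly the anonymity condition that equitable efficiency adds on top of Pareto efficiency.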
Risk, Security and Robust Solutions
The aim of this paper is to develop a decision-theoretic approach to the security management of uncertain multi-agent systems. Security is defined as the ability to deal with intentional and unintentional threats generated by agents. The main concern of the paper is the protection of public goods from these threats, allowing explicit treatment of inherent uncertainties and robust security management solutions. The paper shows that robust solutions can be properly designed with new stochastic optimization tools applicable to multicriteria problems with uncertain probability distributions and multivariate extreme events.
X-ray CT Image Reconstruction on Highly-Parallel Architectures.
Model-based image reconstruction (MBIR) methods for X-ray CT use accurate
models of the CT acquisition process, the statistics of the noisy measurements,
and noise-reducing regularization to produce potentially higher quality images
than conventional methods even at reduced X-ray doses. They do this by
minimizing a statistically motivated high-dimensional cost function; the high
computational cost of numerically minimizing this function has prevented MBIR
methods from reaching ubiquity in the clinic. Modern highly-parallel hardware
like graphics processing units (GPUs) may offer the computational resources to
solve these reconstruction problems quickly, but simply "translating" existing
algorithms designed for conventional processors to the GPU may not fully
exploit the hardware's capabilities.
This thesis proposes GPU-specialized image denoising and image reconstruction
algorithms. The proposed image denoising algorithm uses group coordinate
descent with carefully structured groups. The algorithm converges very
rapidly: in one experiment, it denoises a 65 megapixel image in about 1.5
seconds, while the popular Chambolle-Pock primal-dual algorithm running on the
same hardware takes over a minute to reach the same level of accuracy.
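The appeal of group coordinate descent on parallel hardware can be illustrated with a much simpler stand-in for the thesis's algorithm: with a quadratic roughness penalty (chosen here only for its closed-form updates; the thesis uses different regularizers and carefully chosen groups), pixels split into two checkerboard groups decouple within each group, so an entire group can be updated exactly and in parallel.

```python
import numpy as np

def checkerboard_gcd_denoise(y, beta=1.0, iters=50):
    """Group coordinate descent on a checkerboard pixel partition for the
    denoising cost  0.5*||x - y||^2 + 0.5*beta * sum of squared 4-neighbor
    differences.  Within one color group, each pixel's subproblem depends only
    on the other group's pixels, so the update below is exact and the whole
    group could be updated in parallel.  (Illustrative sketch, not the
    thesis's GPU algorithm.)"""
    x = y.astype(float).copy()
    ii, jj = np.indices(y.shape)
    for _ in range(iters):
        for color in (0, 1):
            mask = (ii + jj) % 2 == color
            p = np.pad(x, 1, mode="edge")  # replicate-boundary padding
            # sum of the four neighbors of every pixel
            nbr = p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
            # closed-form minimizer of the single-pixel quadratic cost
            x[mask] = (y[mask] + beta * nbr[mask]) / (1.0 + 4.0 * beta)
    return x
```

A constant image is a fixed point of the update, and on noisy input the iteration smooths toward the quadratic-penalty minimizer; the thesis's contribution lies in making such structured-group updates fast for the harder, edge-preserving penalties used in practice.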
For X-ray CT reconstruction, this thesis uses duality and group coordinate
ascent to propose an alternative to the popular ordered subsets (OS) method.
Similar to OS, the proposed method can use a subset of the data to update the
image. Unlike OS, the proposed method is convergent. In one helical CT
reconstruction experiment, an implementation of the proposed algorithm using
one GPU converges more quickly than a state-of-the-art algorithm using four
GPUs. Using four GPUs, the proposed algorithm reaches near
convergence of a wide-cone axial reconstruction problem with over 220 million
voxels in only 11 minutes.
PhD, Electrical Engineering: Systems, University of Michigan, Horace H. Rackham School of Graduate Studies
http://deepblue.lib.umich.edu/bitstream/2027.42/113551/1/mcgaffin_1.pd
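The ordered-subsets idea the thesis builds on can be sketched on a least-squares cost: each sub-iteration takes a scaled gradient step using only one subset of the data rows, approximating the full gradient at a fraction of the cost. The problem, step size, and subset scheme below are illustrative assumptions, and, as the abstract notes, plain OS of this kind is not guaranteed to converge.

```python
import numpy as np

def os_least_squares(A, y, x0, n_subsets=4, iters=10):
    """Ordered-subsets iteration for 0.5*||Ax - y||^2: cycle through row
    subsets, treating the scaled subset gradient  M * A_m^T (A_m x - y_m)
    as a cheap surrogate for the full gradient  A^T (Ax - y)."""
    x = x0.astype(float).copy()
    rows = np.array_split(np.arange(A.shape[0]), n_subsets)
    step = 1.0 / (n_subsets * np.linalg.norm(A, 2) ** 2)  # conservative step
    for _ in range(iters):
        for idx in rows:
            Am, ym = A[idx], y[idx]
            x -= step * n_subsets * (Am.T @ (Am @ x - ym))
    return x
```

Each full cycle touches every measurement once but applies `n_subsets` image updates, which is the source of OS's early speedup; the thesis's dual, group-coordinate-ascent construction keeps this subset structure while restoring convergence.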
Barrier-Based Test Synthesis for Safety-Critical Systems Subject to Timed Reach-Avoid Specifications
We propose an adversarial, time-varying test-synthesis procedure for
safety-critical systems without requiring specific knowledge of the underlying
controller steering the system. From a broader test and evaluation context,
determination of difficult tests of system behavior is important as these tests
would elucidate problematic system phenomena before those phenomena can
engender harmful outcomes, e.g., loss of human life in autonomous cars, costly
failures for airplane systems, etc. Our approach builds on existing,
simulation-based work in the test and evaluation literature by offering a
controller-agnostic test-synthesis procedure that provides a series of
benchmark tests with which to determine controller reliability. To achieve
this, our approach codifies the system objective as a timed reach-avoid
specification. Then, by coupling control barrier functions with this class of
specifications, we construct an instantaneous difficulty metric whose minimizer
corresponds to the most difficult test at that system state. We use this
instantaneous difficulty metric in a game-theoretic fashion, to produce an
adversarial, time-varying test-synthesis procedure that does not require
specific knowledge of the system's controller, but can still provably identify
realizable and maximally difficult tests of system behavior. Finally, we
develop this test-synthesis procedure for both continuous and discrete-time
systems and showcase our test-synthesis procedure on simulated and hardware
examples.
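For intuition about an instantaneous, barrier-based difficulty metric, consider a toy single integrator that must avoid a disc. Everything below (the system, the safe set, and the function names) is a hypothetical illustration, not the paper's timed reach-avoid construction: the margin dh/dt + alpha*h measures how close an input pushes the system to violating safety, and over norm-bounded disturbances it is minimized by pushing straight down the barrier's gradient.

```python
import numpy as np

def barrier_margin(x, u, d, x_obs, radius, alpha=1.0):
    """CBF margin for the single integrator  x' = u + d  with safe set
    h(x) = ||x - x_obs||^2 - radius^2 >= 0  (stay outside a disc).
    Safety-preserving behavior keeps  dh/dt + alpha*h >= 0."""
    h = float(np.sum((x - x_obs) ** 2)) - radius ** 2
    grad_h = 2.0 * (x - x_obs)
    return float(grad_h @ (u + d)) + alpha * h

def hardest_disturbance(x, x_obs, dmax):
    """Most difficult instantaneous test: among disturbances with
    ||d|| <= dmax, the margin is minimized by  -dmax * grad_h / ||grad_h||."""
    g = 2.0 * (x - x_obs)
    n = np.linalg.norm(g)
    return -dmax * g / n if n > 0 else np.zeros_like(g)
```

Evaluating the margin at the adversarial disturbance gives a state-dependent difficulty score without querying the controller, which mirrors the controller-agnostic flavor of the paper's test-synthesis procedure.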
Variational Multiscale Nonparametric Regression: Algorithms and Implementation
Many modern statistically efficient methods come with tremendous
computational challenges, often leading to large-scale optimisation problems.
In this work, we examine such computational issues for recently developed
estimation methods in nonparametric regression with a specific view on image
denoising. We consider in particular certain variational multiscale estimators
which are statistically optimal in minimax sense, yet computationally
intensive. Such an estimator is computed as the minimiser of a smoothness
functional (e.g., TV norm) over the class of all estimators such that none of
its coefficients with respect to a given multiscale dictionary is statistically
significant. The resulting multiscale Nemirovski-Dantzig estimator (MIND) can
incorporate any convex smoothness functional and combine it with a proper
dictionary including wavelets, curvelets and shearlets. The computation of MIND
in general requires to solve a high-dimensional constrained convex optimisation
problem with a specific structure of the constraints induced by the statistical
multiscale testing criterion. To solve this explicitly, we discuss three
different algorithmic approaches: the Chambolle-Pock, ADMM and semismooth
Newton algorithms. Algorithmic details and an explicit implementation are
presented, and the solutions are then compared numerically in a simulation study
and on various test images. We thereby recommend the Chambolle-Pock algorithm
in most cases for its fast convergence. We stress that our analysis can also be
transferred to signal recovery and other denoising problems to recover more
general objects whenever it is possible to borrow statistical strength from
data patches of similar object structure.
Comment: Codes are available at https://github.com/housenli/MIN
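The multiscale constraint can be made concrete in one dimension: with a dictionary of normalized indicators of disjoint dyadic blocks, an estimate is feasible when no block average of the residual y - f is statistically significant. The sketch below is a simplification, not the paper's dictionary or quantile; the disjoint-block scheme and the universal-type threshold sqrt(2 log n) are assumptions made for illustration.

```python
import numpy as np

def multiscale_test(residual, sigma=1.0, gamma=None):
    """Check the multiscale feasibility of a 1D residual: for every disjoint
    dyadic block I, the normalized statistic  |sum_{i in I} r_i| / (sigma*sqrt(|I|))
    must stay below the threshold gamma.  Returns (feasible, worst_statistic)."""
    r = np.asarray(residual, dtype=float)
    n = r.size
    if gamma is None:
        gamma = np.sqrt(2.0 * np.log(n))  # assumed universal-type threshold
    worst = 0.0
    length = 1
    while length <= n:
        for start in range(0, n - length + 1, length):  # disjoint dyadic blocks
            stat = abs(r[start:start + length].sum()) / (sigma * np.sqrt(length))
            worst = max(worst, stat)
        length *= 2
    return worst <= gamma, worst
```

MIND then minimizes a convex smoothness functional such as the TV norm over all estimates whose residual passes this test, which is what induces the structured constraint set handled by the Chambolle-Pock, ADMM, and semismooth Newton solvers compared in the paper.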