Practical Volume Estimation by a New Annealing Schedule for Cooling Convex Bodies
We study the problem of estimating the volume of convex polytopes, focusing on H- and V-polytopes, as well as zonotopes. Although considerable effort has been devoted to practical algorithms for H-polytopes, no such method exists for the latter two representations. We propose a new, practical algorithm for all representations, which is faster than existing methods. It relies on Hit-and-Run sampling and combines a new simulated annealing method with the Multiphase Monte Carlo (MMC) approach. Our method introduces the following key features to make it adaptive: (a) it defines the sequence of convex bodies in MMC via a new annealing schedule that is, with high probability, shorter than in previous methods, and removes the need to compute an enclosing and an inscribed ball; (b) it exploits statistical properties of rejection sampling and proposes a better empirical convergence criterion for specifying each step; (c) for zonotopes, it may use a sequence of MMC bodies other than balls, where the chosen body adapts to the input. We offer an open-source, optimized C++ implementation and analyze its performance, showing that it outperforms state-of-the-art software for H-polytopes by Cousins-Vempala (2016) and Emiris-Fisikopoulos (2018), while undertaking volume computations that were intractable until now: it is the first polynomial-time, practical method for V-polytopes and zonotopes that scales to high dimensions (currently 100). We further focus on zonotopes and characterize them by their order (number of generators over dimension), because this largely determines sampling complexity. We analyze a related application, where we evaluate methods of zonotope approximation in engineering.
Comment: 20 pages, 12 figures, 3 tables
A new Lenstra-type Algorithm for Quasiconvex Polynomial Integer Minimization with Complexity 2^O(n log n)
We study the integer minimization of a quasiconvex polynomial with quasiconvex polynomial constraints. We propose a new algorithm that improves upon the best known algorithm, due to Heinz (Journal of Complexity, 2005). The improvement is achieved by applying a new, modern Lenstra-type algorithm, finding optimal ellipsoid roundings, and considering sparse encodings of polynomials. For the bounded case, our algorithm attains a time complexity of s (r l M d)^{O(1)} 2^{2n log_2(n) + O(n)}, where M is a bound on the number of monomials in each polynomial and r is the binary encoding length of a bound on the feasible region. In the general case, the complexity is s l^{O(1)} d^{O(n)} 2^{2n log_2(n) + O(n)}. In each case we assume d >= 2 is a bound on the total degree of the polynomials and l bounds the maximum binary encoding size of the input.
Comment: 28 pages, 10 figures
Practical Volume Computation of Structured Convex Bodies, and an Application to Modeling Portfolio Dependencies and Financial Crises
We examine volume computation of general-dimensional polytopes and more general convex bodies, defined as the intersection of a simplex with a family of parallel hyperplanes and either another family of parallel hyperplanes or a family of concentric ellipsoids. Such convex bodies appear in modeling and predicting financial crises. The impact of crises on the economy (labor, income, etc.) makes their detection of prime interest for the public in general and for policy makers in particular. Certain features of dependencies in the markets clearly identify times of turmoil. We describe the relationship between asset characteristics by means of a copula; each characteristic is either a linear or a quadratic form of the portfolio components, hence the copula can be constructed by computing volumes of convex bodies.
We design and implement practical algorithms in both the exact and the approximate setting, experimentally juxtapose them, and study the tradeoff of exactness and accuracy against speed. We analyze the following methods in order of increasing generality: rejection sampling relying on uniformly sampling the simplex, which is the fastest approach but inaccurate for small volumes; exact formulae based on the computation of integrals of probability distribution functions, which are the method of choice for intersections with a single hyperplane; an optimized Lawrence sign decomposition method, since the polytopes at hand are shown to be simple with additional structure; and Markov chain Monte Carlo algorithms using random walks based on the hit-and-run paradigm, generalized to nonlinear convex bodies and relying on new methods for computing a ball enclosed in the given body, such as a second-order cone program; the latter is experimentally extended to non-convex bodies with very encouraging results. Our C++ software, based on CGAL and Eigen and available on GitHub, is shown to be very effective in up to 100 dimensions. Our results offer novel, effective means of computing portfolio dependencies and an indicator of financial crises, which is shown to correctly identify past crises.
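The rejection-sampling approach mentioned above rests on the fact that uniform points in the standard simplex are cheap to generate (e.g., by normalizing i.i.d. exponentials, a standard Dirichlet(1,...,1) construction); the volume of a slice of the simplex is then estimated as the fraction of samples falling in it. A minimal sketch, with illustrative names and a toy 2-D slice whose exact volume fraction is known:

```python
import numpy as np

def sample_simplex(n, m, rng):
    """m uniform points in the standard n-simplex {x >= 0, sum x <= 1}.

    Normalizing n+1 i.i.d. exponentials gives a uniform point on the
    probability simplex; dropping the last coordinate maps it to the
    corner simplex above.
    """
    e = rng.exponential(size=(m, n + 1))
    return e[:, :n] / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
pts = sample_simplex(2, 200_000, rng)
# Fraction of the triangle {x,y >= 0, x+y <= 1} with x <= 0.5.
# Exactly 1 - 0.5^2 = 0.75, so the estimate should land nearby.
frac = np.mean(pts[:, 0] <= 0.5)
```

As the abstract notes, this estimator degrades when the target region occupies a tiny fraction of the simplex, since almost all samples are rejected; that is where the exact formulae and MCMC methods take over.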
Uniform sampling of steady states in metabolic networks: heterogeneous scales and rounding
The uniform sampling of convex polytopes is an interesting computational problem with many applications in inference from linear constraints, but the performance of sampling algorithms can be affected by ill-conditioning. This is the case when inferring the feasible steady states in models of metabolic networks, since these can exhibit heterogeneous time scales. In this work we focus on rounding procedures based on building an ellipsoid that closely matches the sampling space, which can be used to define an efficient hit-and-run (HR) Markov chain Monte Carlo. In this way the uniformity of the sampling of the convex space of interest is rigorously guaranteed, in contrast with non-Markovian methods. We analyze and compare three rounding methods in order to sample the feasible steady states of metabolic networks of three models of growing size, up to genome scale. The first is based on principal component analysis (PCA), the second on linear programming (LP), and finally we employ the Lovász ellipsoid method (LEM). Our results show that a rounding procedure is mandatory for the application of HR to these inference problems, and suggest that a combination of LEM or LP with a subsequent PCA performs best. We finally compare the distributions obtained by HR with those of two heuristics based on the Artificially Centered Hit-and-Run (ACHR), gpSampler and optGpSampler. They show good agreement with the HR results for the small network, while they present inconsistencies on genome-scale models.
Comment: Replacement with major revision
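The idea behind PCA-based rounding can be sketched briefly: from a pilot set of points in the body, estimate the covariance and apply the linear map that whitens it, so the body becomes near-isotropic and hit-and-run mixes faster. The sketch below is illustrative only (its names are ours, and a real pipeline would obtain the pilot points from an initial sampling run rather than assume them given):

```python
import numpy as np

def pca_round(samples):
    """Estimate a rounding transform from pilot samples of a convex body.

    Whitens with the sample covariance: after the transform the point
    cloud (and hence, roughly, the body) has near-identity covariance.
    """
    mu = samples.mean(axis=0)
    cov = np.cov(samples - mu, rowvar=False)
    # Eigendecomposition of the covariance = PCA of the sample cloud.
    w, V = np.linalg.eigh(cov)
    T = V / np.sqrt(w)   # column i scaled by 1/sqrt(eigenvalue_i)
    return mu, T

# Example: a badly scaled box, mimicking heterogeneous flux scales.
rng = np.random.default_rng(2)
pts = rng.uniform([0, 0], [1, 100], size=(10_000, 2))
mu, T = pca_round(pts)
rounded = (pts - mu) @ T
cov_r = np.cov(rounded, rowvar=False)   # close to the identity
```

Sampling is then performed in the rounded space, and each sample is mapped back through the inverse transform; the LP and LEM variants compared in the paper construct the rounding ellipsoid differently but are applied in the same way.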
Discrete Differential Geometry
This is the collection of extended abstracts for the 26 lectures and the open problem session at the fourth Oberwolfach workshop on Discrete Differential Geometry
Practical Volume Computation of Structured Convex Bodies for Modeling Financial Crises
In this thesis we develop and improve methods and algorithms for the efficient solution of volume computation problems. The problems we solve arise from an application in economics concerning the construction of a mathematical model for predicting stock-market crises. The volume computation problems arise from intersecting the unit simplex, in arbitrary dimension, with two families of parallel hyperplanes, or with one family of parallel hyperplanes and one family of concentric ellipsoids. We therefore seek the volume of simple polytopes, as well as of convex or non-convex nonlinear bodies.
We examine volume computation of general-dimensional polytopes and more general convex bodies, defined as the intersection of a simplex with a family of parallel hyperplanes and either another family of parallel hyperplanes or a family of concentric ellipsoids. Such convex bodies appear in modeling and predicting financial crises.
We design and implement practical algorithms in both the exact and the approximate setting, experimentally juxtapose them, and study the tradeoff of exactness and accuracy against speed.