The complexity of tangent words
In a previous paper, we described the set of words that appear in the coding
of smooth (resp. analytic) curves at arbitrarily small scales. The aim of this
paper is to compute the complexity of those languages. (In Proceedings WORDS 2011, arXiv:1108.341)
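For context only (a hedged aside, not the paper's computation): in combinatorics on words, the complexity of a language usually means its factor complexity p(n), the number of distinct factors (contiguous subwords) of length n. The Python sketch below merely illustrates that definition on an arbitrary finite word.

    # Toy illustration of factor complexity; the word below is arbitrary,
    # not taken from the paper.
    def factor_complexity(word: str, n: int) -> int:
        """Number of distinct length-n factors of `word`."""
        return len({word[i:i + n] for i in range(len(word) - n + 1)})

    if __name__ == "__main__":
        w = "0010010100"  # an arbitrary binary word
        for n in range(1, 5):
            print(n, factor_complexity(w, n))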
Manifold Optimization Over the Set of Doubly Stochastic Matrices: A Second-Order Geometry
Convex optimization is a well-established research area with applications in
almost all fields. Over the decades, multiple approaches have been proposed to
solve convex programs. The development of interior-point methods made it
possible to solve a more general class of convex programs, known as
semidefinite programs and second-order cone programs. However, these methods
are known to be excessively slow in high dimensions, i.e., they suffer from
the curse of dimensionality. On the other hand, optimization algorithms on
manifolds have shown a great ability to find solutions to nonconvex problems
in reasonable time. This paper addresses a subset of convex optimization
problems using a different approach. The main idea behind Riemannian
optimization is to view a constrained optimization problem as an
unconstrained one over a restricted search space. The paper introduces three
manifolds for solving convex programs under particular box constraints. These
manifolds, called the doubly stochastic, the symmetric, and the definite
multinomial manifolds, generalize the simplex, also known as the multinomial
manifold. The proposed manifolds and algorithms are well suited to convex
programs in which the variable of interest is a multidimensional probability
distribution function. Theoretical analysis and simulation results attest to
the efficiency of the proposed method over state-of-the-art methods; in
particular, they show that the proposed framework outperforms conventional
generic and specialized solvers, especially in high dimensions.
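As a hedged sketch (not the paper's algorithms or notation): Riemannian gradient descent on the multinomial manifold, i.e., the interior of the probability simplex equipped with the Fisher information metric, fits in a few lines. The objective, step size, and exponential-based retraction below are illustrative assumptions.

    import numpy as np

    def riemannian_grad(x, egrad):
        """Fisher-metric gradient on the simplex: x*egrad - (x @ egrad)*x."""
        return x * egrad - (x @ egrad) * x

    def retract(x, xi):
        """Retraction that keeps iterates positive and summing to one."""
        y = x * np.exp(xi / x)
        return y / y.sum()

    def minimize_on_simplex(egrad_fn, x0, step=0.5, iters=200):
        x = x0.copy()
        for _ in range(iters):
            g = riemannian_grad(x, egrad_fn(x))
            x = retract(x, -step * g)
        return x

    if __name__ == "__main__":
        c = np.array([0.7, 0.2, 0.1])
        # f(x) = 0.5*||x - c||^2; its minimizer over the simplex is c itself.
        x = minimize_on_simplex(lambda x: x - c, np.full(3, 1 / 3))
        print(x)  # should approach [0.7, 0.2, 0.1]

The retraction follows the descent direction while staying on the manifold, which is the general pattern of treating a constrained problem as unconstrained over a restricted search space.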
A Tutorial on Fisher Information
In many statistical applications that concern mathematical psychologists, the
concept of Fisher information plays an important role. In this tutorial we
clarify the concept of Fisher information as it manifests itself across three
different statistical paradigms. First, in the frequentist paradigm, Fisher
information is used to construct hypothesis tests and confidence intervals
using maximum likelihood estimators; second, in the Bayesian paradigm, Fisher
information is used to define a default prior; lastly, in the minimum
description length paradigm, Fisher information is used to measure model
complexity.
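As a hedged illustration (not taken from the tutorial itself): for a single Bernoulli(theta) observation, the Fisher information is I(theta) = 1/(theta*(1 - theta)), and the Jeffreys default prior is proportional to sqrt(I(theta)), i.e., a Beta(1/2, 1/2) distribution. The sketch below checks the analytic value against the definition I(theta) = E[(d/dtheta log p(X|theta))^2] by Monte Carlo.

    import numpy as np

    def score(x, theta):
        """d/dtheta log p(x | theta) for a Bernoulli observation."""
        return x / theta - (1 - x) / (1 - theta)

    def fisher_info_mc(theta, n=200_000, seed=0):
        """Monte Carlo estimate of E[score^2]."""
        rng = np.random.default_rng(seed)
        x = rng.binomial(1, theta, size=n)
        return np.mean(score(x, theta) ** 2)

    if __name__ == "__main__":
        theta = 0.3
        print("analytic:", 1 / (theta * (1 - theta)))  # about 4.762
        print("Monte Carlo:", fisher_info_mc(theta))
        # The Jeffreys prior is proportional to sqrt(I(theta)),
        # which for the Bernoulli model is the Beta(1/2, 1/2) density.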
Computing the vertices of tropical polyhedra using directed hypergraphs
We establish a characterization of the vertices of a tropical polyhedron
defined as the intersection of finitely many half-spaces. We show that a point
is a vertex if, and only if, a directed hypergraph, constructed from the
subdifferentials of the active constraints at this point, admits a unique
strongly connected component that is maximal with respect to the reachability
relation (all the other strongly connected components have access to it). This
property can be checked in almost linear time. This allows us to develop a
tropical analogue of the classical double description method, which computes a
minimal internal representation (in terms of vertices) of a polyhedron defined
externally (by half-spaces or hyperplanes). We provide theoretical worst case
complexity bounds and report extensive experimental tests performed using the
library TPLib, showing that this method outperforms the other existing
approaches.
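As a hedged sketch of the underlying arithmetic only (the hypergraph construction itself is not reproduced here): tropical (max-plus) half-spaces have the form max_i (a_i + x_i) >= max_j (b_j + x_j), and the "active" terms attaining the maxima at a point are the data from which the directed hypergraph is built. The constants below are illustrative.

    import numpy as np

    NEG_INF = -np.inf  # the tropical zero (neutral element for max)

    def tropical_dot(a, x):
        """Max-plus inner product: max_i (a_i + x_i)."""
        return np.max(a + x)

    def active_terms(a, x, tol=1e-9):
        """Indices attaining the maximum in max_i (a_i + x_i)."""
        v = a + x
        return np.flatnonzero(v >= v.max() - tol)

    if __name__ == "__main__":
        # Half-space max(x1, -1 + x2) >= x2, tested at the point (3, 1).
        a = np.array([0.0, -1.0])
        b = np.array([NEG_INF, 0.0])
        x = np.array([3.0, 1.0])
        lhs, rhs = tropical_dot(a, x), tropical_dot(b, x)
        print("feasible:", lhs >= rhs)                  # True
        print("active lhs terms:", active_terms(a, x))  # [0]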
Tropicalizing the simplex algorithm
We develop a tropical analog of the simplex algorithm for linear programming.
In particular, we obtain a combinatorial algorithm to perform one tropical
pivoting step, including the computation of reduced costs, in O(n(m+n)) time,
where m is the number of constraints and n is the dimension.
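As background only (not the paper's pivoting routine): max-plus matrix-vector multiplication, y_i = max_j (A_ij + x_j), is the kind of O(mn) tropical arithmetic a pivot step can be built from, consistent with the stated O(n(m+n)) bound. The matrix and vector below are illustrative.

    import numpy as np

    def maxplus_matvec(A, x):
        """y_i = max_j (A[i, j] + x[j]) in the max-plus semiring."""
        return np.max(A + x[None, :], axis=1)

    if __name__ == "__main__":
        A = np.array([[0.0, -2.0],
                      [1.0,  0.0]])
        x = np.array([3.0, 5.0])
        print(maxplus_matvec(A, x))  # [max(3, 3), max(4, 5)] = [3. 5.]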