Unified Computation of Strict Maximum Likelihood for Geometric Fitting
A new numerical scheme is presented for computing the strict maximum likelihood (ML) solution of geometric fitting problems with an implicit constraint. Our approach orthogonally projects observations onto a parameterized surface defined by the constraint. Assuming a linearly separable nonlinear constraint, we show that a theoretically global solution can be obtained by iterative Sampson error minimization. The approach is illustrated by ellipse fitting and fundamental matrix computation. Our method also encompasses optimal correction, e.g., computing perpendiculars to an ellipse and triangulating stereo images. A detailed discussion of technical and practical issues concerning our approach is also given.
Objective Improvement in Information-Geometric Optimization
Information-Geometric Optimization (IGO) is a unified framework of stochastic
algorithms for optimization problems. Given a family of probability
distributions, IGO turns the original optimization problem into a new
maximization problem on the parameter space of the probability distributions.
IGO updates the parameter of the probability distribution along the natural
gradient, taken with respect to the Fisher metric on the parameter manifold,
aiming at maximizing an adaptive transform of the objective function. IGO
recovers several known algorithms as particular instances: for the family of
Bernoulli distributions IGO recovers PBIL, for the family of Gaussian
distributions the pure rank-mu CMA-ES update is recovered, and for exponential
families in expectation parametrization the cross-entropy/ML method is
recovered. This article provides a theoretical justification for the IGO
framework, by proving that any step size not greater than 1 guarantees monotone
improvement over the course of optimization, in terms of q-quantile values of
the objective function f. The range of admissible step sizes is independent of
f and its domain. We extend the result to cover the case of different step
sizes for blocks of the parameters in the IGO algorithm. Moreover, we prove
that expected fitness improves over time when fitness-proportional selection is
applied, in which case the RPP algorithm is recovered.
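For the Bernoulli family the IGO natural-gradient step takes a particularly simple form, since the Fisher metric cancels against the score: the parameter moves toward a weighted mean of the sampled bit strings. The following is a minimal PBIL-flavored sketch with truncation (quantile) weights and a step size below 1, consistent with the monotone-improvement condition above; the function name and the clipping safeguard are our own choices:

```python
import numpy as np

def igo_bernoulli(f, dim, lam=20, mu=5, eta=0.2, iters=200, rng=None):
    """IGO natural-gradient ascent for a factorized Bernoulli family.
    With weights w_j summing to 1, the natural gradient of the transformed
    objective is sum_j w_j * (x_j - p), so the update stays in [0, 1]."""
    rng = np.random.default_rng(rng)
    p = np.full(dim, 0.5)
    for _ in range(iters):
        X = (rng.random((lam, dim)) < p).astype(float)  # sample population
        fx = np.array([f(x) for x in X])
        order = np.argsort(-fx)                 # best first (maximization)
        w = np.zeros(lam)
        w[order[:mu]] = 1.0 / mu                # truncation / quantile weights
        p = p + eta * (w @ (X - p))             # natural-gradient step, eta <= 1
        p = np.clip(p, 0.05, 0.95)              # keep sampling non-degenerate
    return p
```

On a simple objective such as OneMax (maximize the number of ones), the parameter vector drifts toward the upper clipping bound.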
DBI models for the unification of dark matter and dark energy
We propose a model based on a DBI action for the unification of dark matter
and dark energy. This is supported by the results of the study of its
background behavior at early and late times, and reinforced by the analysis of
the evolution of perturbations. We also perform a Bayesian analysis to set
observational constraints on the parameters of the model using type Ia SN, CMB
shift and BAO data. Finally, to complete the study we investigate its
kinematics aspects, such as the effective equation of state parameter,
acceleration parameter and transition redshift. Particularizing those
parameters for the best fit one appreciates that an effective phantom is
preferred.Comment: 11 pages, 8 figures, revtex, new reference
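For reference, the kinematic diagnostics named in the abstract have standard model-independent definitions in terms of the Hubble rate $H(z)$ (these are textbook relations, not specific to the DBI model):

```latex
q(z) = -\frac{\ddot a\, a}{\dot a^{2}} = -1 + (1+z)\,\frac{H'(z)}{H(z)},
\qquad
w_{\mathrm{eff}}(z) = \frac{2q(z)-1}{3} = -1 + \frac{2}{3}\,(1+z)\,\frac{H'(z)}{H(z)},
```

with the transition redshift $z_t$ defined by $q(z_t) = 0$, and effective phantom behaviour corresponding to $w_{\mathrm{eff}} < -1$.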
Most Likely Transformations
We propose and study properties of maximum likelihood estimators in the class
of conditional transformation models. Based on a suitable explicit
parameterisation of the unconditional or conditional transformation function,
we establish a cascade of increasingly complex transformation models that can
be estimated, compared and analysed in the maximum likelihood framework. Models
for the unconditional or conditional distribution function of any univariate
response variable can be set up and estimated in the same theoretical and
computational framework simply by choosing an appropriate transformation
function and parameterisation thereof. The ability to evaluate the distribution
function directly allows us to estimate models based on the exact likelihood,
especially in the presence of random censoring or truncation. For discrete and
continuous responses, we establish the asymptotic normality of the proposed
estimators. A reference software implementation of maximum likelihood-based
estimation for conditional transformation models allowing the same flexibility
as the theory developed here was employed to illustrate the wide range of
possible applications. Comment: Accepted for publication by the Scandinavian Journal of Statistics, 2017-06-1.
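The core idea of a transformation model, F_Y(y) = F_Z(h(y)) with a monotone transformation h, can be illustrated in the simplest parameterisation h(y) = a + b·y with F_Z = Φ (which reduces to a Gaussian model). This is a minimal sketch of maximum likelihood in that framework, not the authors' reference software; the function name is ours:

```python
import numpy as np

def fit_linear_transformation(y, iters=5000, lr=0.1):
    """MLE for the transformation model F_Y(y) = Phi(h(y)) with
    h(y) = a + b*y, b > 0, by gradient ascent on the exact log-likelihood
    sum_i [ log phi(a + b*y_i) + log b ], where log b is the Jacobian term
    coming from the change of variables."""
    a, b = 0.0, 1.0
    for _ in range(iters):
        z = a + b * y
        a += lr * (-z.mean())                    # d/da of mean log-likelihood
        b += lr * (-(z * y).mean() + 1.0 / b)    # d/db, includes Jacobian term
        b = max(b, 1e-6)                         # keep h monotone increasing
    return a, b
```

For Gaussian data the fixed point is b = 1/σ̂ and a = -μ̂/σ̂, i.e., h maps the response to a standard normal scale, which is exactly the transformation-model reading of the Gaussian location-scale family.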
Recognising Multidimensional Euclidean Preferences
Euclidean preferences are a widely studied preference model, in which
decision makers and alternatives are embedded in d-dimensional Euclidean space.
Decision makers prefer those alternatives closer to them. This model, also
known as multidimensional unfolding, has applications in economics,
psychometrics, marketing, and many other fields. We study the problem of
deciding whether a given preference profile is d-Euclidean. For the
one-dimensional case, polynomial-time algorithms are known. We show that, in
contrast, for every other fixed dimension d > 1, the recognition problem is
equivalent to the existential theory of the reals (ETR), and so in particular
NP-hard. We further show that some Euclidean preference profiles require
exponentially many bits in order to specify any Euclidean embedding, and prove
that the domain of d-Euclidean preferences does not admit a finite forbidden
minor characterisation for any d > 1. We also study dichotomous preferences and
the behaviour of other metrics, and survey a variety of related work.
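The verification direction of this problem is easy: given a candidate embedding, checking whether it induces a given strict profile is a matter of sorting distances. The hardness (ETR-completeness for d > 1) lies in finding an embedding, and the exponential-precision result above means a small rational certificate need not even exist. A minimal sketch of the check, with function names of our own choosing:

```python
import numpy as np

def induced_profile(voters, alts):
    """Strict preference profile induced by a Euclidean embedding:
    each voter ranks alternatives by increasing distance to their own
    position (assumes no exact distance ties)."""
    dist = np.linalg.norm(voters[:, None, :] - alts[None, :, :], axis=2)
    return [tuple(np.argsort(row)) for row in dist]

def embedding_realises(profile, voters, alts):
    """Does the candidate embedding realise the given strict profile?"""
    return induced_profile(voters, alts) == [tuple(r) for r in profile]
```

For example, in one dimension, voters at 0 and 3 with alternatives at 0.5, 2, and 4.5 realise the profile ((0, 1, 2), (1, 2, 0)) but not the unanimous profile.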