Accurate and linear time pose estimation from points and lines
The Perspective-n-Point (PnP) problem seeks to estimate the pose of a calibrated camera from n 3D-to-2D point correspondences. There are situations, though, where PnP solutions are prone to fail because feature point correspondences cannot be reliably estimated (e.g. scenes with repetitive patterns or with low texture). In such
scenarios, one can still exploit alternative geometric entities, such as lines, yielding the so-called Perspective-n-Line (PnL) algorithms. Unfortunately, existing PnL solutions are not as accurate and efficient as their point-based
counterparts. In this paper we propose a novel approach to introduce 3D-to-2D line correspondences into a PnP formulation, making it possible to process points and lines simultaneously. For this purpose we introduce an algebraic line error
that can be formulated as linear constraints on the line endpoints, even when these are not directly observable. These constraints can then be naturally integrated within the linear formulations of two state-of-the-art point-based algorithms,
the OPnP and the EPnP, allowing them to handle points, lines, or a combination of both. Exhaustive experiments show that the proposed formulation brings a remarkable boost in performance compared to point-only or
line-only solutions, with a negligible computational overhead compared to the original OPnP and EPnP.
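The key ingredient, an algebraic line error that is linear in the (possibly unobserved) 3D line endpoints, can be sketched in a few lines. The following is a minimal illustration, not the paper's OPnP/EPnP integration: it assumes a calibrated pinhole camera with intrinsics K, an observed 2D line given by normalized homogeneous coefficients, and uses OpenCV's EPnP solver as the point-only baseline; all names are illustrative.

```python
import numpy as np
import cv2

def line_endpoint_error(R, t, K, P, l):
    """Signed distance of the projected 3D endpoint P from the observed 2D line l.

    l = (a, b, c) are homogeneous line coefficients with a*u + b*v + c = 0,
    normalized so that a**2 + b**2 == 1. The error l^T x is linear in the
    endpoint coordinates once the pose (R, t) is fixed, which is what lets
    such constraints slot into a linear PnP formulation.
    """
    x = K @ (R @ P + t)      # project the endpoint
    x = x / x[2]             # homogeneous pixel coordinates (u, v, 1)
    return float(l @ x)

def estimate_pose_from_points(pts3d, pts2d, K):
    """Point-only baseline using OpenCV's EPnP solver."""
    ok, rvec, tvec = cv2.solvePnP(
        pts3d.astype(np.float64), pts2d.astype(np.float64),
        K.astype(np.float64), None, flags=cv2.SOLVEPNP_EPNP)
    R, _ = cv2.Rodrigues(rvec)
    return ok, R, tvec.reshape(3)
```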
Network-based ranking in social systems: three challenges
Ranking algorithms are pervasive in our increasingly digitized societies,
with important real-world applications including recommender systems, search
engines, and influencer marketing practices. From a network science
perspective, network-based ranking algorithms solve fundamental problems
related to the identification of vital nodes for the stability and dynamics of
a complex system. Despite the ubiquitous and successful applications of these
algorithms, we argue that our understanding of their performance and their
applications to real-world problems face three fundamental challenges: (1)
rankings might be biased by various factors; (2) their effectiveness might be
limited to specific problems; and (3) agents' decisions driven by rankings
might result in potentially vicious feedback mechanisms and unhealthy systemic
consequences. Methods rooted in network science and agent-based modeling can
help us to understand and overcome these challenges.
Comment: Perspective article. 9 pages, 3 figures
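To make "network-based ranking" concrete, here is a small sketch using PageRank, one standard node-ranking algorithm from the network-science toolbox; the toy graph and parameters are illustrative and not taken from the article.

```python
import networkx as nx

# Toy directed network; PageRank scores nodes by their importance
# within the network's link structure.
G = nx.DiGraph([("a", "b"), ("b", "c"), ("c", "a"), ("d", "c")])
scores = nx.pagerank(G, alpha=0.85)                     # alpha = damping factor
ranking = sorted(scores, key=scores.get, reverse=True)  # highest score first
print(ranking)
```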
MLPnP - A Real-Time Maximum Likelihood Solution to the Perspective-n-Point Problem
In this paper, a statistically optimal solution to the Perspective-n-Point
(PnP) problem is presented. Many solutions to the PnP problem are geometrically
optimal, but do not consider the uncertainties of the observations. In
addition, it would be desirable to have an internal estimation of the accuracy
of the estimated rotation and translation parameters of the camera pose. Thus,
we propose a novel maximum likelihood solution to the PnP problem that
incorporates image observation uncertainties and remains real-time capable at
the same time. Further, the presented method is general, as it works with 3D
direction vectors instead of 2D image points and is thus able to cope with
arbitrary central camera models. This is achieved by projecting (and thus
reducing) the covariance matrices of the observations to the corresponding
vector tangent space.
Comment: Submitted to the ISPRS congress (2016) in Prague. Oral presentation.
Published in ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., III-3,
131-13
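The covariance-reduction step described above can be sketched as follows. This is a minimal illustration under the assumption that each observation is a unit bearing vector with a 3x3 covariance; the function names are hypothetical and not taken from the paper's code.

```python
import numpy as np

def tangent_basis(v):
    """Two orthonormal vectors spanning the tangent plane of the unit sphere at v."""
    v = v / np.linalg.norm(v)
    # Pick a helper axis that is not parallel to v.
    helper = np.array([1.0, 0.0, 0.0]) if abs(v[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    r = np.cross(v, helper)
    r /= np.linalg.norm(r)
    s = np.cross(v, r)
    return np.stack([r, s], axis=1)        # 3x2 matrix [r | s]

def reduce_covariance(v, Sigma):
    """Project a 3x3 bearing-vector covariance onto the 2D tangent space at v."""
    J = tangent_basis(v)                   # 3x2
    return J.T @ Sigma @ J                 # 2x2 covariance in tangent coordinates
```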
Approximate Computation and Implicit Regularization for Very Large-scale Data Analysis
Database theory and database practice are typically the domain of computer
scientists who adopt what may be termed an algorithmic perspective on their
data. This perspective is very different than the more statistical perspective
adopted by statisticians, scientific computers, machine learners, and other who
work on what may be broadly termed statistical data analysis. In this article,
I will address fundamental aspects of this algorithmic-statistical disconnect,
with an eye to bridging the gap between these two very different approaches. A
concept that lies at the heart of this disconnect is that of statistical
regularization, a notion that has to do with how robust the output of an
algorithm is to the noise properties of the input data. Although it is nearly
completely absent from computer science, which historically has taken the input
data as given and modeled algorithms discretely, regularization in one form or
another is central to nearly every application domain that applies algorithms
to noisy data. By using several case studies, I will illustrate, both
theoretically and empirically, the nonobvious fact that approximate
computation, in and of itself, can implicitly lead to statistical
regularization. This and other recent work suggests that, by exploiting in a
more principled way the statistical properties implicit in worst-case
algorithms, one can in many cases satisfy the bicriteria of having algorithms
that are scalable to very large-scale databases and that also have good
inferential or predictive properties.
Comment: To appear in the Proceedings of the 2012 ACM Symposium on Principles
of Database Systems (PODS 2012)
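A toy illustration of the approximation-as-regularization theme, not a reproduction of the paper's case studies: solving a noisy, ill-conditioned least-squares problem with a rank-k truncated SVD is both cheaper than the exact solve and equivalent to a spectral form of regularization; the data and cutoff k below are made up for the example.

```python
import numpy as np

def truncated_lstsq(A, b, k):
    """Least squares via a rank-k truncated SVD: an approximate solver that also
    acts as a spectral regularizer, since small singular directions (where noise
    dominates) are dropped instead of being inverted."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 50)) @ np.diag(np.logspace(0, -4, 50))  # ill-conditioned design
x_true = rng.normal(size=50)
b = A @ x_true + 0.01 * rng.normal(size=200)

x_exact = np.linalg.lstsq(A, b, rcond=None)[0]   # full, noise-amplifying solution
x_trunc = truncated_lstsq(A, b, k=20)            # cheaper and implicitly regularized
print(np.linalg.norm(x_exact - x_true), np.linalg.norm(x_trunc - x_true))
```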