An Event-Based Solution to the Perspective-n-Point Problem

Abstract

The goal of the Perspective-n-Point problem (PnP) is to find the relative pose between an object and a camera from a set of n pairings between 3D points and their corresponding 2D projections on the focal plane. Current state-of-the-art solutions, designed to operate on images, rely on computationally expensive minimization techniques. For the first time, this work introduces an event-based PnP algorithm designed to work on the output of a neuromorphic event-based vision sensor. The problem is formulated here as a least-squares minimization problem, where the error function is updated with every incoming event. The optimal translation is then computed in closed form, while the desired rotation is given by the evolution of a virtual mechanical system whose energy is proven to be equal to the error function. This allows for a simple yet robust solution of the problem, showing how event-based vision can simplify computer vision tasks. The approach takes full advantage of the high temporal resolution of the sensor, as the estimated pose is incrementally updated with every incoming event. Two approaches are proposed: the Full and the Efficient methods. These two methods are compared against a state-of-the-art PnP algorithm on both synthetic and real data, achieving similar accuracy while being faster.
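As a rough illustration of the event-driven structure the abstract describes, the sketch below updates a pose estimate one event at a time: each event refreshes the constraint attached to one model point, the translation is re-solved in closed form for the current rotation, and the rotation is nudged by a small descent step on SO(3). This is an assumption-laden sketch, not the paper's implementation: it uses the classical object-space error with line-of-sight projectors, and replaces the authors' virtual mechanical system with a plain gradient step; the class EventPnP and its parameters are hypothetical names.

```python
# Hypothetical sketch: event-driven PnP with an object-space least-squares error.
# Assumptions (not from the paper): line-of-sight projectors for the error terms,
# and a plain gradient step on SO(3) in place of the virtual mechanical system.
import numpy as np


def line_of_sight_projector(bearing):
    """Projection matrix V = v v^T onto the (unit) line of sight of an event."""
    v = np.asarray(bearing, float)
    v = v / np.linalg.norm(v)
    return np.outer(v, v)


class EventPnP:
    """Keeps a running pose estimate (R, t) and refines it with every event."""

    def __init__(self, model_points, step=1e-2):
        self.P = np.asarray(model_points, float)  # n x 3 known object points
        self.V = {}                               # point index -> latest projector
        self.R = np.eye(3)
        self.t = np.zeros(3)
        self.step = step                          # rotation step size (tuning knob)

    def _closed_form_t(self):
        """Optimal translation for the current rotation: a 3x3 linear solve."""
        A = sum(np.eye(3) - V for V in self.V.values())
        b = -sum((np.eye(3) - V) @ (self.R @ self.P[i]) for i, V in self.V.items())
        return np.linalg.lstsq(A, b, rcond=None)[0]

    def update(self, point_index, bearing):
        """Process one event: a new bearing measurement of model point `point_index`."""
        self.V[point_index] = line_of_sight_projector(bearing)
        self.t = self._closed_form_t()

        # Gradient of the accumulated error with respect to a small rotation.
        torque = np.zeros(3)
        for i, V in self.V.items():
            q = self.R @ self.P[i]
            r = (np.eye(3) - V) @ (q + self.t)    # residual of this correspondence
            torque += np.cross(q, r)

        # One damped descent step, retracted onto SO(3) via Rodrigues' formula.
        w = -self.step * torque
        angle = np.linalg.norm(w)
        if angle > 1e-12:
            k = w / angle
            K = np.array([[0.0, -k[2], k[1]],
                          [k[2], 0.0, -k[0]],
                          [-k[1], k[0], 0.0]])
            dR = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
            self.R = dR @ self.R
        return self.R, self.t
```

In this sketch the per-event cost stays small because only one projector changes at a time and the translation solve is a fixed 3x3 system, which mirrors the incremental, high-rate update the abstract emphasizes.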


