    Sensor Integration Using State Estimators

    Computing Position and Orientation of a Freeflying Polyhedron from 3D

    A robotic vision system for grasping a free-flying polyhedron in space is developed using stereo vision and laser range finders. Real-time motion estimation and sensor fusion are achieved by using a priori knowledge of the object. A Maximum Likelihood parameter estimator is developed for rotationally symmetric polyhedra, and the 3D transformations for fusing the different sensors into one coordinate frame are given. The minimization is solved using the Sequential Quadratic Programming technique, which has proved to be a robust and efficient method for this problem. Included are simulation results performed on the hardware that will be used in the ROTEX experiment.
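
    The sketch below is not the authors' implementation; it only illustrates the kind of pipeline the abstract describes: a Maximum Likelihood pose estimate (position and orientation) from 3D point measurements that have already been fused into one sensor frame, minimized with a Sequential Quadratic Programming solver (here SciPy's SLSQP). The object geometry, noise level, and pose parameterization are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

# Known object geometry: vertices of the polyhedron in its body frame
# (illustrative values, not the ROTEX target geometry).
model_points = np.array([[0.1, 0.0, 0.0],
                         [0.0, 0.1, 0.0],
                         [0.0, 0.0, 0.1],
                         [-0.1, -0.1, 0.0]])

# Simulated measurements, assumed already transformed into one common
# coordinate frame (this is where the sensor-to-sensor 3D transformations
# mentioned in the abstract would be applied).
rng = np.random.default_rng(0)
true_pose = np.array([0.05, -0.02, 0.3, 0.1, 0.2, -0.1])  # [t_x, t_y, t_z, rotvec]
R_true = Rotation.from_rotvec(true_pose[3:]).as_matrix()
measurements = (model_points @ R_true.T + true_pose[:3]
                + 0.001 * rng.standard_normal(model_points.shape))
sigma = 0.001  # assumed isotropic Gaussian measurement noise [m]

def neg_log_likelihood(pose):
    """Negative Gaussian log-likelihood of the measurements given a pose."""
    t, rotvec = pose[:3], pose[3:]
    R = Rotation.from_rotvec(rotvec).as_matrix()
    residuals = measurements - (model_points @ R.T + t)
    return 0.5 * np.sum(residuals ** 2) / sigma ** 2

# Sequential Quadratic Programming step: SciPy's SLSQP minimizes the
# negative log-likelihood over the 6-DOF pose.
result = minimize(neg_log_likelihood, x0=np.zeros(6), method="SLSQP")
print("estimated translation:     ", result.x[:3])
print("estimated rotation vector: ", result.x[3:])
```

    In practice the estimator would also exploit the rotational symmetry of the polyhedron and run recursively over time for real-time motion estimation; the single-shot least-squares fit above only shows the basic ML-plus-SQP structure.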