    Integrating Visual Information Across Camera Movements with a Visual-Motor Calibration Map

    Facing the competing demands for wider field of view and higher spatial resolution, computer vision will evolve toward greater use of foveal sensors and frequent camera movements. Integration of visual information across movements becomes a fundamental problem. We show that integration is possible using a biologically inspired representation we call the visual-motor calibration map. The map is a memory-based model of the relationship between camera movements and corresponding pixel locations before and after any movement. The map constitutes a self-calibration that can compensate for non-uniform sampling, lens distortion, mechanical misalignments, and arbitrary pixel reordering. Integration takes place entirely in a retinotopic frame, using a short-term, predictive visual memory.

    Introduction

    The competing demands for wider field of view and higher spatial resolution suggest that computer vision systems will inevitably progress towards the tradeoff that evolution selected for animal v..
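    The abstract describes the calibration map as a memory-based model relating camera movements to corresponding pixel locations before and after each movement. As a rough illustration only (the class name, discretized movement keys, and toy pan example below are assumptions, not details from the paper), such a map can be sketched as a lookup table populated by observed correspondences and queried for prediction:

    ```python
    from collections import defaultdict

    class CalibrationMap:
        """Memory-based map: (movement, pre-movement pixel) -> post-movement pixel.

        Because correspondences are stored rather than derived from a lens or
        geometry model, the map can absorb non-uniform sampling, distortion,
        misalignment, and even arbitrary pixel reordering.
        """

        def __init__(self):
            # One correspondence table per discrete movement command.
            self.table = defaultdict(dict)

        def record(self, movement, src, dst):
            # Store one observed correspondence for this movement.
            self.table[movement][src] = dst

        def predict(self, movement, src):
            # Predict where a pixel lands after the movement; None if unseen.
            return self.table[movement].get(src)

    # Toy calibration: on an ideal sensor, a pan of +2 pixels shifts
    # image content left by 2 (a real foveal sensor would yield a far
    # less regular table, which is the point of storing it explicitly).
    m = CalibrationMap()
    for x in range(10):
        for y in range(10):
            m.record(("pan", 2), (x, y), (x - 2, y))

    print(m.predict(("pan", 2), (5, 5)))  # (3, 5)
    ```

    A short-term predictive visual memory, as described in the abstract, would use such predictions to carry pixel values forward in the retinotopic frame across each movement.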