Rolling Shutter Stereo
A huge fraction of today's cameras are based on CMOS sensors with a rolling shutter that exposes the image line by line. For dynamic scenes or moving cameras this introduces undesired effects like stretch, shear, and wobble. It has been shown earlier that rolling shutter effects induced by rotational shake in hand-held cell phone capture can be compensated based on an estimate of the camera rotation. In contrast, we analyse the case of significant camera motion, e.g. where a passing street-level capture vehicle uses a rolling shutter camera in a 3D reconstruction framework. The introduced error is depth dependent and cannot be compensated based on camera motion/rotation alone, which also invalidates rectification for stereo camera systems. Moreover, significant lens distortion, as often present in wide-angle cameras, intertwines with rolling shutter effects, since it changes the time at which a given 3D point is seen. We show that naive 3D reconstructions (assuming a global shutter) deliver biased geometry already under very mild assumptions on vehicle speed and resolution. We then develop rolling shutter dense multiview stereo algorithms that solve for time of exposure and depth simultaneously, even in the presence of lens distortion, and we evaluate them on ground-truth laser scan models as well as on real street-level data.
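The depth dependence of the error can be sketched with a toy fixed-point iteration: the row a 3D point lands on fixes the time that row was exposed, which fixes the camera position, which in turn moves the point's projection. This is a minimal illustration, not the paper's algorithm; all camera parameters and function names below are assumptions.

```python
# Illustrative constants (assumed, not from the paper): pinhole camera,
# no lens distortion, vehicle moving at constant speed along x.
FOCAL = 1000.0          # focal length in pixels (assumed)
CX, CY = 960.0, 540.0   # principal point (assumed)
LINE_DELAY = 30e-6      # read-out time per image row in seconds (assumed)
SPEED = 10.0            # vehicle speed in m/s along the x-axis (assumed)

def project(point, cam_x):
    """Pinhole projection of a 3D point seen from a camera at (cam_x, 0, 0)."""
    x, y, z = point[0] - cam_x, point[1], point[2]
    return FOCAL * x / z + CX, FOCAL * y / z + CY

def rolling_shutter_project(point, iters=10):
    """Fixed-point iteration: the row determines the exposure time, the
    exposure time determines the camera position, and the camera position
    determines the row the point projects to."""
    row = CY  # initial guess: middle of the image
    u = CX
    for _ in range(iters):
        t = row * LINE_DELAY            # exposure time of the current row
        u, row = project(point, SPEED * t)
    return u, row

# The horizontal shift relative to a global-shutter camera shrinks with
# depth, so it cannot be undone from camera motion alone.
u_near, _ = rolling_shutter_project((0.0, 0.0, 20.0))  # point at 20 m
u_far, _ = rolling_shutter_project((0.0, 0.0, 40.0))   # point at 40 m
```

A point straight ahead at 20 m is shifted roughly twice as far in the image as the same direction at 40 m, which is exactly why a motion-only (depth-agnostic) compensation cannot work here.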
Wireless Software Synchronization of Multiple Distributed Cameras
We present a method for precisely time-synchronizing the capture of image
sequences from a collection of smartphone cameras connected over WiFi. Our
method is entirely software-based, has only modest hardware requirements, and
achieves an accuracy of less than 250 microseconds on unmodified commodity
hardware. It does not use image content and synchronizes cameras prior to
capture. The algorithm operates in two stages. In the first stage, we designate
one device as the leader and synchronize each client device's clock to it by
estimating network delay. Once clocks are synchronized, the second stage
initiates continuous image streaming, estimates the relative phase of image
timestamps between each client and the leader, and shifts the streams into
alignment. We quantitatively validate our results on a multi-camera rig imaging
a high-precision LED array and qualitatively demonstrate significant
improvements to multi-view stereo depth estimation and stitching of dynamic
scenes. We release as open source 'libsoftwaresync', an Android implementation
of our system, to inspire new types of collective capture applications.
Comment: Main: 9 pages, 10 figures. Supplemental: 3 pages, 5 figures.
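The first stage, synchronizing each client's clock to the leader by estimating network delay, can be sketched as an NTP-style exchange. This is a hedged illustration of the general technique under simple assumptions; `estimate_clock_offset` and `fake_exchange` are hypothetical names, not part of libsoftwaresync, and the paper's exact protocol may differ.

```python
import random
import time

def estimate_clock_offset(exchange, rounds=50):
    """NTP-style offset estimation between a client and a leader clock.

    `exchange` is an assumed callback that returns the leader's clock
    reading, sampled at some unknown instant during the round trip.
    Keeping the estimate from the round with the smallest round-trip
    time (RTT) bounds the network-delay error most tightly.
    """
    best_rtt, best_offset = float("inf"), 0.0
    for _ in range(rounds):
        t0 = time.monotonic()
        t_leader = exchange()
        t1 = time.monotonic()
        rtt = t1 - t0
        if rtt < best_rtt:
            # Assume the leader clock was sampled at the round-trip
            # midpoint; the residual error is then at most rtt / 2.
            best_rtt, best_offset = rtt, t_leader - (t0 + t1) / 2.0
    return best_offset, best_rtt

# Simulated leader whose clock runs 0.5 s ahead of the client, reached
# over a link with up to 2 ms of random delay each way (illustrative).
TRUE_OFFSET = 0.5

def fake_exchange():
    time.sleep(random.uniform(0.0, 0.002))   # request delay
    reading = time.monotonic() + TRUE_OFFSET
    time.sleep(random.uniform(0.0, 0.002))   # reply delay
    return reading

offset, rtt = estimate_clock_offset(fake_exchange, rounds=30)
# offset recovers TRUE_OFFSET with an error of at most rtt / 2
```

The second stage described in the abstract, estimating the relative phase of frame timestamps and shifting the streams into alignment, builds directly on the offset recovered here.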