Calibration by correlation using metric embedding from non-metric similarities
This paper presents a new intrinsic calibration method that allows us to calibrate a generic single-view point camera just
by waving it around. From the video sequence obtained while the camera undergoes random motion, we compute the pairwise time
correlation of the luminance signal for a subset of the pixels. We show that, if the camera undergoes a random uniform motion, then
the pairwise correlation of any pair of pixels is a function of the distance between the pixel directions on the visual sphere. This leads to
formalizing calibration as a problem of metric embedding from non-metric measurements: we want to find the disposition of pixels on
the visual sphere from similarities that are an unknown function of the distances. This problem is a generalization of multidimensional
scaling (MDS) that has so far resisted a comprehensive observability analysis (can we reconstruct a metrically accurate embedding?)
and a solid generic solution (how to do so?). We show that the observability depends both on the local geometric properties (curvature)
and on the global topological properties (connectedness) of the target manifold. We show that, in contrast to the Euclidean case,
on the sphere we can recover the scale of the point distribution, therefore obtaining a metrically accurate solution from non-metric
measurements. We describe an algorithm that is robust across manifolds and can recover a metrically accurate solution when the metric
information is observable. We demonstrate the performance of the algorithm for several cameras (pin-hole, fish-eye, omnidirectional),
and we obtain results comparable to calibration using classical methods. Additional synthetic benchmarks show that the algorithm
performs as theoretically predicted for all corner cases of the observability analysis.
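The embedding step described in this abstract can be illustrated generically: if similarities are an unknown decreasing function of distance, only their ranks are trustworthy, so one can replace them by rank-based proxy distances and apply classical MDS. This is a minimal sketch of the non-metric embedding idea, not the paper's robust algorithm; the function names and the toy similarity exp(-d) are assumptions of this sketch.

```python
import numpy as np

def classical_mds(D, dim=2):
    # Classical MDS: double-center the squared distances, then take the
    # top eigenpairs of the resulting Gram matrix.
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

def embed_from_similarities(S, dim=2):
    # Non-metric step: similarities are assumed to be an unknown
    # *decreasing* function of distance, so only their ranks are used.
    n = S.shape[0]
    iu = np.triu_indices(n, 1)
    ranks = np.argsort(np.argsort(-S[iu])).astype(float)
    D = np.zeros((n, n))
    D[iu] = ranks / ranks.max()   # monotone proxy distances in [0, 1]
    D += D.T
    return classical_mds(D, dim)

# Toy check: planar points, similarity = exp(-distance) (unknown to the solver).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
D_true = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = embed_from_similarities(np.exp(-D_true), dim=2)
```

Because only ranks are used, the recovered configuration preserves the ordering of distances rather than the exact metric; the paper's contribution is precisely the analysis of when the metric (including scale, on the sphere) is also observable.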
A practical multirobot localization system
We present fast and precise vision-based software intended for multiple-robot localization. The core component of the software is a novel and efficient algorithm for black-and-white pattern detection. The method is robust to variable lighting conditions, achieves sub-pixel precision, and its computational complexity is independent of the processed image size. With off-the-shelf computational equipment and low-cost cameras, the core algorithm is able to process hundreds of images per second while tracking hundreds of objects with millimeter precision. In addition, we present the method's mathematical model, which allows one to estimate the expected localization precision, area of coverage, and processing speed from the camera's intrinsic parameters and the hardware's processing capacity. The correctness of the presented model and the performance of the algorithm in real-world conditions are verified in several experiments. Apart from the method description, we also make its source code public at http://purl.org/robotics/whycon, so it can be used as an enabling technology for various mobile robotic problems.
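The paper's detector is more sophisticated than what fits in an abstract, but the sub-pixel idea it relies on can be illustrated with a generic intensity-weighted centroid over a dark circular marker. This is a hedged sketch, not the published algorithm; the function name and the synthetic blob are assumptions.

```python
import numpy as np

def subpixel_centroid(img, threshold):
    # Intensity-weighted centroid of the below-threshold (dark) pixels:
    # darker pixels contribute more weight, giving sub-pixel precision.
    ys, xs = np.nonzero(img < threshold)
    w = threshold - img[ys, xs]
    return (xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()

# Synthetic dark circular blob centred between pixel positions.
yy, xx = np.mgrid[0:32, 0:32]
img = 1.0 - np.exp(-((xx - 15.3) ** 2 + (yy - 16.7) ** 2) / 8.0)
cx, cy = subpixel_centroid(img, threshold=0.9)
```

Because the weights are symmetric about the true center, the estimate is far more precise than one pixel, which is how a detector of this kind can reach millimeter-level localization with low-cost cameras.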
Locally Non-rigid Registration for Mobile HDR Photography
Image registration for stack-based HDR photography is challenging. If not
properly accounted for, camera motion and scene changes result in artifacts in
the composite image. Unfortunately, existing methods to address this problem
are either accurate but too slow for mobile devices, or fast but prone to
failure. We propose a method that fills this void: our approach is extremely
fast---under 700ms on a commercial tablet for a pair of 5MP images---and
prevents the artifacts that arise from insufficient registration quality.
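The paper's method is locally non-rigid, which an abstract cannot capture in code; as a generic illustration of the registration problem it addresses, here is the standard phase-correlation trick often used as a fast, global pre-alignment step (purely a textbook sketch under my own naming, not the authors' method).

```python
import numpy as np

def phase_correlation_shift(ref, moved):
    # Cross-power spectrum normalized to unit magnitude -> a sharp
    # correlation peak at the integer translation between the frames.
    F = np.conj(np.fft.fft2(ref)) * np.fft.fft2(moved)
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the frame to negative values.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx

rng = np.random.default_rng(1)
frame = rng.random((64, 64))
shifted = np.roll(frame, (3, -5), axis=(0, 1))   # simulate camera motion
dy, dx = phase_correlation_shift(frame, shifted)
```

A global translation like this handles camera shake but not scene changes, which is exactly the gap that locally non-rigid registration is meant to close.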
Detection of brown dwarfs by the micro-lensing of unresolved stars
The presence of brown dwarfs in the dark galactic halo could be detected
through their gravitational lensing effect and experiments under way monitor
about one million stars to observe a few lensing events per year. We show that
if the photon flux from a galaxy is measured with good precision, it is not
necessary to resolve the stars and, moreover, more events could be observed.
Comment: 14 p., LaTeX, 4 figures available on request, PAR-LPTHE 92 39/LPC 92 1
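The lensing events monitored in such experiments follow the standard point-lens light curve, with magnification A(u) = (u² + 2)/(u√(u² + 4)) for impact parameter u in Einstein radii. As a hedged sketch (the function names are mine, and the paper's analysis of unresolved "pixel" lensing goes well beyond this formula):

```python
import math

def magnification(u):
    # Paczynski point-source, point-lens magnification; u is the
    # lens-source separation in units of the Einstein radius.
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

def light_curve(t, t0, tE, u0):
    # Magnification versus time for an event peaking at t0, with
    # Einstein crossing time tE and minimum impact parameter u0.
    u = math.hypot(u0, (t - t0) / tE)
    return magnification(u)
```

For u0 = 0.5 the peak magnification is about 2.18; unresolved lensing looks for this transient flux excess on top of the blended light of many stars rather than in a single resolved source.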
Particle Detection Algorithms for Complex Plasmas
In complex plasmas, the behavior of freely floating micrometer-sized
particles is studied. The particles can be directly visualized and recorded by
digital video cameras. To analyze the dynamics of single particles, reliable
algorithms are required to accurately determine their positions to sub-pixel
accuracy from the recorded images. Typically, straightforward algorithms are
used for this task. Here, we combine the algorithms with common techniques for
image processing. We study several algorithms and pre- and post-processing
methods, and we investigate the impact of the choice of threshold parameters,
including an automatic threshold detection. The results quantitatively show
that each algorithm and method has its own advantage, often depending on the
problem at hand. This knowledge is not only applicable to complex plasmas but
also useful for any kind of comparable image-based particle tracking, e.g. in
the field of colloids or granular matter.
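A minimal sketch of the pipeline this abstract describes, automatic thresholding followed by sub-pixel, intensity-weighted centroids, might look as follows. Otsu's method and the SciPy labelling routine are my choices for illustration, not necessarily the algorithms benchmarked in the paper.

```python
import numpy as np
from scipy import ndimage

def otsu_threshold(img, bins=256):
    # Pick the threshold that maximizes the between-class variance.
    hist, edges = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)
    w1 = 1.0 - w0
    mu = np.cumsum(p * centers)
    m0 = mu / np.where(w0 > 0, w0, 1.0)
    m1 = (mu[-1] - mu) / np.where(w1 > 0, w1, 1.0)
    var_between = w0 * w1 * (m0 - m1) ** 2
    return centers[np.argmax(var_between)]

def detect_particles(img):
    # Threshold automatically, label connected blobs, and return
    # intensity-weighted (sub-pixel) centroids as (y, x) tuples.
    mask = img > otsu_threshold(img)
    labels, n = ndimage.label(mask)
    return ndimage.center_of_mass(img * mask, labels, range(1, n + 1))

# Synthetic frame with two Gaussian "particles" at sub-pixel positions.
yy, xx = np.mgrid[0:64, 0:64]
img = (np.exp(-((xx - 20.4) ** 2 + (yy - 12.6) ** 2) / 4.0)
       + np.exp(-((xx - 45.1) ** 2 + (yy - 50.8) ** 2) / 4.0))
centroids = sorted(detect_particles(img))
```

The choice of threshold changes which pixels enter each centroid, which is exactly why the paper's quantitative comparison of thresholding and pre-/post-processing variants matters for tracking accuracy.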