Keyframe-based monocular SLAM: design, survey, and future directions
Extensive research in the field of monocular SLAM over the past fifteen years
has yielded workable systems that have found their way into various applications
in robotics and augmented reality. Although filter-based monocular SLAM systems
were common at one time, the more efficient keyframe-based solutions are
becoming the de facto methodology for building a monocular SLAM system. The
objective of this paper is threefold: first, the paper serves as a guideline
for readers seeking to design their own monocular SLAM system according to
specific environmental constraints; second, it presents a survey covering the
various keyframe-based monocular SLAM systems in the literature, detailing the
components of their implementations and critically assessing the strategies
adopted in each proposed solution; third, the paper provides insight into the
direction of future research in this field, to address the major limitations
still facing monocular SLAM, namely illumination changes, initialization,
highly dynamic motion, poorly textured scenes, repetitive textures, map
maintenance, and failure recovery.
Non-linear Kalman filters for calibration in radio interferometry
We present a new calibration scheme based on a non-linear version of the
Kalman filter that aims at estimating the physical terms appearing in the Radio
Interferometry Measurement Equation (RIME). We enrich the filter's structure
with a tunable data representation model, together with an augmented
measurement model for regularization. We show using simulations that it can
properly estimate the physical effects appearing in the RIME. We found that
this approach is particularly useful in the most extreme cases such as when
ionospheric and clock effects are simultaneously present. Combined with the
ability to provide prior knowledge on the expected structure of the physical
instrumental effects (expected physical state and dynamics), we obtain a fairly
cheap algorithm that we believe to be robust, especially in the low
signal-to-noise regime. The use of such filters and similar methods can
potentially improve calibration in radio interferometry, provided that the
effects corrupting the visibilities are understood and analytically stable.
Recursive algorithms are particularly well suited to pre-calibration and sky
model estimation in a streaming fashion. This may be useful for SKA-type
instruments, which produce huge amounts of data that must be calibrated before
being averaged.
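As a toy illustration of the filtering idea only (not the authors' algorithm), the sketch below runs an extended Kalman filter on a single drifting instrumental phase observed through a non-linear measurement map; the real RIME involves many coupled Jones terms, and all noise levels here are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for one RIME term: an instrumental phase drifting as a
# random walk, observed through the non-linear map phi -> (cos phi, sin phi).
n_steps = 200
q, r = 1e-3, 0.05                  # process / measurement noise variances (assumed)
true_phase = np.cumsum(rng.normal(0.0, np.sqrt(q), n_steps))
meas = np.stack([np.cos(true_phase), np.sin(true_phase)], axis=1)
meas += rng.normal(0.0, np.sqrt(r), meas.shape)

# Extended Kalman filter with scalar state x (the phase).
x, P = 0.0, 1.0
est = np.empty(n_steps)
for k in range(n_steps):
    P = P + q                                  # predict: random-walk dynamics
    h = np.array([np.cos(x), np.sin(x)])       # predicted measurement
    H = np.array([-np.sin(x), np.cos(x)])      # Jacobian of h at the prediction
    S = P * np.outer(H, H) + r * np.eye(2)     # innovation covariance
    K = P * H @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ (meas[k] - h)                  # update state
    P = (1.0 - K @ H) * P                      # update variance
    est[k] = x

rmse = np.sqrt(np.mean((est[50:] - true_phase[50:]) ** 2))
print(f"post-burn-in phase RMSE: {rmse:.3f} rad")
```

The prior knowledge mentioned in the abstract would enter through the process model (here a plain random walk) and the initial state and covariance.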
Robust Estimation and Wavelet Thresholding in Partial Linear Models
This paper is concerned with a semiparametric partially linear regression
model with unknown regression coefficients, an unknown nonparametric function
for the non-linear component, and unobservable Gaussian distributed random
errors. We present a wavelet thresholding based estimation procedure to
estimate the components of the partial linear model by establishing a
connection between an l1-penalty based wavelet estimator of the
nonparametric component and Huber's M-estimation of a standard linear model
with outliers. Some general results on the large sample properties of the
estimates of both the parametric and the nonparametric part of the model are
established. Simulations and a real example are used to illustrate the general
results and to compare the proposed methodology with other methods available in
the recent literature.
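The penalty-robustness connection underlying the paper can be seen in a one-dimensional toy identity (a standard fact, not the paper's full semiparametric result): profiling out an l1-penalised location parameter yields exactly the Huber loss, and the minimiser is the soft-thresholding rule.

```python
import numpy as np

lam = 1.0

def soft_threshold(y, lam):
    """Closed-form minimizer of 0.5*(y - t)**2 + lam*|t| over t."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def huber(y, lam):
    """Huber loss: quadratic near zero, linear in the tails."""
    a = np.abs(y)
    return np.where(a <= lam, 0.5 * a ** 2, lam * a - 0.5 * lam ** 2)

# Profiling the l1-penalised objective reproduces the Huber loss:
# min_t 0.5*(y - t)^2 + lam*|t| = huber(y, lam), attained at t = soft_threshold(y, lam).
y = np.linspace(-4, 4, 401)
t = soft_threshold(y, lam)
profiled = 0.5 * (y - t) ** 2 + lam * np.abs(t)
print("profiled objective equals Huber loss:", bool(np.allclose(profiled, huber(y, lam))))
```

This is why thresholding the wavelet coefficients of the nonparametric part behaves like a robust M-estimate of the linear part: the l1 penalty turns outlying coefficients into Huber-style linear, rather than quadratic, contributions.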
Optimal Dark Hole Generation via Two Deformable Mirrors with Stroke Minimization
The past decade has seen significant growth in research targeting space-based
observatories for imaging exo-solar planets. The challenge lies in designing an
imaging system for high contrast. Even with a perfect coronagraph
that modifies the point spread function to achieve high contrast, wavefront
sensing and control is needed to correct the errors in the optics and generate
a "dark hole". The high-contrast imaging laboratory at Princeton University is
equipped with two Boston Micromachines Kilo-DMs. We review here an algorithm
designed to achieve high contrast on both sides of the image plane while
minimizing the stroke necessary from each deformable mirror (DM). This
algorithm uses the first DM to correct for amplitude aberrations and the second
DM to create a flat wavefront in the pupil plane. We then show the first
results obtained at Princeton with this correction algorithm, and we
demonstrate a symmetric dark hole in monochromatic light.
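A minimal sketch of the stroke-versus-contrast trade-off, assuming a linearised actuator-to-field Jacobian with random toy values (this is not the Princeton lab's two-DM algorithm or its model): adding a Tikhonov term on the actuator commands penalises stroke at the cost of residual field.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linearised model: focal-plane field responds to DM actuator commands a
# through a Jacobian G (random illustrative values, not a real optical model).
n_pix, n_act = 120, 40
G = rng.normal(size=(n_pix, n_act))
E0 = rng.normal(size=n_pix)                 # aberrated field to be cancelled

def correct(alpha):
    """Solve min_a ||E0 + G a||^2 + alpha * ||a||^2 in closed form.
    The Tikhonov weight alpha penalises total actuator stroke."""
    return np.linalg.solve(G.T @ G + alpha * np.eye(n_act), -G.T @ E0)

residual = lambda a: np.linalg.norm(E0 + G @ a)

a_free = correct(alpha=1e-6)                # (almost) unregularised correction
a_reg = correct(alpha=10.0)                 # stroke-limited correction
print(f"stroke: {np.linalg.norm(a_reg):.2f} vs {np.linalg.norm(a_free):.2f}; "
      f"residual: {residual(a_reg):.2f} vs {residual(a_free):.2f}")
```

Sweeping alpha traces the trade-off curve: larger alpha always gives smaller stroke and a larger residual field.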
LiDAR-derived digital holograms for automotive head-up displays
A holographic automotive head-up display was developed to project 2D and 3D ultra-high-definition (UHD) images derived from LiDAR data into the driver's field of view. The LiDAR data were collected with a 3D terrestrial laser scanner and converted to computer-generated holograms (CGHs). The reconstructions were obtained with a HeNe laser and a UHD spatial light modulator with a panel resolution of 3840×2160 px for replay-field projections. By decreasing the focal distance of the CGHs, the zero-order spot was diffused into the holographic replay-field image. 3D holograms were observed floating as a ghost image at a variable focal distance by incorporating a digital Fresnel lens into the CGH together with a concave lens.
This project was funded by the EPSRC Centre for Doctoral Training in Connected Electronic and Photonic Systems (CEPS) (EP/S022139/1), Project Reference: 2249444.
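The digital-Fresnel-lens step can be sketched as adding a quadratic phase profile to the CGH; the wavelength, pixel pitch, and focal distance below are assumed illustrative values, not the display's actual hardware parameters.

```python
import numpy as np

# Quadratic ("digital Fresnel lens") phase profile; adding it modulo 2*pi to a
# phase-only CGH shifts the replay-field focal distance. All parameter values
# are assumptions for illustration.
wavelength = 633e-9        # HeNe laser wavelength in metres
pitch = 3.74e-6            # assumed SLM pixel pitch in metres
n = 1024                   # hologram side length in pixels
f = 0.5                    # desired focal distance in metres

x = (np.arange(n) - n // 2) * pitch
X, Y = np.meshgrid(x, x)
lens_phase = -np.pi / (wavelength * f) * (X ** 2 + Y ** 2)
lens_phase_wrapped = np.mod(lens_phase, 2 * np.pi)   # wrap for a phase-only SLM

print("lens phase shape:", lens_phase_wrapped.shape)
```

Varying f moves the reconstructed image's focal plane, which is how a variable apparent depth can be produced without moving optics.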
Bayesian Cosmic Web Reconstruction: BARCODE for Clusters
We describe the Bayesian BARCODE formalism that has been designed towards the
reconstruction of the Cosmic Web in a given volume on the basis of the sampled
galaxy cluster distribution. The formalism builds on the realization that
massive compact clusters are responsible for the major share of the large-scale
tidal force field shaping the anisotropic, and in particular filamentary,
features of the Cosmic Web. Given the nonlinearity of the constraints imposed
by the cluster
configurations, we resort to a state-of-the-art constrained reconstruction
technique to find a proper statistically sampled realization of the original
initial density and velocity field in the same cosmic region. Ultimately, the
subsequent gravitational evolution of these initial conditions towards the
implied Cosmic Web configuration can be followed on the basis of a proper
analytical model or an N-body computer simulation. The BARCODE formalism
includes an implicit treatment for redshift space distortions. This enables a
direct reconstruction on the basis of observational data, without the need for
a correction of redshift space artifacts. In this contribution we provide a
general overview of the Cosmic Web connection with clusters and a
description of the Bayesian BARCODE formalism. We conclude with a presentation
of its successful workings with respect to test runs based on a simulated large
scale matter distribution, in physical space as well as in redshift space.
Comment: 18 pages, 8 figures, Proceedings of IAU Symposium 308 "The Zeldovich
Universe: Genesis and Growth of the Cosmic Web", 23-28 June 2014, Tallinn,
Estonia
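The "state-of-the-art constrained reconstruction technique" is not named in the abstract; assuming a Hamiltonian Monte Carlo sampler (as used in related Bayesian field-reconstruction work), the core sampling step can be sketched on a toy Gaussian posterior. The real posterior would couple the initial density field to the cluster data through a structure-formation model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for the constrained sampling step: Hamiltonian Monte Carlo on a
# Gaussian "initial density field" posterior with identity precision.
# Everything here is a simplifying assumption for illustration.
n = 16
prec = np.eye(n)                     # toy posterior precision (prior + data)

def grad_U(x):
    """Gradient of the negative log posterior U(x) = 0.5 * x' prec x."""
    return prec @ x

def hmc_step(x, eps=0.1, n_leap=20):
    """One leapfrog trajectory followed by a Metropolis accept/reject."""
    p0 = rng.normal(size=n)
    x_new, p = x.copy(), p0 - 0.5 * eps * grad_U(x)
    for i in range(n_leap):
        x_new = x_new + eps * p
        g = grad_U(x_new)
        p = p - (eps if i < n_leap - 1 else 0.5 * eps) * g
    h0 = 0.5 * x @ prec @ x + 0.5 * p0 @ p0          # initial Hamiltonian
    h1 = 0.5 * x_new @ prec @ x_new + 0.5 * p @ p    # final Hamiltonian
    return x_new if np.log(rng.uniform()) < h0 - h1 else x

x = np.zeros(n)
chain = []
for _ in range(2000):
    x = hmc_step(x)
    chain.append(x.copy())
samples = np.array(chain[500:])      # discard burn-in
print(f"posterior std estimate: {samples.std():.3f} (exact value: 1.0)")
```

Gradient-based samplers of this kind scale to the very high-dimensional fields such reconstructions require, where random-walk Metropolis would mix far too slowly.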