
    Frame Combination Techniques for Ultra High-Contrast Imaging

    We summarize here an experimental frame combination pipeline we developed for ultra high-contrast imaging with systems like the upcoming VLT SPHERE instrument. The pipeline combines strategies from the Drizzle technique, the Spitzer IRACproc package, and homegrown codes to combine image sets that may include a rotating field of view and arbitrary shifts between frames. The pipeline is meant to be robust at dealing with data that may contain non-ideal effects like sub-pixel pointing errors, missing data points, non-symmetrical noise sources, arbitrary geometric distortions, and rapidly changing point spread functions. We summarize in this document the individual steps and strategies, as well as results from preliminary tests and simulations.
    Comment: 9 pages, 4 figures, SPIE conference paper
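
    The de-rotate/shift/stack step this abstract describes can be sketched roughly as follows. This is a minimal illustration rather than the pipeline itself: the bilinear interpolation order, the NaN padding for missing data, and the 3-sigma clipping threshold are assumptions made for the example.

    import numpy as np
    from scipy import ndimage

    def combine_frames(frames, shifts, angles, clip_sigma=3.0):
        """frames: list of 2-D arrays; shifts: (dy, dx) per frame; angles: field rotation in degrees."""
        registered = []
        for frame, (dy, dx), angle in zip(frames, shifts, angles):
            # Undo the field rotation and the sub-pixel pointing offset; bilinear
            # interpolation (order=1) keeps the NaN padding from spreading globally.
            reg = ndimage.rotate(frame, -angle, reshape=False, order=1, cval=np.nan)
            reg = ndimage.shift(reg, (-dy, -dx), order=1, cval=np.nan)
            registered.append(reg)
        stack = np.array(registered)
        # Sigma-clipped mean over frames, ignoring missing (NaN) pixels.
        med = np.nanmedian(stack, axis=0)
        std = np.nanstd(stack, axis=0)
        stack[np.abs(stack - med) > clip_sigma * std] = np.nan
        return np.nanmean(stack, axis=0)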

    Theory, Design, and Implementation of Landmark Promotion Cooperative Simultaneous Localization and Mapping

    Simultaneous Localization and Mapping (SLAM) is a challenging problem in practice, and the use of multiple robots and inexpensive sensors places even more demands on the designer. Cooperative SLAM poses specific challenges in the areas of computational efficiency, software/network performance, and robustness to errors. New methods in image processing, recursive filtering, and SLAM have been developed to implement practical algorithms for cooperative SLAM on a set of inexpensive robots. The Consolidated Unscented Mixed Recursive Filter (CUMRF) is designed to handle non-linear systems with non-Gaussian noise. This is accomplished using the Unscented Transform combined with Gaussian Mixture Models. The Robust Kalman Filter is an extension of the Kalman Filter algorithm that improves the ability to remove erroneous observations using Principal Component Analysis (PCA) and the X84 outlier rejection rule. Forgetful SLAM is a local SLAM technique that runs in nearly constant time relative to the number of visible landmarks and improves poorly performing sensors through sensor fusion and outlier rejection. Forgetful SLAM correlates all measured observations but stops the state from growing over time. Hierarchical Active Ripple SLAM (HAR-SLAM) is a new SLAM architecture that breaks the traditional state space of SLAM into a chain of smaller state spaces, allowing multiple robots, multiple sensors, and multiple updates to occur in linear time with linear storage with respect to the number of robots, landmarks, and robot poses. This dissertation presents explicit methods for closing the loop, joining multiple robots, and active updates. Landmark Promotion SLAM is a hierarchy of new SLAM methods, using the Robust Kalman Filter, Forgetful SLAM, and HAR-SLAM. Practical aspects of SLAM are a focus of this dissertation. LK-SURF is a new image processing technique that combines Lucas-Kanade feature tracking with Speeded-Up Robust Features to perform spatial and temporal tracking. Typical stereo correspondence techniques fail to provide descriptors for features or fail at temporal tracking. Several calibration and modeling techniques are also covered, including calibrating stereo cameras, aligning stereo cameras to an inertial system, and making neural-net system models. These methods are important for improving the quality of the data and images acquired for the SLAM process.
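
    Of the components listed above, the X84 outlier rejection rule used in the Robust Kalman Filter is compact enough to sketch. X84 rejects observations whose residual lies more than about 5.2 median absolute deviations from the median (roughly 3.5 robust standard deviations for Gaussian noise); the threshold value and this NumPy formulation are illustrative assumptions, not code from the dissertation.

    import numpy as np

    def x84_inliers(residuals, k=5.2):
        """Return a boolean mask marking the residuals kept by the X84 rule."""
        residuals = np.asarray(residuals, dtype=float)
        med = np.median(residuals)
        mad = np.median(np.abs(residuals - med))   # median absolute deviation
        if mad == 0.0:
            # Degenerate case: more than half of the residuals are identical.
            return np.ones(residuals.shape, dtype=bool)
        return np.abs(residuals - med) <= k * mad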

    Robust Chauvenet Outlier Rejection

    Sigma clipping is commonly used in astronomy for outlier rejection, but the number of standard deviations beyond which one should clip data from a sample ultimately depends on the size of the sample. Chauvenet rejection is one of the oldest, and simplest, ways to account for this, but, like sigma clipping, it depends on the sample's mean and standard deviation, neither of which are robust quantities: both are easily contaminated by the very outliers they are being used to reject. Many more robust measures of central tendency, and of sample deviation, exist, but each trades precision for that robustness. Here, we demonstrate that outlier rejection can be both very robust and very precise if decreasingly robust but increasingly precise techniques are applied in sequence. To this end, we present a variation on Chauvenet rejection that we call "robust" Chauvenet rejection (RCR), which uses three decreasingly robust/increasingly precise measures of central tendency, and four decreasingly robust/increasingly precise measures of sample deviation. We show this sequential approach to be very effective for a wide variety of contaminant types, even when a significant -- even dominant -- fraction of the sample is contaminated, and especially when the contaminants are strong. Furthermore, we have developed a bulk-rejection variant to significantly decrease computing times, and RCR can be applied both to weighted data and when fitting parameterized models to data. We present aperture photometry in a contaminated, crowded field as an example. RCR may be used by anyone at https://skynet.unc.edu/rcr, and source code is available there as well.
    Comment: 62 pages, 48 figures, 7 tables, accepted for publication in ApJ
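
    For reference, classical (iterated) Chauvenet rejection, the baseline that RCR builds on, can be sketched as below: a point is discarded when the expected number of equally deviant samples falls below one half. RCR's contribution is to replace the mean and standard deviation in this loop with progressively more robust measures; the sketch keeps the classical, non-robust versions and rejects one point per pass, which is a simplification for brevity.

    import numpy as np
    from scipy.stats import norm

    def chauvenet_reject(data):
        """Iteratively apply Chauvenet's criterion; returns the surviving sample."""
        data = np.asarray(data, dtype=float)
        while data.size > 2:
            mu, sigma = data.mean(), data.std(ddof=1)
            if sigma == 0.0:
                break
            z = np.abs(data - mu) / sigma
            # Expected number of samples at least this far from the mean.
            expected = data.size * 2.0 * norm.sf(z)
            worst = np.argmax(z)
            if expected[worst] < 0.5:          # Chauvenet's criterion
                data = np.delete(data, worst)
            else:
                break
        return data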

    Online Bivariate Outlier Detection in Final Test Using Kernel Density Estimation

    In parametric IC testing, outlier detection is applied to filter out potentially unreliable devices. Most outlier detection methods are used in an offline setting and hence are not applicable to Final Test, where immediate pass/fail decisions are required. Therefore, we developed a new bivariate online outlier detection method that is applicable to Final Test without making assumptions about a specific form of relation between two test parameters. An acceptance region is constructed using kernel density estimation. We use a grid discretization in order to enable a fast outlier decision. The grid is updated after each accepted device, so the method is able to adapt to shifting measurements.
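
    A rough sketch of the acceptance-region construction is given below: a Gaussian kernel density estimate of the two test parameters is evaluated once on a discretized grid, so that each subsequent pass/fail decision reduces to a table lookup. The grid resolution, SciPy's default bandwidth, and the density-quantile threshold are illustrative assumptions, and the online grid update after each accepted device is omitted.

    import numpy as np
    from scipy.stats import gaussian_kde

    def build_acceptance_grid(train_xy, grid_size=200, density_quantile=0.01):
        """train_xy: (2, n) array of the two test parameters for known-good devices."""
        kde = gaussian_kde(train_xy)
        x = np.linspace(train_xy[0].min(), train_xy[0].max(), grid_size)
        y = np.linspace(train_xy[1].min(), train_xy[1].max(), grid_size)
        xx, yy = np.meshgrid(x, y)
        density = kde(np.vstack([xx.ravel(), yy.ravel()])).reshape(grid_size, grid_size)
        threshold = np.quantile(kde(train_xy), density_quantile)
        return x, y, density >= threshold            # boolean acceptance grid

    def is_outlier(device_xy, x, y, accept_grid):
        """Fast online decision: look up the device's grid cell (to within one grid step)."""
        i = int(np.clip(np.searchsorted(y, device_xy[1]), 0, accept_grid.shape[0] - 1))
        j = int(np.clip(np.searchsorted(x, device_xy[0]), 0, accept_grid.shape[1] - 1))
        return not accept_grid[i, j]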

    Implementation of robust image artifact removal in SWarp through clipped mean stacking

    We implement an algorithm for detecting and removing artifacts from astronomical images by means of outlier rejection during stacking. Our method is capable of addressing both small, highly significant artifacts such as cosmic rays and, by applying a filtering technique to generate single frame masks, larger area but lower surface brightness features such as secondary (ghost) images of bright stars. In contrast to the common method of building a median stack, the clipped or outlier-filtered mean stacked point-spread function (PSF) is a linear combination of the single frame PSFs as long as the latter are moderately homogeneous, a property of great importance for weak lensing shape measurement or model fitting photometry. In addition, it has superior noise properties, allowing a significant reduction in exposure time compared to median stacking. We make publicly available a modified version of SWarp that implements clipped mean stacking and software to generate single frame masks from the list of outlier pixels.
    Comment: PASP accepted; software for download at http://www.usm.uni-muenchen.de/~dgruen
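
    The core clipped-mean idea can be sketched as follows for a cube of registered exposures: pixels that deviate strongly from the per-pixel median across frames are masked before averaging. The MAD-based scatter estimate and the kappa threshold are assumptions made for this example; the actual SWarp modification additionally filters the outlier maps into single-frame masks to catch extended, low-surface-brightness artifacts.

    import numpy as np

    def clipped_mean_stack(exposures, kappa=4.0):
        """exposures: (n_frames, ny, nx) array of registered, background-subtracted frames."""
        stack = np.asarray(exposures, dtype=float)
        med = np.median(stack, axis=0)
        # Robust per-pixel scatter from the median absolute deviation across frames.
        sigma = 1.4826 * np.median(np.abs(stack - med), axis=0)
        outliers = np.abs(stack - med) > kappa * np.maximum(sigma, 1e-12)
        return np.nanmean(np.where(outliers, np.nan, stack), axis=0)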

    Learning and Matching Multi-View Descriptors for Registration of Point Clouds

    Critical to the registration of point clouds is the establishment of a set of accurate correspondences between points in 3D space. The correspondence problem is generally addressed by the design of discriminative 3D local descriptors on the one hand, and the development of robust matching strategies on the other. In this work, we first propose a multi-view local descriptor, learned from images of multiple views, for the description of 3D keypoints. Then, we develop a robust matching approach that aims to reject outlier matches through efficient inference via belief propagation on a defined graphical model. We demonstrate the improvement our approaches bring to registration on public scanning and multi-view stereo datasets; the superior performance is verified by extensive comparisons against a variety of descriptors and matching methods.
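
    The outlier rejection in this work runs belief propagation on a graphical model over candidate matches, which does not reduce to a short snippet; the sketch below only shows the common first-pass filter that such a stage would then refine, namely mutual nearest-neighbour matching of the learned descriptors with a distance-ratio test. The function name and the ratio threshold are illustrative assumptions.

    import numpy as np

    def mutual_nn_matches(desc_a, desc_b, ratio=0.8):
        """desc_a: (n, d) and desc_b: (m, d) descriptor arrays; returns (i, j) index pairs."""
        dists = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
        nn_ab = dists.argmin(axis=1)                  # best match in B for each A descriptor
        nn_ba = dists.argmin(axis=0)                  # best match in A for each B descriptor
        matches = []
        for i, j in enumerate(nn_ab):
            if nn_ba[j] != i:                         # keep only mutual nearest neighbours
                continue
            second_best = np.partition(dists[i], 1)[1]
            if dists[i, j] < ratio * second_best:     # Lowe-style ratio test
                matches.append((i, j))
        return matches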

    Modeling Perceptual Aliasing in SLAM via Discrete-Continuous Graphical Models

    Perceptual aliasing is one of the main causes of failure for Simultaneous Localization and Mapping (SLAM) systems operating in the wild. Perceptual aliasing is the phenomenon where different places generate a similar visual (or, in general, perceptual) footprint. This causes spurious measurements to be fed to the SLAM estimator, which typically results in incorrect localization and mapping results. The problem is exacerbated by the fact that those outliers are highly correlated, in the sense that perceptual aliasing creates a large number of mutually-consistent outliers. Another issue stems from the fact that most state-of-the-art techniques rely on a given trajectory guess (e.g., from odometry) to discern between inliers and outliers; this makes the resulting pipeline brittle, since the accumulation of error may result in incorrect choices, and recovery from failures is far from trivial. This work provides a unified framework to model perceptual aliasing in SLAM and provides practical algorithms that can cope with outliers without relying on any initial guess. We present two main contributions. The first is a Discrete-Continuous Graphical Model (DC-GM) for SLAM: the continuous portion of the DC-GM captures the standard SLAM problem, while the discrete portion describes the selection of the outliers and models their correlation. The second contribution is a semidefinite relaxation to perform inference in the DC-GM that returns estimates with provable sub-optimality guarantees. Experimental results on standard benchmarking datasets show that the proposed technique compares favorably with state-of-the-art methods while not relying on an initial guess for optimization.
    Comment: 13 pages, 14 figures, 1 table
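
    A toy illustration of the discrete-continuous structure is sketched below for a 1-D pose graph: the continuous variables are robot positions, and a binary variable switches each loop closure on or off. The paper performs inference with a semidefinite relaxation carrying sub-optimality guarantees; this sketch instead uses naive alternating minimization, so it illustrates only the model, not the method or its guarantees, and every name and threshold in it is an assumption.

    import numpy as np

    def dc_toy_slam(odometry, loop_closures, sigma=0.1, iters=10):
        """odometry: relative 1-D displacements; loop_closures: list of (i, j, measured d_ij)."""
        n = len(odometry) + 1
        x = np.concatenate([[0.0], np.cumsum(odometry)])    # initial guess from odometry alone
        active = np.ones(len(loop_closures), dtype=bool)     # discrete part: which closures are inliers
        for _ in range(iters):
            # Discrete step: switch off closures whose residual exceeds 3 sigma.
            residuals = np.array([x[j] - x[i] - d for i, j, d in loop_closures])
            active = np.abs(residuals) < 3.0 * sigma
            # Continuous step: least squares over poses using odometry plus active closures.
            rows, vals = [], []
            for k, delta in enumerate(odometry):
                r = np.zeros(n)
                r[[k, k + 1]] = [-1.0, 1.0]
                rows.append(r)
                vals.append(delta)
            for keep, (i, j, d) in zip(active, loop_closures):
                if keep:
                    r = np.zeros(n)
                    r[[i, j]] = [-1.0, 1.0]
                    rows.append(r)
                    vals.append(d)
            anchor = np.zeros(n)
            anchor[0] = 1.0                                   # fix the first pose at the origin
            rows.append(anchor)
            vals.append(0.0)
            x, *_ = np.linalg.lstsq(np.array(rows), np.array(vals), rcond=None)
        return x, active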