
    Precise pose estimation of the NASA Mars 2020 Perseverance rover through a stereo-vision-based approach

    Visual Odometry (VO) is a fundamental technique for enhancing the navigation capabilities of planetary exploration rovers. By processing images acquired during motion, VO methods estimate the relative position and attitude between navigation steps through the detection and tracking of two-dimensional (2D) image keypoints. This mitigates the trajectory inconsistencies that dead-reckoning techniques suffer under slippage conditions. We present here an independent analysis of the high-resolution stereo images of the NASA Mars 2020 Perseverance rover to retrieve its accurate localization on sols 65, 66, 72, and 120. The stereo pairs are processed with a 3D-to-3D stereo-VO approach that is based on consolidated techniques and accounts for the main nonlinear optical effects characterizing real cameras. The algorithm is first validated on rectified stereo images acquired by the NASA Mars Exploration Rover Opportunity, and then applied to the determination of Perseverance's path. The results suggest that our reconstructed path is consistent with the telemetered trajectory, which was retrieved directly onboard the rover's system. The estimated pose is in full agreement with the archived rover position and attitude after short navigation steps. Significant differences (~10–30 cm) between our reconstructed and telemetered trajectories are observed when Perseverance traveled distances larger than 1 m between the acquisition of stereo pairs.
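
    The core of a 3D-to-3D stereo-VO step is the least-squares rigid alignment of the same keypoints triangulated from two consecutive stereo pairs. The sketch below shows that alignment via the standard Kabsch/Horn SVD solution; it is a minimal illustration with synthetic data, not the authors' implementation, and a real pipeline would wrap it in an outlier-rejection scheme such as RANSAC.

```python
# Minimal sketch of the 3D-to-3D alignment at the heart of stereo VO: given
# matched 3D keypoints before and after a motion step, recover the rigid
# transform (R, t) mapping the first point set onto the second.
import numpy as np

def estimate_rigid_transform(pts_prev, pts_curr):
    """Least-squares rigid transform (Kabsch/Horn) between two Nx3 point sets."""
    c_prev = pts_prev.mean(axis=0)
    c_curr = pts_curr.mean(axis=0)
    # Cross-covariance of the centered point clouds.
    H = (pts_prev - c_prev).T @ (pts_curr - c_curr)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_curr - R @ c_prev
    return R, t

# Synthetic check: rotate/translate a random cloud and recover the motion.
rng = np.random.default_rng(0)
pts = rng.uniform(-2.0, 2.0, size=(50, 3))
theta = np.deg2rad(5.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.3, 0.0, 0.05])
R_est, t_est = estimate_rigid_transform(pts, pts @ R_true.T + t_true)
assert np.allclose(R_est, R_true, atol=1e-8)
```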

    Geometrical Calibration for the Panrover: a Stereo Omnidirectional System for Planetary Rover

    Abstract. A novel panoramic stereo imaging system is proposed in this paper. The system provides 360° stereoscopic vision, useful for rover autonomous driving, while simultaneously capturing a high-resolution stereo scene. The core of the concept is a novel "bifocal panoramic lens" (BPL) based on the hyper-hemispheric model (Pernechele et al. 2016). This BPL records a panoramic field of view (FoV) and, simultaneously, an area belonging to that FoV with a given degree of magnification, using a single image sensor. This strategy makes it possible to avoid rotational mechanisms. Using two BPLs arranged along a vertical baseline (a system called PANROVER) allows monitoring of the surrounding environment in stereoscopic (3D) mode while simultaneously capturing high-resolution stereoscopic images for scientific analysis, making it a new paradigm in the planetary rover framework. Unlike the majority of Mars systems, which rely on rotational mechanisms to acquire panoramic images (mosaicked on the ground), the PANROVER contains no moving components and can deliver high-rate stereo images of the surrounding panorama. The scope of this work is the geometric calibration of the panoramic acquisition system with omnidirectional calibration methods (Scaramuzza et al. 2006) based on the Zhang calibration grid. The procedures are applied to obtain well-rectified, synchronized stereo images suitable for 3D reconstruction. We previously applied a Zhang chessboard-based approach during the STC/SIMBIO-SYS stereo camera calibration (Simioni et al. 2014, 2017). In this case the targets of the calibration are the stereo heads (the BPLs) of the PANROVER, with the aim of extracting the intrinsic parameters of the optical systems. Unlike previous pipelines, the extrinsic parameters are estimated using the same data bench.
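
    For a sense of the Zhang-style workflow (detect a chessboard in many poses, then jointly fit the intrinsics), here is a hedged sketch using OpenCV's contrib `omnidir` module. Note the swap: OpenCV implements the related Mei omnidirectional model rather than Scaramuzza's polynomial model used in the paper; the board geometry and image folder below are assumptions.

```python
# Zhang-grid calibration loop for an omnidirectional camera (Mei model via
# OpenCV contrib). Requires opencv-contrib-python.
import glob
import cv2
import numpy as np

BOARD = (9, 6)    # inner corners of the calibration grid (assumed)
SQUARE = 0.025    # square size in meters (assumed)

# One canonical set of 3D board points, reused for every view.
objp = np.zeros((1, BOARD[0] * BOARD[1], 3), np.float64)
objp[0, :, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):   # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (5, 5), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_COUNT, 30, 1e-6))
    obj_points.append(objp)
    img_points.append(corners.reshape(1, -1, 2).astype(np.float64))

criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_COUNT, 200, 1e-8)
rms, K, xi, D, rvecs, tvecs, idx = cv2.omnidir.calibrate(
    obj_points, img_points, gray.shape[::-1],
    None, None, None, cv2.omnidir.CALIB_FIX_SKEW, criteria)
print(f"RMS reprojection error: {rms:.3f} px")
```

    Running the same procedure independently on each BPL yields the per-head intrinsics; the extrinsics between the two heads can then be estimated from the same bench data, as the abstract describes.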

    ๋กœ๋ฒ„ ํ•ญ๋ฒ•์„ ์œ„ํ•œ ์ž๊ฐ€๋ณด์ • ์˜์ƒ๊ด€์„ฑ ์˜ค๋„๋ฉ”ํŠธ๋ฆฌ

    ํ•™์œ„๋…ผ๋ฌธ (์„์‚ฌ)-- ์„œ์šธ๋Œ€ํ•™๊ต ๋Œ€ํ•™์› : ๊ณต๊ณผ๋Œ€ํ•™ ๊ธฐ๊ณ„ํ•ญ๊ณต๊ณตํ•™๋ถ€, 2019. 2. ๋ฐ•์ฐฌ๊ตญ.This master's thesis presents a direct visual odometry robust to illumination changes and a self-calibrated visual-inertial odometry for a rover localization using an IMU and a stereo camera. Most of the previous vision-based localization algorithms are vulnerable to sudden brightness changes due to strong sunlight or a variance of the exposure time, that violates Lambertian surface assumption. Meanwhile, to decrease the error accumulation of a visual odometry, an IMU can be employed to fill gaps between successive images. However, extrinsic parameters for a visual-inertial system should be computed precisely since they play an important role in making a bridge between the visual and inertial coordinate frames, spatially as well as temporally. This thesis proposes a bucketed illumination model to account for partial and global illumination changes along with a framework of a direct visual odometry for a rover localization. Furthermore, this study presents a self-calibrated visual-inertial odometry in which the time-offset and relative pose of an IMU and a stereo camera are estimated by using point feature measurements. Specifically, based on the extended Kalman filter pose estimator, the calibration parameters are augmented in the filter state. The proposed visual odometry is evaluated through the open source dataset where images are captured in a Lunar-like environment. In addition to this, we design a rover using commercially available sensors, and a field testing of the rover confirms that the self-calibrated visual-inertial odometry decreases a localization error in terms of a return position by 76.4% when compared to the visual-inertial odometry without the self-calibration.๋ณธ ๋…ผ๋ฌธ์—์„œ๋Š” ๋กœ๋ฒ„ ํ•ญ๋ฒ• ์‹œ์Šคํ…œ์„ ์œ„ํ•ด ๊ด€์„ฑ์ธก์ •์žฅ์น˜์™€ ์Šคํ…Œ๋ ˆ์˜ค ์นด๋ฉ”๋ผ๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ๋น› ๋ณ€ํ™”์— ๊ฐ•๊ฑดํ•œ ์ง์ ‘ ๋ฐฉ์‹ ์˜์ƒ ์˜ค๋„๋ฉ”ํŠธ๋ฆฌ์™€ ์ž๊ฐ€ ๋ณด์ • ์˜์ƒ๊ด€์„ฑ ํ•ญ๋ฒ• ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ์ œ์•ˆํ•œ๋‹ค. ๊ธฐ์กด ๋Œ€๋ถ€๋ถ„์˜ ์˜์ƒ๊ธฐ๋ฐ˜ ํ•ญ๋ฒ• ์•Œ๊ณ ๋ฆฌ์ฆ˜๋“ค์€ ๋žจ๋ฒ„์…˜ ํ‘œ๋ฉด ๊ฐ€์ •์„ ์œ„๋ฐฐํ•˜๋Š” ์•ผ์™ธ์˜ ๊ฐ•ํ•œ ํ–‡๋น› ํ˜น์€ ์ผ์ •ํ•˜์ง€ ์•Š์€ ์นด๋ฉ”๋ผ์˜ ๋…ธ์ถœ ์‹œ๊ฐ„์œผ๋กœ ์ธํ•ด ์˜์ƒ์˜ ๋ฐ๊ธฐ ๋ณ€ํ™”์— ์ทจ์•ฝํ•˜์˜€๋‹ค. ํ•œํŽธ, ์˜์ƒ ์˜ค๋„๋ฉ”ํŠธ๋ฆฌ์˜ ์˜ค์ฐจ ๋ˆ„์ ์„ ์ค„์ด๊ธฐ ์œ„ํ•ด ๊ด€์„ฑ์ธก์ •์žฅ์น˜๋ฅผ ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ์ง€๋งŒ, ์˜์ƒ๊ด€์„ฑ ์‹œ์Šคํ…œ์— ๋Œ€ํ•œ ์™ธ๋ถ€ ๊ต์ • ๋ณ€์ˆ˜๋Š” ๊ณต๊ฐ„ ๋ฐ ์‹œ๊ฐ„์ ์œผ๋กœ ์˜์ƒ ๋ฐ ๊ด€์„ฑ ์ขŒํ‘œ๊ณ„๋ฅผ ์—ฐ๊ฒฐํ•˜๊ธฐ ๋•Œ๋ฌธ์— ์‚ฌ์ „์— ์ •ํ™•ํ•˜๊ฒŒ ๊ณ„์‚ฐ๋˜์–ด์•ผ ํ•œ๋‹ค. ๋ณธ ๋…ผ๋ฌธ์€ ๋กœ๋ฒ„ ํ•ญ๋ฒ•์„ ์œ„ํ•ด ์ง€์—ญ ๋ฐ ์ „์—ญ์ ์ธ ๋น› ๋ณ€ํ™”๋ฅผ ์„ค๋ช…ํ•˜๋Š” ์ง์ ‘ ๋ฐฉ์‹ ์˜์ƒ ์˜ค๋„๋ฉ”ํŠธ๋ฆฌ์˜ ๋ฒ„ํ‚ท ๋ฐ๊ธฐ ๋ชจ๋ธ์„ ์ œ์•ˆํ•œ๋‹ค. ๋˜ํ•œ, ๋ณธ ์—ฐ๊ตฌ์—์„œ๋Š” ์ŠคํŠธ๋ ˆ์˜ค ์นด๋ฉ”๋ผ์—์„œ ์ธก์ •๋œ ํŠน์ง•์ ์„ ์ด์šฉํ•˜์—ฌ ๊ด€์„ฑ์ธก์ •์žฅ์น˜์™€ ์นด๋ฉ”๋ผ๊ฐ„์˜ ์‹œ๊ฐ„ ์˜คํ”„์…‹๊ณผ ์ƒ๋Œ€ ์œ„์น˜ ๋ฐ ์ž์„ธ๋ฅผ ์ถ”์ •ํ•˜๋Š” ์ž๊ฐ€ ๋ณด์ • ์˜์ƒ๊ด€์„ฑ ํ•ญ๋ฒ• ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ์ œ์‹œํ•œ๋‹ค. ํŠนํžˆ, ์ œ์•ˆํ•˜๋Š” ์˜์ƒ๊ด€์„ฑ ์•Œ๊ณ ๋ฆฌ์ฆ˜์€ ํ™•์žฅ ์นผ๋งŒ ํ•„ํ„ฐ์— ๊ธฐ๋ฐ˜ํ•˜๋ฉฐ ๊ต์ • ํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ ํ•„ํ„ฐ์˜ ์ƒํƒœ๋ณ€์ˆ˜์— ํ™•์žฅํ•˜์˜€๋‹ค. ์ œ์•ˆํ•œ ์ง์ ‘๋ฐฉ์‹ ์˜์ƒ ์˜ค๋„๋ฉ”ํŠธ๋ฆฌ๋Š” ๋‹ฌ ์œ ์‚ฌํ™˜๊ฒฝ์—์„œ ์ดฌ์˜๋œ ์˜คํ”ˆ์†Œ์Šค ๋ฐ์ดํ„ฐ์…‹์„ ํ†ตํ•ด ๊ทธ ์„ฑ๋Šฅ์„ ๊ฒ€์ฆํ•˜์˜€๋‹ค. 
๋˜ํ•œ ์ƒ์šฉ ์„ผ์„œ ๋ฐ ๋กœ๋ฒ„ ํ”Œ๋žซํผ์„ ์ด์šฉํ•˜์—ฌ ํ…Œ์ŠคํŠธ ๋กœ๋ฒ„๋ฅผ ์„ค๊ณ„ํ•˜์˜€๊ณ , ์ด๋ฅผ ํ†ตํ•ด ์˜์ƒ๊ด€์„ฑ ์‹œ์Šคํ…œ์„ ์ž๊ฐ€ ๋ณด์ • ํ•  ๊ฒฝ์šฐ ๊ทธ๋ ‡์ง€ ์•Š์€ ๊ฒฝ์šฐ ๋ณด๋‹ค ํšŒ๊ธฐ ์œ„์น˜ ์˜ค์ฐจ(return position error)๊ฐ€ 76.4% ๊ฐ์†Œ๋จ์„ ํ™•์ธํ•˜์˜€๋‹ค.Abstract Contents List of Tables List of Figures Chapter 1 Introduction 1.1 Motivation and background 1.2 Objectives and contributions Chapter 2 Related Works 2.1 Visual odometry 2.2 Visual-inertial odometry Chapter 3 Direct Visual Odometry at Outdoor 3.1 Direct visual odometry 3.1.1 Notations 3.1.2 Camera projection model 3.1.3 Photometric error 3.2 The proposed algorithm 3.2.1 Problem formulation 3.2.2 Bucketed illumination model 3.2.3 Adaptive prior weight 3.3 Experimental results 3.3.1 Synthetic image sequences 3.3.2 MAV datasets 3.3.3 Planetary rover datasets Chapter 4 Self-Calibrated Visual-Inertial Odometry 4.1 State representation 4.1.1 IMU state 4.1.2 Calibration parameter state 4.2 State-propagation 4.3 Measurement-update 4.3.1 Point feature measurement 4.3.2 Measurement error modeling 4.4 Experimental results 4.4.1 Hardware setup 4.4.2 Vision front-end design 4.4.3 Rover field testing Chapter 5 Conclusions 5.1 Conclusion and summary 5.2 Future works Bibliography Chapter A Derivation of Photometric Error Jacobian ๊ตญ๋ฌธ ์ดˆ๋กMaste

    Inspection with Robotic Microscopic Imaging

    Future Mars rover missions will require more advanced onboard autonomy for increased scientific productivity and reduced mission operations cost. One such form of autonomy can be achieved by targeting precise science measurements to be made in a single command uplink cycle. In this paper we present an overview of our solution to the subproblems of navigating a rover into place for microscopic imaging, mapping an instrument target point selected by an operator from far-away science camera images to close-up hazard camera images, verifying the safety of placing a contact instrument on a sample or finding nearby safe points, and analyzing the data that comes back from the rover. The system developed includes portions used in the Multiple Target Single Cycle Instrument Placement demonstration at NASA Ames in October 2004, as well as portions of the MI Toolkit delivered to the Athena Microscopic Imager instrument team for the MER mission, still operating on Mars today. Some of the component technologies are also under consideration for MSL mission infusion.
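
    The target-handoff subproblem reduces, at its geometric core, to re-expressing a 3D point selected in one camera's frame in another camera's frame and projecting it there. The sketch below shows that step under assumed known extrinsics and intrinsics; all names are hypothetical placeholders, not the paper's interfaces.

```python
# Re-project a target point from the science camera's frame into the
# hazard camera's image, given the rigid transform between the two frames.
import numpy as np

def map_target(p_sci, T_haz_sci, K_haz):
    """p_sci: (3,) point in the science-camera frame.
    T_haz_sci: 4x4 transform taking science-frame coords to the hazard frame.
    K_haz: 3x3 intrinsic matrix of the hazard camera."""
    p_haz = T_haz_sci[:3, :3] @ p_sci + T_haz_sci[:3, 3]
    if p_haz[2] <= 0:
        raise ValueError("target is behind the hazard camera")
    u = K_haz @ (p_haz / p_haz[2])   # perspective division, then intrinsics
    return u[:2]                      # pixel coordinates (u, v)
```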

    Human-Robot Site Survey and Sampling for Space Exploration

    NASA is planning to send humans and robots back to the Moon before 2020. In order for extended missions to be productive, high-quality maps of lunar terrain and resources are required. Although orbital images can provide much information, many features (local topography, resources, etc.) will have to be characterized directly on the surface. To address this need, we are developing a system to perform site survey and sampling. The system includes multiple robots and humans operating in a variety of team configurations, coordinated via peer-to-peer human-robot interaction. In this paper, we present our system design and describe planned field tests.

    Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion

    Recent Advances in Image Restoration with Applications to Real World Problems

    In the past few decades, imaging hardware has improved tremendously in terms of resolution, enabling the widespread use of images in many diverse applications on Earth and in planetary missions. However, practical issues associated with image acquisition still affect image quality. Issues such as blurring, measurement noise, mosaicing artifacts, and low spatial or spectral resolution can seriously affect the accuracy of these applications. This book intends to give the reader a glimpse of the latest developments and recent advances in image restoration, including image super-resolution; image fusion to enhance spatial, spectral, and temporal resolution; and the generation of synthetic images using deep learning techniques. Some practical applications are also included.
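
    As one compact illustration of classical restoration against the blurring and noise mentioned above, here is a minimal Wiener deconvolution in the frequency domain. It assumes the blur kernel and a noise-to-signal ratio are known, which real pipelines must estimate; it is a sketch, not taken from the book.

```python
# Wiener deconvolution: restore a blurred, noisy image given the blur kernel.
import numpy as np

def wiener_deconvolve(blurred, kernel, nsr=1e-2):
    """blurred: HxW float image; kernel: small PSF with origin at [0, 0];
    nsr: assumed noise-to-signal power ratio."""
    H = np.fft.fft2(kernel, s=blurred.shape)   # kernel transfer function
    G = np.fft.fft2(blurred)
    # Wiener filter conj(H) / (|H|^2 + NSR) damps frequencies where the
    # kernel (and hence the signal) carries little energy.
    F_hat = np.conj(H) / (np.abs(H) ** 2 + nsr) * G
    return np.real(np.fft.ifft2(F_hat))
```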