28 research outputs found

    Evaluating Visual Odometry Methods for Autonomous Driving in Rain

    The increasing demand for autonomous vehicles has created a need for robust navigation systems that can also operate effectively in adverse weather conditions. Visual odometry is a technique used in these navigation systems, enabling the estimation of vehicle position and motion from onboard camera input. However, visual odometry accuracy can be significantly degraded in challenging weather conditions, such as heavy rain, snow, or fog. In this paper, we evaluate a range of visual odometry methods, including our DROID-SLAM-based heuristic approach. Specifically, these algorithms are tested on both clear- and rainy-weather urban driving data to evaluate their robustness. We compiled a dataset comprising a range of rainy weather conditions from different cities: the Oxford RobotCar dataset from Oxford, the 4Seasons dataset from Munich, and an internal dataset collected in Singapore. We evaluated different visual odometry algorithms for both monocular and stereo camera setups using the Absolute Trajectory Error (ATE). Our evaluation suggests that the Depth and Flow for Visual Odometry (DF-VO) algorithm with a monocular setup worked well for short distances (< 500 m), and our proposed DROID-SLAM-based heuristic approach with a stereo setup performed relatively well for long-term localization. Both algorithms performed consistently well across all rain conditions.
    Comment: 8 pages, 4 figures. Accepted at the IEEE International Conference on Automation Science and Engineering (CASE) 202
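    The metric used throughout this evaluation, the Absolute Trajectory Error, is conventionally computed by rigidly aligning the estimated trajectory to ground truth (a Kabsch/Umeyama alignment without scale) and reporting the RMSE of the residual positions. The sketch below is an illustrative NumPy implementation of that convention, not the paper's evaluation code; the function name and array shapes are assumptions.

    ```python
    import numpy as np

    def ate_rmse(gt, est):
        """ATE: rigidly align est to gt (Kabsch), then take position RMSE.

        gt, est: (N, 3) arrays of ground-truth / estimated positions.
        """
        gt, est = np.asarray(gt, float), np.asarray(est, float)
        # Center both trajectories so only rotation remains to be solved.
        G = gt - gt.mean(axis=0)
        E = est - est.mean(axis=0)
        # Kabsch: best rotation from the SVD of the 3x3 cross-covariance.
        H = E.T @ G
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        aligned = (R @ E.T).T
        # RMSE over the per-pose position residuals.
        return np.sqrt(np.mean(np.sum((aligned - G) ** 2, axis=1)))
    ```

    Because of the alignment step, ATE measures global trajectory consistency rather than frame-to-frame drift, which is why it suits the long-range localization comparison made here.
    
    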

    Enhancing Road Infrastructure Monitoring: Integrating Drones for Weather-Aware Pothole Detection

    This abstract outlines a research proposal focused on the use of Unmanned Aerial Vehicles (UAVs) for monitoring potholes in road infrastructure affected by various weather conditions. The study aims to investigate how different materials that fill potholes, such as water, grass, sand, and snow-ice, are affected by seasonal weather changes, ultimately influencing the performance of pavement structures. By integrating weather-aware monitoring techniques, the research seeks to enhance the rigidity and resilience of road surfaces, thereby contributing to more effective pavement management systems. The proposed methodology combines UAV image-based monitoring with advanced super-resolution algorithms to improve image refinement, particularly at high flight altitudes. Through case studies and experimental analysis, the study aims to assess the geometric precision of 3D models generated from aerial images, with a specific focus on road pavement distress monitoring. Overall, the research addresses the shortcomings of traditional road failure detection methods by exploring cost-effective 3D detection techniques using UAV technology, thereby ensuring safer roadways for all users.

    Unmanned Aircraft Systems in the Cyber Domain

    Unmanned Aircraft Systems are an integral part of the US national critical infrastructure. The authors have endeavored to bring a breadth and quality of information to the reader that is unparalleled in the unclassified sphere. This textbook fully immerses and engages the reader / student in the cyber-security considerations of the rapidly emerging technology that we know as unmanned aircraft systems (UAS). The first edition covered National Airspace System (NAS) policy issues; information security (INFOSEC); UAS vulnerabilities in key systems (Sense and Avoid / SCADA); navigation and collision avoidance systems; stealth design; intelligence, surveillance, and reconnaissance (ISR) platforms; weapons systems security; electronic warfare considerations; data-links; jamming; operational vulnerabilities; and still-emerging political scenarios that affect US military / commercial decisions. This second edition discusses state-of-the-art technology issues facing US UAS designers. It focuses on counter unmanned aircraft systems (C-UAS), especially research designed to mitigate and terminate threats by SWARMS. Topics include high-altitude platforms (HAPS) for wireless communications; C-UAS and large-scale threats; acoustic countermeasures against SWARMS and building an Identify Friend or Foe (IFF) acoustic library; updates to the legal / regulatory landscape; UAS proliferation along the Chinese New Silk Road sea / land routes; and ethics in this new age of autonomous systems and artificial intelligence (AI).

    Roadmap on measurement technologies for next generation structural health monitoring systems

    Structural health monitoring (SHM) is the automation of the condition assessment process of an engineered system. When applied to geometrically large components or structures, such as those found in civil and aerospace infrastructure and systems, a critical challenge is in designing the sensing solution that could yield actionable information. This is a difficult task to conduct cost-effectively, because of the large surfaces under consideration and the localized nature of typical defects and damage. There have been significant research efforts in empowering conventional measurement technologies for applications to SHM in order to improve performance of the condition assessment process. Yet, the field implementation of these SHM solutions is still in its infancy, attributable to various economic and technical challenges. The objective of this Roadmap publication is to discuss modern measurement technologies that were developed for SHM purposes, along with their associated challenges and opportunities, and to provide a path to research and development efforts that could yield impactful field applications. The Roadmap is organized into four sections: distributed embedded sensing systems, distributed surface sensing systems, multifunctional materials, and remote sensing. Recognizing that many measurement technologies may overlap between sections, we define distributed sensing solutions as those that involve or imply the utilization of numbers of sensors geometrically organized within (embedded) or over (surface) the monitored component or system. Multifunctional materials are sensing solutions that combine multiple capabilities, for example those also serving structural functions. Remote sensing solutions are contactless, for example cell phones, drones, and satellites; the category also includes remotely controlled robots.

    Automated Image Interpretation for Science Autonomy in Robotic Planetary Exploration

    Advances in the capabilities of robotic planetary exploration missions have increased the wealth of scientific data they produce, presenting challenges for mission science and operations imposed by the limits of interplanetary radio communications. These data budget pressures can be relieved by increased robotic autonomy, both for onboard operations tasks and for decision-making in response to science data. This thesis presents new techniques in automated image interpretation for natural scenes of relevance to planetary science and exploration, and elaborates autonomy scenarios under which they could be used to extend the reach and performance of exploration missions on planetary surfaces. Two computer vision techniques are presented. The first is an algorithm for autonomous classification and segmentation of geological scenes, allowing a photograph of a rock outcrop to be automatically divided into regions by rock type. This important task, currently performed by specialists on Earth, is a prerequisite to decisions about instrument pointing, data triage, and event-driven operations. The approach uses a novel technique to seek distinct visual regions in outcrop photographs. It first generates a feature space by extracting multiple types of visual information from the image. Then, in a training step using labeled exemplar scenes, it applies Mahalanobis distance metric learning (in particular, Multiclass Linear Discriminant Analysis) to discover the linear transformation of the feature space which best separates the geological classes. With the learned representation applied, a vector clustering technique is then used to segment new scenes. The second technique interrogates sequences of images of the sky to extract, from the motion of clouds, the wind vector at the condensation level — a measurement not normally available for Mars.
    To account for the deformation of clouds and the ephemerality of their fine-scale features, a template-matching technique (normalized cross-correlation) is used to mutually register images and compute the clouds' motion. Both techniques are tested successfully on imagery from a variety of relevant analogue environments on Earth and on data returned from missions to the planet Mars. For both, scenarios are elaborated for their use in autonomous science data interpretation, and to thereby automate certain steps in the process of robotic exploration.
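    The registration step named above, normalized cross-correlation, can be sketched in a few lines of NumPy: a mean-subtracted template from one frame is slid over the next frame, and the offset with the highest normalized correlation gives the apparent motion. The exhaustive search below is an illustrative toy, not the thesis's pipeline (real implementations use FFT-based correlation and sub-pixel refinement); the function name and patch size are assumptions.

    ```python
    import numpy as np

    def ncc_shift(prev, curr, patch=16):
        """Estimate the integer-pixel shift of a central patch of `prev`
        within `curr` by exhaustive normalized cross-correlation."""
        h, w = prev.shape
        cy, cx = h // 2, w // 2
        # Mean-subtracted template taken from the center of the first frame.
        t = prev[cy - patch // 2: cy + patch // 2,
                 cx - patch // 2: cx + patch // 2].astype(float)
        t = t - t.mean()
        tn = np.sqrt((t * t).sum()) + 1e-12
        ph = t.shape[0]
        best, best_dy, best_dx = -2.0, 0, 0
        # Slide the template over every window of the second frame.
        for y in range(h - ph + 1):
            for x in range(w - ph + 1):
                win = curr[y:y + ph, x:x + ph].astype(float)
                win = win - win.mean()
                wn = np.sqrt((win * win).sum()) + 1e-12
                score = (t * win).sum() / (tn * wn)  # NCC in [-1, 1]
                if score > best:
                    best = score
                    best_dy = y - (cy - patch // 2)
                    best_dx = x - (cx - patch // 2)
        return best_dy, best_dx, best
    ```

    Because the score is normalized by each window's own energy, the match is insensitive to brightness and contrast changes between frames, which is what makes it robust to the cloud deformation mentioned in the abstract.
    
    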

    The 1995 NASA High-Speed Research Program Sonic Boom Workshop

    The High-Speed Research Program and NASA Langley Research Center sponsored the NASA High-Speed Research Program Sonic Boom Workshop on September 12-13, 1995. The workshop was designed to bring together NASA's scientists and engineers and their counterparts in industry, other Government agencies, and academia working together in the sonic boom element of NASA's High-Speed Research Program. Specific objectives of this workshop were to (1) report the progress and status of research in sonic boom propagation, acceptability, and design; (2) promote and disseminate this technology within the appropriate technical communities; (3) help promote synergy among the scientists working in the Program; and (4) identify technology pacing the development of viable reduced-boom High-Speed Civil Transport concepts. The Workshop included these sessions: Session 1 - Sonic Boom Propagation (Theoretical); Session 2 - Sonic Boom Propagation (Experimental); and Session 3 - Acceptability Studies - Human and Animal.

    Abstracts on Radio Direction Finding (1899 - 1995)

    The files on this record represent the various databases that originally composed the CD-ROM issue of the "Abstracts on Radio Direction Finding" database, which is now part of the Dudley Knox Library's Abstracts and Selected Full Text Documents on Radio Direction Finding (1899 - 1995) Collection. (See Calhoun record https://calhoun.nps.edu/handle/10945/57364 for further information on this collection and the bibliography.) Because technological obsolescence prevents current and future audiences from accessing the bibliography, DKL exported and converted the various databases contained in the CD-ROM into the three files on this record. The contents of these files are: 1) RDFA_CompleteBibliography_xls.zip [RDFA_CompleteBibliography.xls: Metadata for the complete bibliography, in Excel 97-2003 Workbook format; RDFA_Glossary.xls: Glossary of terms, in Excel 97-2003 Workbook format; RDFA_Biographies.xls: Biographies of leading figures, in Excel 97-2003 Workbook format]; 2) RDFA_CompleteBibliography_csv.zip [RDFA_CompleteBibliography.TXT: Metadata for the complete bibliography, in CSV format; RDFA_Glossary.TXT: Glossary of terms, in CSV format; RDFA_Biographies.TXT: Biographies of leading figures, in CSV format]; 3) RDFA_CompleteBibliography.pdf: A human-readable display of the bibliographic data, as a means of double-checking any possible deviations due to conversion.

    Autonomous Navigation in Inclement Weather based on a Localizing Ground Penetrating Radar

    Most autonomous driving solutions require some method of localization within their environment. Typically, onboard sensors are used to localize the vehicle precisely in a previously recorded map. However, these solutions are sensitive to ambient lighting conditions such as darkness and to inclement weather. Additionally, the maps can become outdated in a rapidly changing environment and require continuous updating. While LiDAR systems do not require visible light, they are sensitive to weather such as fog or snow, which can interfere with localization. In this letter, we utilize a Ground Penetrating Radar (GPR) to obtain precise vehicle localization. By mapping and localizing using features beneath the ground, we obtain features that are both stable over time and maintain their appearance during changing ambient weather and lighting conditions. We incorporate this solution into a full-scale autonomous vehicle and evaluate the performance on over 17 km of testing data in a variety of challenging weather conditions. We find that this novel sensing modality is capable of providing precise localization for autonomous navigation without using cameras or LiDAR sensors.
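    The core idea of map-based GPR localization, matching a live subsurface return against profiles recorded at known positions in a prior map, can be illustrated schematically. The snippet below is a deliberately simplified nearest-neighbor sketch over hypothetical per-position GPR profiles; the actual localizing GPR system described in the letter uses a far more sophisticated registration pipeline, and all names here are assumptions for illustration.

    ```python
    import numpy as np

    def localize(live_scan, map_scans, map_positions):
        """Match a live GPR profile to the closest entry in a prior map.

        map_scans:     (M, D) subsurface reflection profiles recorded at
                       known along-track positions during the mapping run.
        map_positions: (M,) positions (e.g. meters along track) of those scans.
        live_scan:     (D,) profile observed at the unknown current pose.
        Returns the best-matching map position and its residual distance.
        """
        # Euclidean distance from the live profile to every map profile.
        d = np.linalg.norm(map_scans - live_scan, axis=1)
        i = int(np.argmin(d))
        return map_positions[i], d[i]
    ```

    The appeal of the modality is visible even in this toy: the matched features live below the road surface, so rain, snow, or darkness on the surface does not change the profiles being compared.
    
    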