
    Airborne derivation of microburst alerts from ground-based Terminal Doppler Weather Radar information: A flight evaluation

    An element of the NASA/FAA windshear program is the integration of ground-based microburst information on the flight deck to support airborne windshear alerting and microburst avoidance. NASA conducted a windshear flight test program in the summer of 1991 during which airborne processing of Terminal Doppler Weather Radar (TDWR) data was used to derive microburst alerts. Microburst information was extracted from TDWR, transmitted to a NASA Boeing 737 in flight via data link, and processed to estimate the windshear hazard level (F-factor) that would be experienced by the aircraft in each microburst. The microburst location and F-factor were used to derive a situation display and alerts. The situation display was successfully used to maneuver the aircraft for microburst penetrations, during which atmospheric 'truth' measurements were made. A total of 19 penetrations of TDWR-reported microburst locations were made, resulting in 18 airborne microburst alerts from the TDWR data and two microburst alerts from the airborne reactive windshear detection system. The primary factors affecting alerting performance were the spatial offset of the flight path from the region of strongest shear, differences between the TDWR measurement altitude and the airplane penetration altitude, and variations in microburst outflow profiles. Predicted and measured F-factors agreed well in penetrations near microburst cores. Although improvements in airborne and ground processing of the TDWR measurements would be required to support an airborne executive-level alerting protocol, the practicality of airborne utilization of data-linked TDWR data has been demonstrated.
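
    The alerting logic described above hinges on the F-factor hazard index, which combines the along-path horizontal wind gradient with the vertical wind. The sketch below is a minimal, illustrative computation of an averaged F-factor over a synthetic microburst outflow, assuming the commonly cited form F = (du/dt)/g − w/V from the windshear literature; the function name, the 1 km averaging window, and the 0.1 alert threshold are assumptions for illustration, not values taken from this paper.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def f_factor(x, u_wind, w_wind, airspeed, window=1000.0):
    """Estimate an along-path windshear hazard index (F-factor).

    x        : along-track positions (m)
    u_wind   : horizontal wind along the flight path (m/s, tailwind positive)
    w_wind   : vertical wind (m/s, positive up; downdrafts are negative)
    airspeed : aircraft true airspeed (m/s)
    window   : averaging distance (m); ~1 km is typical in the literature

    Uses F = (du/dt)/g - w/V, with du/dt approximated as V * du/dx
    for a quasi-steady encounter, then boxcar-averaged over `window`.
    """
    du_dx = np.gradient(u_wind, x)                      # spatial shear (1/s)
    f_local = (airspeed * du_dx) / G - w_wind / airspeed
    n = max(1, int(window / np.mean(np.diff(x))))       # samples per window
    kernel = np.ones(n) / n
    return np.convolve(f_local, kernel, mode="same")

# Illustrative use: a symmetric headwind-to-tailwind outflow with a central downdraft.
x = np.linspace(-3000, 3000, 601)                       # m
u = 15.0 * np.tanh(x / 800.0)                           # horizontal outflow profile
w = -8.0 * np.exp(-(x / 600.0) ** 2)                    # downdraft core
alert = np.max(f_factor(x, u, w, airspeed=80.0)) >= 0.1  # illustrative alert threshold
```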

    Four years of multi-modal odometry and mapping on the rail vehicles

    Precise, seamless, and efficient train localization and long-term railway environment monitoring are essential for reliability, availability, maintainability, and safety (RAMS) engineering of railroad systems. Simultaneous localization and mapping (SLAM) lies at the core of solving both problems concurrently. To this end, we propose in this paper a high-performance and versatile multi-modal framework targeted at the odometry and mapping task for various rail vehicles. Our system is built atop an inertial-centric state estimator that tightly couples light detection and ranging (LiDAR), visual, and optionally satellite navigation and map-based localization information, while retaining the convenience and extendibility of loosely coupled methods. The inertial sensors, an IMU and a wheel encoder, are treated as the primary sensors, and observations from the subsystems constrain the accelerometer and gyroscope biases. Compared to point-only LiDAR-inertial methods, our approach leverages more geometric information by introducing both the track plane and electric power pillars into the state estimation. The visual-inertial subsystem also exploits environmental structure by employing both lines and points. In addition, the method handles sensor failures through automatic reconfiguration that bypasses the failed modules. The proposed method has been extensively tested in railway environments over four years, covering general-speed, high-speed, and metro lines with both passenger and freight traffic. Further, we aim to openly share the experience, problems, and successes of our group with the robotics community, so that those who work in such environments can avoid similar errors. To this end, we open source some of our datasets to benefit the research community.
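
    To make the described architecture concrete, here is a deliberately simplified 2D sketch of the inertial/wheel-centric idea: dead reckoning from the wheel encoder and gyro provides the prediction, healthy subsystems (LiDAR, visual, GNSS, map matching) supply position corrections, and unhealthy ones are bypassed so the estimator keeps running. This is a loosely coupled complementary-filter toy, not the paper's tightly coupled estimator; all class, field, and parameter names are illustrative assumptions.

```python
import math
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

@dataclass
class Subsystem:
    """A correction source (e.g. LiDAR, visual, GNSS, or map matching)."""
    name: str
    get_position: Callable[[], Optional[Tuple[float, float]]]  # None if no fix available
    healthy: Callable[[], bool]                                 # health / timeout check

@dataclass
class Fusion2D:
    """Toy 2D inertial/wheel-centric estimator: wheel encoder and gyro drive
    dead reckoning, healthy subsystems pull the position toward their fixes,
    and failed subsystems are skipped (automatic reconfiguration)."""
    x: float = 0.0
    y: float = 0.0
    heading: float = 0.0
    gain: float = 0.2                                   # complementary-filter blend factor
    subsystems: List[Subsystem] = field(default_factory=list)

    def predict(self, wheel_speed: float, yaw_rate: float, dt: float) -> None:
        # Propagate the pose from the primary sensors (wheel encoder + gyro).
        self.heading += yaw_rate * dt
        self.x += wheel_speed * math.cos(self.heading) * dt
        self.y += wheel_speed * math.sin(self.heading) * dt

    def correct(self) -> None:
        for sub in self.subsystems:
            if not sub.healthy():
                continue                                # bypass the failed module, keep running
            fix = sub.get_position()
            if fix is None:
                continue
            self.x += self.gain * (fix[0] - self.x)     # blend the fix into the state
            self.y += self.gain * (fix[1] - self.y)

# Illustrative use with a GNSS-like source that has dropped out (e.g. in a tunnel).
est = Fusion2D()
est.subsystems.append(Subsystem("gnss", lambda: (10.0, 0.5), lambda: False))   # failed
est.subsystems.append(Subsystem("lidar", lambda: (10.2, 0.4), lambda: True))   # healthy
est.predict(wheel_speed=20.0, yaw_rate=0.0, dt=0.5)
est.correct()
```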

    Mapping Wide Row Crops with Video Sequences Acquired from a Tractor Moving at Treatment Speed

    This paper presents a mapping method for wide row crop fields. The resulting map shows the crop rows and the weeds present in the inter-row spacing. Because field videos are acquired with a camera mounted on top of an agricultural vehicle, a method for image sequence stabilization was needed and consequently designed and developed. The proposed stabilization method uses the centers of some crop rows in the image sequence as features to be tracked, which compensates for the lateral movement (sway) of the camera and leaves the pitch unchanged. A region of interest is selected using the tracked features, and an inverse perspective technique transforms the selected region into a bird's-eye view that is centered on the image and that enables map generation. The developed algorithm has been tested on several video sequences of different fields recorded at different times and under different lighting conditions, with good initial results. Indeed, lateral displacements of up to 66% of the inter-row spacing were suppressed through the stabilization process, and the crop rows in the resulting maps appear straight.
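
    The two image-processing steps described above, lateral-sway compensation from a tracked crop-row centre and an inverse perspective warp of the selected region of interest into a bird's-eye view, can be sketched with standard OpenCV calls as below. The ROI corners, output size, gain-free shift, and function names are illustrative assumptions, not the paper's implementation.

```python
import cv2
import numpy as np

def stabilize_lateral(frame, row_center_x, reference_x):
    """Compensate lateral sway by shifting the frame horizontally so that a
    tracked crop-row centre stays at a fixed column (pitch left unchanged)."""
    shift = reference_x - row_center_x
    translation = np.float32([[1, 0, shift], [0, 1, 0]])   # 2x3 affine: x-translation only
    h, w = frame.shape[:2]
    return cv2.warpAffine(frame, translation, (w, h))

def birds_eye_view(frame, roi_corners, out_size=(400, 600)):
    """Warp a region of interest into a bird's-eye view for map generation.

    frame       : input BGR image from the vehicle-mounted camera
    roi_corners : four (x, y) pixel corners of the ROI, ordered
                  top-left, top-right, bottom-right, bottom-left
    out_size    : (width, height) of the rectified output map tile
    """
    w, h = out_size
    src = np.float32(roi_corners)
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, homography, (w, h))
```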