
    Evaluation of Color Anomaly Detection in Multispectral Images For Synthetic Aperture Sensing

    In this article, we evaluate unsupervised anomaly detection methods in multispectral images obtained with a wavelength-independent synthetic aperture sensing technique called Airborne Optical Sectioning (AOS). Focusing on search and rescue missions that use drones to locate missing or injured persons in dense forest and that require real-time operation, we evaluate the runtime vs. quality trade-off of these methods. Furthermore, we show that color anomaly detection methods that normally operate in the visual range always benefit from an additional far-infrared (thermal) channel. We also show that, even without additional thermal bands, the choice of color space in the visual range already has an impact on the detection results. Color spaces like HSV and HLS have the potential to outperform the widely used RGB color space, especially when color anomaly detection is used for forest-like environments. Comment: 12 pages, 6 figures, 3 tables
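A common color anomaly detector of the kind evaluated in such studies is the Reed-Xiaoli (RX) detector, which scores each pixel by its Mahalanobis distance from the global color statistics. The sketch below is illustrative only, assuming a synthetic image rather than the authors' data or implementation; the channel count is generic, so the same function applies to RGB, HSV/HLS-converted, or RGB-plus-thermal stacks.

```python
# Minimal RX-style color anomaly detector sketch (not the authors' code).
import numpy as np

def rx_anomaly_scores(image):
    """Mahalanobis distance of each pixel from the global mean color.

    image: (H, W, C) float array; C may be 3 (RGB/HSV) or 4 (RGB + thermal).
    Returns an (H, W) array of anomaly scores.
    """
    h, w, c = image.shape
    pixels = image.reshape(-1, c)
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-9 * np.eye(c))  # regularized inverse
    diff = pixels - mean
    # Per-pixel Mahalanobis distance: diff_i^T * cov_inv * diff_i
    scores = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    return scores.reshape(h, w)

# Toy example: near-uniform green "forest" with one anomalous red pixel.
rng = np.random.default_rng(0)
img = rng.normal([0.1, 0.5, 0.1], 0.02, size=(32, 32, 3))
img[16, 16] = [0.9, 0.2, 0.2]  # simulated target
scores = rx_anomaly_scores(img)
row, col = np.unravel_index(scores.argmax(), scores.shape)
print(int(row), int(col))
```

Because the detector only sees a stack of channels, comparing color spaces amounts to converting the image before calling the same function.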

    Synthetic Aperture Anomaly Imaging

    Previous research has shown that, in the presence of foliage occlusion, anomaly detection performs significantly better on integral images resulting from synthetic aperture imaging than on conventional aerial images. In this article, we hypothesize and demonstrate that integrating detected anomalies is even more effective than detecting anomalies in integrals. This results in enhanced occlusion removal, outlier suppression, and higher chances of visually as well as computationally detecting targets that are otherwise occluded. Our hypothesis was validated through both simulations and field experiments. We also present a real-time application that makes our findings practically available for blue-light organizations and others using commercial drone platforms. It is designed to address use cases that suffer from strong occlusion caused by vegetation, such as search and rescue, wildlife observation, early wildfire detection, and surveillance.
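The two pipeline orderings compared in this abstract can be expressed compactly: "integrate then detect" runs the detector once on the averaged (integral) image, while "detect then integrate" averages per-frame anomaly maps. The sketch below uses an assumed random-occlusion model and a simple z-score detector, purely to make the two orderings concrete; it is not the authors' simulation.

```python
# Sketch of "integrate then detect" vs. "detect then integrate"
# (illustrative occlusion model and detector, not the authors' method).
import numpy as np

def detect(img):
    """Per-pixel z-score anomaly map for a single-channel image."""
    return np.abs(img - img.mean()) / (img.std() + 1e-9)

rng = np.random.default_rng(1)
n, h, w = 16, 32, 32
target = np.zeros((h, w))
target[10, 20] = 2.0  # hidden bright target

# Each frame: target plus noise, with random "foliage" occluders.
frames = []
for _ in range(n):
    frame = target + 0.1 * rng.normal(size=(h, w))
    occ = rng.random((h, w)) < 0.3          # 30% of pixels occluded
    frame[occ] = rng.random(occ.sum())      # occluder brightness in [0, 1)
    frames.append(frame)
frames = np.stack(frames)

# Pipeline A: detect anomalies in the integral (averaged) image.
integrate_then_detect = detect(frames.mean(axis=0))
# Pipeline B: integrate per-frame anomaly maps.
detect_then_integrate = np.stack([detect(f) for f in frames]).mean(axis=0)
```

In pipeline B, bright occluders fire the detector only in the frames where they appear, so averaging the maps suppresses these one-off outliers while reinforcing the persistent target.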

    Stereoscopic Depth Perception Through Foliage

    Both humans and computational methods struggle to discriminate the depths of objects hidden beneath foliage. However, such discrimination becomes feasible when we combine computational optical synthetic aperture sensing with the human ability to fuse stereoscopic images. For object identification tasks, as required in search and rescue, wildlife observation, surveillance, and early wildfire detection, depth assists in differentiating true from false findings, such as people, animals, or vehicles vs. sun-heated patches at the ground level or in the tree crowns, or ground fires vs. tree trunks. We used video captured by a drone above dense woodland to test users' ability to discriminate depth. We found that this is impossible when viewing monoscopic video and relying on motion parallax. The same was true with stereoscopic video because of the occlusions caused by foliage. However, when synthetic aperture sensing was used to reduce occlusions and disparity-scaled stereoscopic video was presented, human observers successfully discriminated depth, whereas computational (stereoscopic matching) methods remained unsuccessful. This shows the potential of systems that exploit the synergy between computational methods and human vision to perform tasks that neither can perform alone.

    Sand and Dust on Mars

    Mars is a planet of high scientific interest. Various studies are currently underway involving vehicles that have landed on Mars. Because Mars is known to experience frequent wind storms, mission planners and engineers require knowledge of the physical and chemical properties of Martian windblown sand and dust, and of the processes involved in the origin and evolution of sand and dust storms.

    Electronic Quenching of the A(0+u) State of Bi2

    Temporally resolved laser-induced fluorescence of high vibrational levels in Bi2 A(0+u), above and below the predissociation limit of v'=22, was investigated by observing total fluorescence excited by a wavelength-tunable, pulsed dye laser. Electronic quenching of Bi2 A(0+u) by five collision partners (Ne, Ar, Kr, Xe, N2) was examined for four vibrational levels (v'=22, 23, 24, 25). Electronic quenching by a sixth collision partner (He) was examined for eight vibrational levels (v'=18 through 25). Quenching from stable vibrational levels (v'≤22) was independent of vibrational quantum number. A significant increase in quenching occurs for the predissociated level v'=23. Electronic quenching rate constants ranged from 227.3 to 850.5 × 10^-13 cm^3 molec^-1 sec^-1 for v'=22 and from 741.2 to 1570 × 10^-13 cm^3 molec^-1 sec^-1 for v'=23, and are very nearly gas kinetic for v'=23. Electronic quenching of higher vibrational levels (v'>23) was not temporally resolvable by the experimental apparatus.
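The quoted bimolecular rate constants translate into observable decay rates via the pseudo-first-order relation rate = k_q × [M], where [M] is the collision-partner number density. The calculation below uses the upper rate constants quoted in the abstract; the temperature and buffer-gas pressure are illustrative assumptions, not the experimental conditions.

```python
# Pseudo-first-order quenching rates from the quoted rate constants.
# Pressure and temperature below are assumed for illustration only.
k_B = 1.380649e-23                 # Boltzmann constant, J/K
T = 300.0                          # K (assumed)
p = 133.3                          # Pa (~1 Torr, assumed buffer-gas pressure)
n_density = p / (k_B * T) * 1e-6   # ideal-gas number density, molecules/cm^3

k_q_v22 = 850.5e-13                # cm^3 molec^-1 sec^-1, upper value for v'=22
k_q_v23 = 1570e-13                 # cm^3 molec^-1 sec^-1, upper value for v'=23

rate_v22 = k_q_v22 * n_density     # s^-1
rate_v23 = k_q_v23 * n_density     # s^-1
print(f"[M] = {n_density:.3e} cm^-3")
print(f"quench rate v'=22: {rate_v22:.3e} s^-1, v'=23: {rate_v23:.3e} s^-1")
```

At these assumed conditions the quenching rates fall in the MHz range, consistent with the abstract's note that decays above v'=23 became too fast to resolve temporally.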