32 research outputs found

    Fusion of visible and thermal images improves automated detection and classification of animals for drone surveys

    Visible and thermal images acquired from drones (unoccupied aircraft systems) have substantially improved animal monitoring. Combining complementary information from both image types provides a powerful approach for automating the detection and classification of multiple animal species to augment drone surveys. We compared eight image fusion methods using thermal and visible drone images combined with two supervised deep learning models to evaluate the detection and classification of white-tailed deer (Odocoileus virginianus), domestic cow (Bos taurus), and domestic horse (Equus caballus). We classified visible and thermal images separately and compared the results with those of image fusion. Fused images provided minimal improvement for cows and horses over visible images alone, likely because the size, shape, and color of these species made them conspicuous against the background. For white-tailed deer, which were typically cryptic against their backgrounds and often in shadow in visible images, the added information from thermal images improved detection and classification by 15–85% across fusion methods. Our results suggest that image fusion is well suited to surveying animals that are inconspicuous against their backgrounds, and our approach requires few image pairs for training compared with typical machine-learning methods. We discuss computational and field considerations for improving drone surveys using our fusion approach.
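    For context, a minimal sketch of one common pixel-level fusion technique, simple weighted averaging, is shown below. It assumes the visible and thermal frames are already co-registered and equal in size; the file names and the 0.6/0.4 weights are illustrative assumptions, and the eight methods actually compared in the paper are not specified here. Python with OpenCV:

        import cv2

        # Load a co-registered visible/thermal pair (assumed same size).
        visible = cv2.imread("visible.png")                       # 3-channel BGR
        thermal = cv2.imread("thermal.png", cv2.IMREAD_GRAYSCALE) # single band

        # Replicate the thermal band so it can blend with the color image.
        thermal_3ch = cv2.cvtColor(thermal, cv2.COLOR_GRAY2BGR)

        # Weighted average: warm but cryptic animals gain contrast from the
        # thermal band while color and texture from the visible band are kept.
        fused = cv2.addWeighted(visible, 0.6, thermal_3ch, 0.4, 0.0)
        cv2.imwrite("fused.png", fused)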

    Relief Displacement of Airborne Objects

    The increasing availability of unoccupied aircraft systems (UAS, also referred to as drones) has led to their use in taking vertical aerial photographs at relatively small spatial scales. These photographs can be used to measure distances between objects appearing in them. However, relief displacement can cause an object above or below ground level to appear at a point in a vertical aerial photograph that is not directly in line with the object's actual location, introducing measurement error. A UAS was used in this study as the photographed airborne object because its location and altitude could be controlled. We were interested in predicting the horizontal distance of the UAS's appearance from the centre of a vertical aerial photograph. Predicted locations of the photographed UAS in vertical aerial photographs over both level and sloped surfaces matched measured appearance distances within 0.06–0.48 m. This study shows that the relief displacement formulas typically used to compute the height of a vertical structure appearing in a vertical aerial photograph can also be used to compute the actual location of an airborne object (e.g., a flying UAS, bird, or bat) if the object's altitude is known or can be estimated.
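    As a hedged worked example of the standard relief-displacement relation the abstract invokes: displacement d = r·h/H, where r is the radial distance from the photograph's nadir point to the object's apparent position, h is the object's altitude above ground, and H is the camera's flying height above ground. Solving for the object's actual horizontal offset gives r_actual = r·(1 − h/H). The function name and sample values below are illustrative assumptions, not figures from the study:

        def actual_radial_distance(r_apparent_m: float, object_alt_m: float,
                                   camera_alt_m: float) -> float:
            """Horizontal distance from the photo nadir to the airborne object's
            true position, given its apparent (displaced) radial distance."""
            return r_apparent_m * (1.0 - object_alt_m / camera_alt_m)

        # Example: an object 30 m above ground, photographed from 120 m, that
        # appears 10 m from the nadir point is actually 7.5 m from it.
        print(actual_radial_distance(10.0, 30.0, 120.0))  # 7.5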

    On the bounds of separability in sensor networks

    No full text

    FOPID Controlled High Step-Up Super Lift DC-DC Converter With Enhanced Response

    No full text