2,324 research outputs found
Improving Drone Imagery For Computer Vision/Machine Learning in Wilderness Search and Rescue
This paper describes gaps in the acquisition of drone imagery that impair its
use with computer vision/machine learning (CV/ML) models and makes five
recommendations to maximize image suitability for CV/ML post-processing. It
describes a notional work process for the use of drones in wilderness search
and rescue incidents. The large volume of data from the wide area search phase
offers the greatest opportunity for CV/ML techniques because of the large
number of images that would otherwise have to be manually inspected. The 2023
Wu-Murad search in Japan, one of the largest missing person searches conducted
in that area, serves as a case study. Although drone teams conducting wide area
searches may not know in advance if the data they collect is going to be used
for CV/ML post-processing, there are data collection procedures that can
improve the search in general with automated collection software. If the drone
teams do expect to use CV/ML, then they can exploit knowledge about the model
to further optimize flights.
Comment: 6 pages, 4 figures
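One practical pre-flight consideration of the kind the abstract alludes to is image resolution. The sketch below (an illustration, not one of the paper's five recommendations verbatim; the camera parameters are hypothetical) estimates ground sampling distance (GSD) so a team can check whether a person-sized target would span enough pixels for CV/ML detection:

```python
# Hedged illustration: check whether planned flight altitude yields
# imagery fine enough for automated detection of a person-sized target.
def gsd_cm_per_px(sensor_width_mm, image_width_px, focal_mm, altitude_m):
    """Ground sampling distance (cm/pixel) for a nadir-pointing camera."""
    return (sensor_width_mm * altitude_m * 100.0) / (focal_mm * image_width_px)

def target_pixels(target_size_m, gsd_cm):
    """Approximate number of pixels spanned by a target of the given size."""
    return (target_size_m * 100.0) / gsd_cm

# Example with assumed camera parameters (13.2 mm sensor, 5472 px wide,
# 8.8 mm focal length) at 100 m altitude:
gsd = gsd_cm_per_px(13.2, 5472, 8.8, 100)
pixels = target_pixels(0.5, gsd)  # ~0.5 m shoulder width
```

Lowering altitude or narrowing the field of view reduces the GSD, at the cost of more flight lines over the same search area.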
Notes on the Implications of Ignoring Bayes’ Rule in Search and Rescue Practice in the UK.
Thomas Bayes (1702-1761) has had a profound influence on the science of inference since he discovered the mathematically correct way of adjusting probabilities to account for new evidence. Nonetheless, in practice it is not always clear where and when to apply the rule he derived, or what the consequences of failing to do so are. In this note, the effects of ignoring Bayes’ rule when searching an area of ground for a missing person (misper), where the chance of finding them depends both on whether they are there and on how well the ground is searched, are investigated. This investigation suggests that, within the range of probabilities that generally apply to search operations in rural settings in the UK, the widespread failure to apply Bayes’ rule may incline search managers to widen search areas more than the evidence warrants, and may thereby reduce overall search effectiveness (ceteris paribus).
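The update the note describes can be stated concretely. After an unsuccessful search of a segment, Bayes' rule discounts the probability the misper is there by the probability of detection (POD); the sketch below is a minimal worked example of that standard update, not code from the note itself:

```python
# Bayes' rule for an unsuccessful search of one segment.
# poa: prior probability of area (misper is in this segment).
# pod: probability the search would have found them if present.
def posterior_poa(poa, pod):
    """Posterior probability the misper is in the segment, given not found."""
    missed = poa * (1.0 - pod)   # present but the search missed them
    absent = 1.0 - poa           # never in the segment at all
    return missed / (missed + absent)

# e.g. a segment with prior POA 0.6 searched at POD 0.7:
p = posterior_poa(0.6, 0.7)  # drops to roughly 0.31
```

Note that a thorough unsuccessful search (high POD) shrinks the posterior sharply, whereas a cursory one (low POD) barely moves it; failing to apply the rule treats both cases alike, which is the distortion the note examines.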
Autonomous, Collaborative, Unmanned Aerial Vehicles for Search and Rescue
Search and Rescue is a vitally important subject, and one which can be improved through the use of modern technology. This work presents a number of advances aimed towards the creation of a swarm of autonomous, collaborative, unmanned aerial vehicles for land-based search and rescue. The main advances are the development of a diffusion-based search strategy for route planning, research into GPS (including the Durham Tracker Project and statistical analysis of altitude errors), and the creation of a relative positioning system (including discussion of the errors caused by fast-moving units). Overviews are also given of the current state of research into both UAVs and Search and Rescue.
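The thesis's diffusion-based strategy is not detailed in this abstract, so the sketch below shows only the generic idea under assumed simplifications: a belief map over a grid is diffused each step (wrapping at the edges via `np.roll`, a simplification), and a UAV greedily moves toward the highest-belief neighbouring cell:

```python
import numpy as np

def diffuse(belief, rate=0.2):
    """One diffusion step: spread a fraction of belief to 4-neighbours.
    np.roll wraps at the grid edges, a deliberate simplification here."""
    up    = np.roll(belief, -1, axis=0)
    down  = np.roll(belief,  1, axis=0)
    left  = np.roll(belief, -1, axis=1)
    right = np.roll(belief,  1, axis=1)
    return (1 - rate) * belief + rate * (up + down + left + right) / 4.0

def next_cell(belief, pos):
    """Greedy routing: move to the in-bounds 4-neighbour with highest belief."""
    r, c = pos
    nbrs = [(r + dr, c + dc)
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
            if 0 <= r + dr < belief.shape[0] and 0 <= c + dc < belief.shape[1]]
    return max(nbrs, key=lambda rc: belief[rc])
```

Diffusion models the growing uncertainty in a moving misper's location over time, which is why a diffusion step between routing decisions is a natural fit for search planning.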
The application of GIScience to Search and Rescue in Yosemite National Park
Park Ranger & GIS Specialist, National Park Service
PhD student, University of California, Merced
http://www.esri.com/news/arcuser/0609/yosar.html
Event detection from novel data sources: Leveraging satellite imagery alongside GPS traces
Rapid identification and response to breaking events, particularly those that
pose a threat to human life such as natural disasters or conflicts, is of
paramount importance. The prevalence of mobile devices and the ubiquity of
network connectivity has generated a massive amount of temporally- and
spatially-stamped data. Numerous studies have used mobile data to derive
individual human mobility patterns for various applications. Similarly, the
increasing number of orbital satellites has made it easier to gather
high-resolution images capturing a snapshot of a geographical area in sub-daily
temporal frequency. We propose a novel data fusion methodology integrating
satellite imagery with privacy-enhanced mobile data to augment the event
inference task, whether in real-time or historical. In the absence of boots on
the ground, mobile data is able to give an approximation of human mobility,
proximity to one another, and the built environment. On the other hand,
satellite imagery can provide visual information on physical changes to the
built and natural environment. The expected use cases for our methodology
include small-scale disaster detection (e.g., tornadoes, wildfires, and floods)
in rural regions, search and rescue operation augmentation for lost hikers in
remote wilderness areas, and identification of active conflict areas and
population displacement in war-torn states. Our implementation is open-source
on GitHub: https://github.com/ekinugurel/SatMobFusion
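The fusion logic itself is in the linked repository; the fragment below is only a minimal sketch of the core idea as the abstract states it (all names and thresholds are assumptions, not the SatMobFusion API): an event is inferred for a cell when both a mobility-density anomaly and a satellite-detected surface change are present:

```python
# Hedged sketch of multimodal event inference, not SatMobFusion code.
# mobility_z: z-score of observed mobile-device density vs. the cell's baseline.
# image_change: fraction of the cell's satellite pixels flagged as changed.
def fuse_signals(mobility_z, image_change, z_thresh=3.0, change_thresh=0.3):
    """Flag an event only when both modalities independently agree."""
    return abs(mobility_z) >= z_thresh and image_change >= change_thresh

# A sharp population drop plus visible surface change suggests an event;
# either signal alone (sensor noise, seasonal imagery change) does not.
alarm = fuse_signals(mobility_z=-4.2, image_change=0.45)
```

Requiring agreement between modalities trades some recall for precision, which matters when each alarm may trigger a costly response such as dispatching a search team.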