34 research outputs found

    CMIR-NET : A Deep Learning Based Model For Cross-Modal Retrieval In Remote Sensing

    We address the problem of cross-modal information retrieval in the domain of remote sensing. In particular, we are interested in two application scenarios: i) cross-modal retrieval between panchromatic (PAN) and multi-spectral imagery, and ii) multi-label image retrieval between very high resolution (VHR) images and speech-based label annotations. These multi-modal retrieval scenarios are more challenging than traditional uni-modal retrieval given the inherent differences in distribution between the modalities. However, with the growing availability of multi-source remote sensing data and the scarcity of semantic annotations, multi-modal retrieval has become increasingly important. In this regard, we propose a novel deep neural network based architecture that learns a discriminative shared feature space for all input modalities, suitable for semantically coherent information retrieval. Extensive experiments are carried out on the benchmark large-scale PAN–multi-spectral DSRSID dataset and the multi-label UC-Merced dataset. For the UC-Merced dataset, we also generate a corpus of speech signals corresponding to the labels. Superior performance with respect to the current state of the art is observed in all cases.
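    The core idea of a shared feature space can be illustrated with a minimal sketch: two modality-specific encoders project their inputs into a common embedding, and a contrastive objective pulls matching cross-modal pairs together. This is a hedged illustration only; the encoder sizes, the InfoNCE-style loss, and the temperature below are assumptions, not the CMIR-NET architecture.

```python
# Minimal sketch of a shared cross-modal embedding space (not CMIR-NET itself).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModalityEncoder(nn.Module):
    """Projects one modality's feature vector into the shared space."""
    def __init__(self, in_dim: int, shared_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, shared_dim),
        )

    def forward(self, x):
        # L2-normalise so retrieval can rank by cosine similarity.
        return F.normalize(self.net(x), dim=-1)

def contrastive_loss(z_a, z_b, temperature: float = 0.07):
    """Symmetric InfoNCE-style loss: paired samples (same scene seen in two
    modalities) should be closer than all other pairs in the batch."""
    logits = z_a @ z_b.t() / temperature
    targets = torch.arange(z_a.size(0), device=z_a.device)
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Usage with dummy features (PAN: 1024-d, MS: 512-d, batch of 32 pairs).
enc_pan, enc_ms = ModalityEncoder(1024), ModalityEncoder(512)
z_pan = enc_pan(torch.randn(32, 1024))
z_ms = enc_ms(torch.randn(32, 512))
loss = contrastive_loss(z_pan, z_ms)
loss.backward()
```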

    Forward method for vertical air motion estimation from frequency modulated continuous wave radar rain measurements

    Vertically pointed Frequency-Modulated Continuous-Wave (FMCW) radar measurements of rain are strongly influenced by vertical winds (vertical air motion, VAM) in convective rain scenarios. In particular, second-order products such as rain rate (RR) and drop size distribution (DSD) suffer high estimation errors due to VAM. In this work, we consider the estimation of VAM from vertically pointed FMCW radar measurements in order to correct VAM-corrupted second-order rain products. We present preliminary research on a forward method to estimate the VAM velocity at a particular height from S-band FMCW radar measurements in convective rain scenarios. The method relies on the parameterization of the DSD as a gamma distribution and estimates the VAM along with the constitutive parameters of the gamma distribution by means of a parametric solver. The methodology is tested on long-duration, high-resolution measurements from the University of Massachusetts FMCW radar and validated against a ground-based disdrometer in the context of the Verification of the Origins of Rotation in Tornadoes Experiment-Southeast (VORTEX-SE).
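    The forward-modelling idea can be sketched as follows: parameterise the DSD as a gamma distribution, map drop diameter to Doppler velocity shifted by the vertical air motion w, and fit (w, N0, mu, Lambda) to a measured Doppler spectrum with a parametric solver. The Atlas et al. (1973) fall-speed relation, the spectrum mapping, and all starting values below are illustrative assumptions, not the paper's exact formulation.

```python
# Rough sketch: gamma-DSD forward model fitted to a Doppler spectrum to recover VAM.
import numpy as np
from scipy.optimize import least_squares

def fall_speed(D_mm):
    """Still-air raindrop terminal fall speed [m/s] (Atlas et al., 1973)."""
    return 9.65 - 10.3 * np.exp(-0.6 * D_mm)

def model_spectrum(params, v_obs):
    """Reflectivity spectral density at observed Doppler velocities v_obs [m/s]."""
    w, log_N0, mu, lam = params              # w: VAM; gamma DSD parameters
    v_obs = np.asarray(v_obs, dtype=float)
    v_still = v_obs - w                      # remove the air-motion shift
    S = np.zeros_like(v_obs)
    valid = (v_still > 0.0) & (v_still < 9.6)
    D = -np.log((9.65 - v_still[valid]) / 10.3) / 0.6       # invert fall speed
    N = np.exp(log_N0) * D**mu * np.exp(-lam * D)            # gamma DSD N(D)
    dv_dD = 10.3 * 0.6 * np.exp(-0.6 * D)                    # Jacobian dv/dD
    S[valid] = D**6 * N / dv_dD                               # S(v) = D^6 N(D) / |dv/dD|
    return S

def fit_vam(v_obs, S_meas):
    """Estimate (w, log N0, mu, Lambda) by least squares against a measured spectrum."""
    x0 = np.array([0.0, np.log(8000.0), 2.0, 3.0])            # illustrative first guess
    res = least_squares(lambda p: model_spectrum(p, v_obs) - S_meas, x0,
                        bounds=([-15, 0, -1, 0.1], [15, 20, 10, 20]))
    return res.x
```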

    Gas Pressure Measurement Device and Medical Vacuum Design

    Digital pressure meters available in the market use two measurement modes, namely positive pressure and negative pressure. The positive-pressure mode is used to measure the air pressure of a sphygmomanometer and the medical gas pressure at an outlet in the treatment room, so that operators can easily or routinely check the medical gas pressure for patient safety. Based on this background, the research aimed to design a digital pressure meter equipped with a medical gas measurement mode, so that the device can be used to calibrate sphygmomanometers and suction pumps and to measure the medical gas pressure available at the medical gas outlet in each treatment room.
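    The conversion chain in such a meter is simple: an ADC reading is mapped through the sensor's transfer function to pressure, then to the units used for each mode (kPa for gas outlets, mmHg for sphygmomanometer calibration). The sketch below is a hedged illustration; the MPX5700-style transfer function, supply voltage, and ADC resolution are assumptions, since the abstract does not name the sensor used in the device.

```python
# Illustrative ADC-to-pressure conversion for a positive-pressure measurement mode.
ADC_BITS = 10          # e.g. a 10-bit microcontroller ADC (assumption)
V_SUPPLY = 5.0         # sensor supply voltage [V] (assumption)

def adc_to_kpa(adc_counts: int) -> float:
    """Convert an ADC reading to gauge pressure in kPa."""
    v_out = adc_counts / (2**ADC_BITS - 1) * V_SUPPLY
    # MPX5700-style transfer function: Vout = Vs * (0.0012858 * P + 0.04)
    return (v_out / V_SUPPLY - 0.04) / 0.0012858

def kpa_to_mmhg(p_kpa: float) -> float:
    """Sphygmomanometer calibration is usually reported in mmHg."""
    return p_kpa * 7.50062

# Example: a mid-scale ADC reading.
reading = 512
print(f"{adc_to_kpa(reading):.1f} kPa = {kpa_to_mmhg(adc_to_kpa(reading)):.0f} mmHg")
```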

    A comprehensive system for detection of flammable and toxic gases using IoT

    The majority of existing gases pose a risk to human health and the environment in general. A wide range of diseases can arise as a result of exposure to toxic and dangerous gases, which can cause death or serious health problems. Moreover, deadly explosions may occur as a result of leakages of such gases. Such consequences can be avoided when these dangerous gases are detected early. Many researchers have proposed different kinds of systems for the detection of gas leakage, but most of the proposed systems were mainly designed to detect LPG. Therefore, in this study, a system is proposed for detecting different kinds of flammable and toxic gases. The gases detectable by the proposed system include smoke, ethanol, CNG, methane, toluene, propane, carbon monoxide, acetone, hydrogen, and formaldehyde. The system can detect gases efficiently, trigger evacuation alarms, and simultaneously send SMS alerts for emergencies. The proposed system is ready to use and can be installed at any work location.
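    The detection logic amounts to polling each sensor, comparing the reading against a calibrated threshold, and raising an alarm plus an SMS notification when any threshold is exceeded. The sketch below illustrates that control flow only; the threshold values and the send_sms()/sound_alarm() helpers are placeholders, not the actual hardware or messaging interface of the proposed system.

```python
# Minimal sketch of threshold-based multi-gas alerting (placeholders, not the real device API).
from typing import Callable, Dict

THRESHOLDS_PPM: Dict[str, float] = {      # illustrative threshold values only
    "smoke": 300, "ethanol": 500, "methane": 1000,
    "carbon_monoxide": 50, "hydrogen": 400, "formaldehyde": 0.5,
}

def check_gases(readings_ppm: Dict[str, float],
                sound_alarm: Callable[[], None],
                send_sms: Callable[[str], None]) -> bool:
    """Return True and trigger alerts if any gas exceeds its threshold."""
    exceeded = {g: v for g, v in readings_ppm.items()
                if v > THRESHOLDS_PPM.get(g, float("inf"))}
    if exceeded:
        sound_alarm()
        send_sms("Gas alert: " + ", ".join(f"{g}={v} ppm" for g, v in exceeded.items()))
    return bool(exceeded)

# Example with dummy readings and print-based alert handlers.
check_gases({"methane": 1500, "carbon_monoxide": 20},
            sound_alarm=lambda: print("ALARM!"),
            send_sms=lambda msg: print("SMS:", msg))
```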

    Analysis of Multiple Scattering Characteristics of Cable-Stayed Bridges with Multi-band SAR

    Accurate localization of multiple-scattering features of cable-stayed bridges in multi-band Synthetic Aperture Radar (SAR) imagery is crucial for intelligent recognition of bridge targets within images, as well as for precise water level extraction. This study focuses on the Badong Yangtze River Bridge, utilizing Unmanned Aerial Vehicle (UAV) LiDAR data of the bridge, and analyzes the multiple-scattering characteristics of different bridge structural targets based on Geometric Optics (GO) methods and the Range-Doppler principle. Furthermore, the study integrates LiDAR data of the bridge's cable stays to examine their multiple-scattering phenomena, finding that the undulations of the Yangtze River's surface waves contribute significantly to the pronounced double-scattering features of the cable stays. Additionally, statistical analysis of multi-source SAR data indicates that this phenomenon is not directly correlated with radar wavelength, implying no direct connection to surface roughness. Utilizing LiDAR point cloud data of the bridge's street lamps, this paper proposes a novel method for estimating water level elevation by identifying the center position of the spots formed by double scattering from the lamp posts. The results show that, using TerraSAR ascending and descending orbit images, this method achieves a water level elevation accuracy of approximately 0.2 meters.
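    One simplified way to see how such a double-bounce spot can yield a water level: in side-looking SAR, a scatterer whose elevation differs from the geocoding reference by dh is displaced in ground range by roughly dh/tan(theta) toward the sensor, and the double-bounce return of a vertical post over a reflecting surface effectively sits at that surface's elevation. The sketch below is only a heavily hedged geometric illustration under those assumptions, not the paper's actual procedure; the incidence angle and offsets are made-up numbers.

```python
# Simplified layover geometry relating a double-bounce spot offset to water level.
import math

def water_level_from_offset(h_ref: float, dx_ground: float, incidence_deg: float) -> float:
    """Estimate water-surface elevation [m].

    h_ref         -- reference elevation used when geocoding the SAR scene [m]
    dx_ground     -- ground-range offset of the double-bounce spot relative to the
                     LiDAR-derived lamp-post location, positive away from the sensor [m]
    incidence_deg -- local incidence angle [deg]
    """
    # A target below the reference plane is shifted away from the sensor by
    # (h_ref - h_water) / tan(theta), so invert that relation here.
    return h_ref - dx_ground * math.tan(math.radians(incidence_deg))

# Made-up example: 175 m reference, spot 12 m down-range of the post, 40 deg incidence.
print(water_level_from_offset(175.0, 12.0, 40.0))  # ~164.9 m
```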

    ๋ฌด์ธ๋น„ํ–‰์ฒด ํƒ‘์žฌ ์—ดํ™”์ƒ ๋ฐ ์‹คํ™”์ƒ ์ด๋ฏธ์ง€๋ฅผ ํ™œ์šฉํ•œ ์•ผ์ƒ๋™๋ฌผ ํƒ์ง€ ๊ฐ€๋Šฅ์„ฑ ์—ฐ๊ตฌ

    ํ•™์œ„๋…ผ๋ฌธ(์„์‚ฌ) -- ์„œ์šธ๋Œ€ํ•™๊ต๋Œ€ํ•™์› : ํ™˜๊ฒฝ๋Œ€ํ•™์› ํ™˜๊ฒฝ์กฐ๊ฒฝํ•™๊ณผ, 2022.2. ์†ก์˜๊ทผ.์•ผ์ƒ๋™๋ฌผ์˜ ํƒ์ง€์™€ ๋ชจ๋‹ˆํ„ฐ๋ง์„ ์œ„ํ•ด, ํ˜„์žฅ ์ง์ ‘ ๊ด€์ฐฐ, ํฌํš-์žฌํฌํš๊ณผ ๊ฐ™์€ ์ „ํ†ต์  ์กฐ์‚ฌ ๋ฐฉ๋ฒ•์ด ๋‹ค์–‘ํ•œ ๋ชฉ์ ์œผ๋กœ ์ˆ˜ํ–‰๋˜์–ด์™”๋‹ค. ํ•˜์ง€๋งŒ, ์ด๋Ÿฌํ•œ ๋ฐฉ๋ฒ•๋“ค์€ ๋งŽ์€ ์‹œ๊ฐ„๊ณผ ์ƒ๋Œ€์ ์œผ๋กœ ๋น„์‹ผ ๋น„์šฉ์ด ํ•„์š”ํ•˜๋ฉฐ, ์‹ ๋ขฐ ๊ฐ€๋Šฅํ•œ ํƒ์ง€ ๊ฒฐ๊ณผ๋ฅผ ์–ป๊ธฐ ์œ„ํ•ด์„  ์ˆ™๋ จ๋œ ํ˜„์žฅ ์ „๋ฌธ๊ฐ€๊ฐ€ ํ•„์š”ํ•˜๋‹ค. ๊ฒŒ๋‹ค๊ฐ€, ์ „ํ†ต์ ์ธ ํ˜„์žฅ ์กฐ์‚ฌ ๋ฐฉ๋ฒ•์€ ํ˜„์žฅ์—์„œ ์•ผ์ƒ๋™๋ฌผ์„ ๋งˆ์ฃผ์น˜๋Š” ๋“ฑ ์œ„ํ—˜ํ•œ ์ƒํ™ฉ์— ์ฒ˜ํ•  ์ˆ˜ ์žˆ๋‹ค. ์ด์— ๋”ฐ๋ผ, ์นด๋ฉ”๋ผ ํŠธ๋ž˜ํ•‘, GPS ์ถ”์ , eDNA ์ƒ˜ํ”Œ๋ง๊ณผ ๊ฐ™์€ ์›๊ฒฉ ์กฐ์‚ฌ ๋ฐฉ๋ฒ•์ด ๊ธฐ์กด์˜ ์ „ํ†ต์  ์กฐ์‚ฌ๋ฐฉ๋ฒ•์„ ๋Œ€์ฒดํ•˜๋ฉฐ ๋”์šฑ ๋นˆ๋ฒˆํžˆ ์‚ฌ์šฉ๋˜๊ธฐ ์‹œ์ž‘ํ–ˆ๋‹ค. ํ•˜์ง€๋งŒ, ์ด๋Ÿฌํ•œ ๋ฐฉ๋ฒ•๋“ค์€ ์—ฌ์ „ํžˆ ๋ชฉํ‘œ๋กœ ํ•˜๋Š” ๋Œ€์ƒ์˜ ์ „์ฒด ๋ฉด์ ๊ณผ, ๊ฐœ๋ณ„ ๊ฐœ์ฒด๋ฅผ ํƒ์ง€ํ•  ์ˆ˜ ์—†๋‹ค๋Š” ํ•œ๊ณ„๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ๋‹ค. ์ด๋Ÿฌํ•œ ํ•œ๊ณ„๋ฅผ ๊ทน๋ณตํ•˜๊ธฐ ์œ„ํ•ด, ๋ฌด์ธ๋น„ํ–‰์ฒด (UAV, Unmanned Aerial Vehicle)๊ฐ€ ์•ผ์ƒ๋™๋ฌผ ํƒ์ง€์˜ ๋Œ€์ค‘์ ์ธ ๋„๊ตฌ๋กœ ์ž๋ฆฌ๋งค๊น€ํ•˜๊ณ  ์žˆ๋‹ค. UAV์˜ ๊ฐ€์žฅ ํฐ ์žฅ์ ์€, ์„ ๋ช…ํ•˜๊ณ  ์ด˜์ด˜ํ•œ ๊ณต๊ฐ„ ๋ฐ ์‹œ๊ฐ„ํ•ด์ƒ๋„์™€ ํ•จ๊ป˜ ์ „์ฒด ์—ฐ๊ตฌ ์ง€์—ญ์— ๋Œ€ํ•œ ๋™๋ฌผ ํƒ์ง€๊ฐ€ ๊ฐ€๋Šฅํ•˜๋‹ค๋Š” ๊ฒƒ์ด๋‹ค. ์ด์— ๋”ํ•ด, UAV๋ฅผ ์‚ฌ์šฉํ•จ์œผ๋กœ์จ, ์ ‘๊ทผํ•˜๊ธฐ ์–ด๋ ค์šด ์ง€์—ญ์ด๋‚˜ ์œ„ํ—˜ํ•œ ๊ณณ์— ๋Œ€ํ•œ ์กฐ์‚ฌ๊ฐ€ ๊ฐ€๋Šฅํ•ด์ง„๋‹ค. ํ•˜์ง€๋งŒ, ์ด๋Ÿฌํ•œ ์ด์  ์™ธ์—, UAV์˜ ๋‹จ์ ๋„ ๋ช…ํ™•ํžˆ ์กด์žฌํ•œ๋‹ค. ๋Œ€์ƒ์ง€, ๋น„ํ–‰ ์†๋„ ๋ฐ ๋†’์ด ๋“ฑ๊ณผ ๊ฐ™์ด UAV๋ฅผ ์‚ฌ์šฉํ•˜๋Š” ํ™˜๊ฒฝ์— ๋”ฐ๋ผ, ์ž‘์€ ๋™๋ฌผ, ์šธ์ฐฝํ•œ ์ˆฒ์†์— ์žˆ๋Š” ๊ฐœ์ฒด, ๋น ๋ฅด๊ฒŒ ์›€์ง์ด๋Š” ๋™๋ฌผ์„ ํƒ์ง€ํ•˜๋Š” ๊ฒƒ์ด ์ œํ•œ๋œ๋‹ค. ๋˜ํ•œ, ๊ธฐ์ƒํ™˜๊ฒฝ์— ๋”ฐ๋ผ์„œ๋„ ๋น„ํ–‰์ด ๋ถˆ๊ฐ€ํ•  ์ˆ˜ ์žˆ๊ณ , ๋ฐฐํ„ฐ๋ฆฌ ์šฉ๋Ÿ‰์œผ๋กœ ์ธํ•œ ๋น„ํ–‰์‹œ๊ฐ„์˜ ์ œํ•œ๋„ ์กด์žฌํ•œ๋‹ค. ํ•˜์ง€๋งŒ, ์ •๋ฐ€ํ•œ ํƒ์ง€๊ฐ€ ๋ถˆ๊ฐ€๋Šฅํ•˜๋”๋ผ๋„, ์ด์™€ ๊ด€๋ จ ์—ฐ๊ตฌ๊ฐ€ ๊พธ์ค€ํžˆ ์ˆ˜ํ–‰๋˜๊ณ  ์žˆ์œผ๋ฉฐ, ์„ ํ–‰์—ฐ๊ตฌ๋“ค์€ ์œก์ƒ ๋ฐ ํ•ด์ƒ ํฌ์œ ๋ฅ˜, ์กฐ๋ฅ˜, ๊ทธ๋ฆฌ๊ณ  ํŒŒ์ถฉ๋ฅ˜ ๋“ฑ์„ ํƒ์ง€ํ•˜๋Š” ๋ฐ์— ์„ฑ๊ณตํ•˜์˜€๋‹ค. UAV๋ฅผ ํ†ตํ•ด ์–ป์–ด์ง€๋Š” ๊ฐ€์žฅ ๋Œ€ํ‘œ์ ์ธ ๋ฐ์ดํ„ฐ๋Š” ์‹คํ™”์ƒ ์ด๋ฏธ์ง€์ด๋‹ค. ์ด๋ฅผ ์‚ฌ์šฉํ•ด ๋จธ์‹ ๋Ÿฌ๋‹ ๋ฐ ๋”ฅ๋Ÿฌ๋‹ (ML-DL, Machine Learning and Deep Learning) ๋ฐฉ๋ฒ•์ด ์ฃผ๋กœ ์‚ฌ์šฉ๋˜๊ณ  ์žˆ๋‹ค. ์ด๋Ÿฌํ•œ ๋ฐฉ๋ฒ•์€ ์ƒ๋Œ€์ ์œผ๋กœ ์ •ํ™•ํ•œ ํƒ์ง€ ๊ฒฐ๊ณผ๋ฅผ ๋ณด์—ฌ์ฃผ์ง€๋งŒ, ํŠน์ • ์ข…์„ ํƒ์ง€ํ•  ์ˆ˜ ์žˆ๋Š” ๋ชจ๋ธ์˜ ๊ฐœ๋ฐœ์„ ์œ„ํ•ด์„  ์ตœ์†Œํ•œ ์ฒœ ์žฅ์˜ ์ด๋ฏธ์ง€๊ฐ€ ํ•„์š”ํ•˜๋‹ค. ์‹คํ™”์ƒ ์ด๋ฏธ์ง€ ์™ธ์—๋„, ์—ดํ™”์ƒ ์ด๋ฏธ์ง€ ๋˜ํ•œ UAV๋ฅผ ํ†ตํ•ด ํš๋“ ๋  ์ˆ˜ ์žˆ๋‹ค. ์—ดํ™”์ƒ ์„ผ์„œ ๊ธฐ์ˆ ์˜ ๊ฐœ๋ฐœ๊ณผ ์„ผ์„œ ๊ฐ€๊ฒฉ์˜ ํ•˜๋ฝ์€ ๋งŽ์€ ์•ผ์ƒ๋™๋ฌผ ์—ฐ๊ตฌ์ž๋“ค์˜ ๊ด€์‹ฌ์„ ์‚ฌ๋กœ์žก์•˜๋‹ค. ์—ดํ™”์ƒ ์นด๋ฉ”๋ผ๋ฅผ ์‚ฌ์šฉํ•˜๋ฉด ๋™๋ฌผ์˜ ์ฒด์˜จ๊ณผ ์ฃผ๋ณ€ํ™˜๊ฒฝ๊ณผ์˜ ์˜จ๋„ ์ฐจ์ด๋ฅผ ํ†ตํ•ด ์ •์˜จ๋™๋ฌผ์„ ํƒ์ง€ํ•˜๋Š” ๊ฒƒ์ด ๊ฐ€๋Šฅํ•˜๋‹ค. ํ•˜์ง€๋งŒ, ์ƒˆ๋กœ์šด ๋ฐ์ดํ„ฐ๊ฐ€ ์‚ฌ์šฉ๋˜๋”๋ผ๋„, ์—ฌ์ „ํžˆ ML-DL ๋ฐฉ๋ฒ•์ด ๋™๋ฌผ ํƒ์ง€์— ์ฃผ๋กœ ์‚ฌ์šฉ๋˜๊ณ  ์žˆ์œผ๋ฉฐ, ์ด๋Ÿฌํ•œ ๋ฐฉ๋ฒ•์€ UAV๋ฅผ ํ™œ์šฉํ•œ ์•ผ์ƒ๋™๋ฌผ์˜ ์‹ค์‹œ๊ฐ„ ํƒ์ง€๋ฅผ ์ œํ•œํ•œ๋‹ค. ๋”ฐ๋ผ์„œ, ๋ณธ ์—ฐ๊ตฌ๋Š” ์—ดํ™”์ƒ๊ณผ ์‹คํ™”์ƒ ์ด๋ฏธ์ง€๋ฅผ ํ™œ์šฉํ•œ ๋™๋ฌผ ์ž๋™ ํƒ์ง€ ๋ฐฉ๋ฒ•์˜ ๊ฐœ๋ฐœ๊ณผ, ๊ฐœ๋ฐœ๋œ ๋ฐฉ๋ฒ•์ด ์ด์ „ ๋ฐฉ๋ฒ•๋“ค์˜ ํ‰๊ท  ์ด์ƒ์˜ ์ •ํ™•๋„์™€ ํ•จ๊ป˜ ํ˜„์žฅ์—์„œ ์‹ค์‹œ๊ฐ„์œผ๋กœ ์‚ฌ์šฉ๋  ์ˆ˜ ์žˆ๋„๋ก ํ•˜๋Š” ๊ฒƒ์„ ๋ชฉํ‘œ๋กœ ํ•œ๋‹ค.For wildlife detection and monitoring, traditional methods such as direct observation and capture-recapture have been carried out for diverse purposes. 
However, these methods require a large amount of time, considerable expense, and field-skilled experts to obtain reliable results. Furthermore, performing a traditional field survey can expose surveyors to dangerous situations, such as encounters with wild animals. Remote monitoring methods, such as those based on camera trapping, GPS collars, and environmental DNA sampling, have therefore been used more frequently, largely replacing traditional survey methods as the technologies have developed. But these methods still have limitations, such as the inability to cover an entire region or to detect individual targets. To overcome these limitations, the unmanned aerial vehicle (UAV) is becoming a popular tool for conducting wildlife censuses. The main benefit of UAVs is that they can detect animals remotely over a wide region with fine spatial and temporal resolution. In addition, UAVs make it possible to survey areas that are hard to access or dangerous. Besides these advantages, however, UAVs have clear limitations. Depending on the operating environment, such as the study site, flight height, and speed, the ability to detect small animals, targets in dense forest, or fast-moving animals can be limited. Flights may also be impossible in poor weather, and flight time is limited by battery capacity. Although detailed detection is not always possible, related research is advancing, and previous studies have used UAVs to detect terrestrial and marine mammals, birds, and reptiles. The most common type of data acquired by UAVs is RGB imagery. Using these images, machine-learning and deep-learning (ML-DL) methods have mainly been used for wildlife detection. ML-DL methods provide relatively accurate results, but at least 1,000 images are required to develop a proper detection model for a specific species. In addition to RGB images, thermal images can also be acquired by a UAV. The development of thermal sensor technology and the reduction in sensor prices have attracted the interest of wildlife researchers. Using a thermal camera, homeothermic animals can be detected based on the temperature difference between their bodies and the surrounding environment. Although the technology and data are new, the same ML-DL methods are typically used for animal detection, and these methods limit the use of UAVs for real-time wildlife detection in the field. Therefore, this study aims to develop an automated animal detection method using thermal and RGB image datasets and to make it usable in situ in real time while matching or exceeding the average detection accuracy of previous methods.
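    The thesis's detection stages (Sobel edge creation, contour generation, then object detection and sorting) can be condensed into a short image-processing sketch. The OpenCV calls below are real, but the blur kernel, thresholds, and area limits are illustrative assumptions, not the thesis's tuned parameters or exact pipeline.

```python
# Condensed sketch of an edge-and-contour warm-body detector on a thermal frame.
import cv2
import numpy as np

def detect_warm_objects(thermal_frame: np.ndarray,
                        min_area: int = 50, max_area: int = 5000):
    """Return bounding boxes of warm-body candidates in an 8-bit thermal frame."""
    blurred = cv2.GaussianBlur(thermal_frame, (5, 5), 0)
    # Sobel gradient magnitude highlights body/background temperature edges.
    gx = cv2.Sobel(blurred, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(blurred, cv2.CV_32F, 0, 1, ksize=3)
    edges = cv2.convertScaleAbs(cv2.magnitude(gx, gy))
    # Binarise and close gaps so each animal forms a single contour.
    _, mask = cv2.threshold(edges, 40, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep (sort) contours whose area is plausible for the target animals.
    return [cv2.boundingRect(c) for c in contours
            if min_area < cv2.contourArea(c) < max_area]
```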

    Estimation of Surface Thermal Emissivity in a Vineyard for UAV Microbolometer Thermal Cameras Using NASA HyTES Hyperspectral Thermal, and Landsat and AggieAir Optical Data

    Microbolometer thermal cameras in UAVs and manned aircraft allow for the acquisition of high-resolution temperature data, which, along with optical reflectance, contributes to the monitoring and modeling of agricultural and natural environments. Furthermore, these temperature measurements have facilitated the development of advanced models of crop water stress and evapotranspiration in precision agriculture, and of heat flux exchanges in small river streams and corridors. Microbolometer cameras capture thermal information at blackbody or radiometric settings (narrowband emissivity set to unity). While it is customary for the modeler to use assumed emissivity values (e.g., 0.96–0.99 for agricultural and environmental settings), some applications (e.g., the Vegetation Health Index) and complex models such as energy-balance-based models (e.g., evapotranspiration) could benefit from spatial estimates of surface emissivity for true or kinetic temperature mapping. In that regard, this work presents an analysis of the spectral characteristics of a microbolometer camera with regard to emissivity, along with a methodology to infer thermal emissivity spatially based on the spectral characteristics of the camera. For this work, the MODIS UCSB Emissivity Library, NASA HyTES hyperspectral emissivity, Landsat, and Utah State University AggieAir UAV surface reflectance products are employed. The methodology is applied to a commercial vineyard setting located in Lodi, California, where HyTES, Landsat, and AggieAir UAV spatial data were collected during the 2014 growing season. An assessment of the microbolometer spectral response with regard to emissivity and of the emissivity modeling performance for the study area is presented and discussed.
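    The band-integration step implied above, weighting a hyperspectral emissivity curve by the camera's relative spectral response to obtain one broadband emissivity per pixel, can be sketched briefly. This is a hedged illustration; the bell-shaped response curve and emissivity spectrum below are synthetic placeholders, not the actual camera characterisation or HyTES data used in the paper.

```python
# Sensor-weighted broadband emissivity: integral(eps * R) / integral(R) over wavelength.
import numpy as np

def sensor_weighted_emissivity(wl_um: np.ndarray,
                               emissivity: np.ndarray,
                               response: np.ndarray) -> float:
    """Weight a spectral emissivity curve by a camera's relative spectral response."""
    return np.trapz(emissivity * response, wl_um) / np.trapz(response, wl_um)

# Example with synthetic numbers: emissivity over 7.5-13.5 um with a mild dip,
# and a bell-shaped relative response peaking near 10.5 um.
wl = np.linspace(7.5, 13.5, 121)
eps = 0.97 - 0.01 * np.exp(-((wl - 9.0) / 0.5) ** 2)
resp = np.exp(-((wl - 10.5) / 2.0) ** 2)
print(f"sensor-weighted emissivity ~= {sensor_weighted_emissivity(wl, eps, resp):.3f}")
```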