Cataloging Public Objects Using Aerial and Street-Level Images – Urban Trees
Each corner of the inhabited world is imaged from multiple viewpoints with increasing frequency. Online map services like Google Maps or Here Maps provide direct access to huge amounts of densely sampled, georeferenced images from street view and aerial perspective. There is an opportunity to design computer vision systems that will help us search, catalog and monitor public infrastructure, buildings and artifacts. We explore the architecture and feasibility of such a system. The main technical challenge is combining test-time information from multiple views of each geographic location (e.g., aerial and street views). We implement two modules: det2geo, which detects the set of locations of objects belonging to a given category, and geo2cat, which computes the fine-grained category of the object at a given location. We introduce a solution that adapts state-of-the-art CNN-based object detectors and classifiers. We test our method on "Pasadena Urban Trees", a new dataset of 80,000 trees with geographic and species annotations, and show that combining multiple views significantly improves both tree detection and tree species classification, rivaling human performance.
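The multi-view combination step described above can be sketched as a geometric merge: each view proposes candidate geolocations with a confidence score, and nearby candidates are fused into one object. The helper below is a hypothetical illustration of that idea (greedy score-weighted merging within a fixed radius), not the paper's actual det2geo implementation.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def merge_detections(candidates, radius_m=3.0):
    """Greedily merge per-view detections (lat, lon, score) that fall
    within radius_m of each other, keeping a score-weighted centroid.
    radius_m = 3.0 is an assumed value, not taken from the paper."""
    merged = []
    for lat, lon, score in sorted(candidates, key=lambda c: -c[2]):
        for m in merged:
            if haversine_m(lat, lon, m["lat"], m["lon"]) < radius_m:
                w = m["score"] + score
                m["lat"] = (m["lat"] * m["score"] + lat * score) / w
                m["lon"] = (m["lon"] * m["score"] + lon * score) / w
                m["score"] = w
                break
        else:
            merged.append({"lat": lat, "lon": lon, "score": score})
    return merged
```

Two detections of the same tree from an aerial and a street view, about a metre apart, collapse into one object, while a detection a block away stays separate.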
Large-Scale Urban Tree Mapping and Species Detection Using Multiple Sensing Platforms and Deep Learning
Thesis (M.S.) -- Graduate School of Seoul National University: Department of Landscape Architecture and Rural Systems Engineering (Landscape Architecture), College of Agriculture and Life Sciences, February 2023. Advisor: Youngryel Ryu.
Precise estimation of the number of trees, individual tree locations, and species information across an entire city forms a solid foundation for enhancing ecosystem services. However, mapping individual trees at the city scale remains challenging due to heterogeneous patterns of urban tree distribution. Here, we present a novel framework that merges multiple sensing platforms and leverages various deep neural networks to produce a fine-grained urban tree map. We mapped trees and detected species relying only on RGB images taken by multiple sensing platforms (airborne, citizen, and vehicle), which fueled six deep learning models. We divided the entire process into three steps, since each platform has its own strengths. First, we produced individual tree location maps by converting the central points of bounding boxes detected in airborne imagery into actual coordinates. Since many trees were obscured by building shadows, we applied a Generative Adversarial Network (GAN) to delineate hidden trees in the airborne images. Second, we used tree bark photos collected by citizens for species mapping in urban parks and forests. Species information in the bark photos was classified automatically after non-tree parts of the images were segmented out. Third, we classified the species of roadside trees using a camera mounted on a car, augmenting our species mapping framework with street-level tree data. We estimated the distance from the car to street trees from the number of lanes detected in the images. Finally, we assessed our results against Light Detection and Ranging (LiDAR), GPS, and field data. We estimated that over 1.2 million trees exist in the 121.04 km² city and generated more accurate individual tree positions than conventional field survey methods. Among them, we detected the species of more than 63,000 trees.
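The first step, turning bounding-box centres in airborne imagery into map coordinates, can be sketched with a standard 6-coefficient affine geotransform. This is a minimal illustration assuming a GDAL-style north-up raster; the geotransform values below are hypothetical, and the thesis's actual pipeline is not reproduced here.

```python
def box_centre(xmin, ymin, xmax, ymax):
    """Centre pixel (col, row) of a detector bounding box."""
    return (xmin + xmax) / 2.0, (ymin + ymax) / 2.0

def pixel_to_geo(col, row, geotransform):
    """Map a pixel (col, row) to map coordinates using a GDAL-style
    geotransform (origin_x, pixel_w, rot_x, origin_y, rot_y, -pixel_h)."""
    gt = geotransform
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    return x, y

# Hypothetical 25 cm orthophoto anchored at an arbitrary UTM origin.
gt = (300000.0, 0.25, 0.0, 4130000.0, 0.0, -0.25)
col, row = box_centre(100, 200, 140, 260)
x, y = pixel_to_geo(col, row, gt)
```

With real imagery the geotransform would come from the image's georeferencing metadata rather than being hard-coded.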
The most frequently detected species was Prunus yedoensis (21.43 %), followed by Ginkgo biloba (19.44 %), Zelkova serrata (18.68 %), Pinus densiflora (7.55 %), and Metasequoia glyptostroboides (5.97 %). Comprehensive experimental results demonstrate that tree bark photos and street-level imagery taken by citizens and vehicles are conducive to delivering accurate and quantitative information on the distribution of urban tree species.
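The thesis estimates the camera-to-tree distance from the number of detected lanes. A plausible geometric reading is that the camera sits in its own lane and the roadside tree line begins some offset past the outermost lane; the sketch below encodes that reading with assumed constants (lane width, shoulder offset), not the thesis's actual fitted model.

```python
def distance_to_tree(n_lanes, lane_width_m=3.5, shoulder_m=1.5):
    """Rough camera-to-roadside-tree distance: the camera is assumed to
    be in the centre of its own lane, so it must cross half its lane
    plus (n_lanes - 1) full lanes plus a shoulder to reach the trees.
    All constants here are hypothetical defaults."""
    return (n_lanes - 0.5) * lane_width_m + shoulder_m
```

For a two-lane road this gives 1.5 lane widths plus the shoulder; a real system would calibrate these constants against GPS or LiDAR ground truth, as the thesis's evaluation does.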
Archeological Survey For The Temple-Belton Regional Sewer System Improvement Project, Bell County, Texas
Prewitt and Associates, Inc. (PAI) was contracted by Kasberg, Patrick, and Associates to perform an intensive archeological survey prior to the proposed installation of new sewer lines, the expansion of one lift station, and the construction of another lift station in Bell County, Texas. This investigation was conducted in April 2013 in compliance with the Texas Antiquities Code. The Temple-Belton Regional Sewer System (TBRSS) Improvement Project will construct a new 1.7-mile-long (8,730-ft-long) Shallowford Force Main sewer line from the Temple-Belton Wastewater Treatment Plant on FM 93 to the Shallowford Lift Station just north of the Leon River. The project also calls for small expansions of the Shallowford Lift Station and the Belton Lift Station.
The survey recorded two new archeological sites, 41BL1380 and 41BL1381, and revisited one previously recorded site, 41BL260. All three sites are recommended as not eligible for listing in the National Register or for designation as State Antiquities Landmarks.
The TBRSS project was put on hold in 2013 and 2014, and PAI archeologists did not conduct trenching in a 2,788-ft-long section of the force main alignment where the landowner denied right of entry. This segment in the Leon River valley has the potential to contain intact buried archeological remains in Holocene-age alluvium. Consequently, this section of the force main alignment will need to be investigated with mechanical trenching if the TBRSS project is resurrected.
Geocoding of trees from street addresses and street-level images
We introduce an approach for updating older tree inventories with geographic coordinates using street-level panorama images and a global optimization framework for tree instance matching. Geolocations of trees in inventories compiled until the early 2000s were recorded as street addresses, whereas newer inventories use GPS. Our method retrofits older inventories with geographic coordinates, allowing them to be connected with newer inventories to facilitate long-term studies on tree mortality and related questions. What makes this problem challenging is the varying number of trees per street address, the heterogeneous appearance of different tree instances in the images, ambiguous tree positions when viewed from multiple images, and occlusions. To solve this assignment problem, we (i) detect trees in Google Street View panoramas using deep learning, (ii) combine multi-view detections per tree into a single representation, and (iii) match detected trees with the trees listed per street address using a global optimization approach. Experiments for trees in 5 cities in California, USA, show that we are able to assign geographic coordinates to 38% of the street trees, which is a good starting point for long-term studies on the ecosystem services value of street trees at large scale.
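The assignment problem in step (iii) can be sketched as a globally optimal matching between detected positions and inventory records. Since only a handful of trees share one street address, a brute-force search over pairings suffices for illustration; the paper's actual optimizer is not specified here, and the coordinates and gate distance below are hypothetical.

```python
from itertools import permutations
import math

def match_trees(detected, inventory, max_dist=15.0):
    """Globally optimal assignment of detected tree positions (x, y) to
    inventory records at one street address, minimising total Euclidean
    distance by brute force (fine for a few trees per address).
    Matched pairs farther apart than max_dist are rejected afterwards."""
    assert len(detected) <= len(inventory), "pass the shorter list first"
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    best, best_cost = [], math.inf
    for perm in permutations(range(len(inventory)), len(detected)):
        cost = sum(dist(detected[i], inventory[j]) for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = list(enumerate(perm)), cost
    return [(i, j) for i, j in best if dist(detected[i], inventory[j]) <= max_dist]
```

At city scale one would replace the brute-force loop with the Hungarian algorithm (e.g. `scipy.optimize.linear_sum_assignment`), which solves the same assignment in polynomial time.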
Individual Tree Detection in Large-Scale Urban Environments using High-Resolution Multispectral Imagery
We introduce a novel deep learning method for detection of individual trees in urban environments using high-resolution multispectral aerial imagery. We use a convolutional neural network to regress a confidence map indicating the locations of individual trees, which are localized using a peak finding algorithm. Our method provides complete spatial coverage by detecting trees in both public and private spaces, and can scale to very large areas. We performed a thorough evaluation of our method, supported by a new dataset of over 1,500 images and almost 100,000 tree annotations, covering eight cities, six climate zones, and three image capture years. We trained our model on data from Southern California, and achieved a precision of 73.6% and recall of 73.3% using test data from this region. We generally observed similar precision and slightly lower recall when extrapolating to other California climate zones and image capture dates. We used our method to produce a map of trees in the entire urban forest of California, and estimated the total number of urban trees in California to be about 43.5 million. Our study indicates the potential for deep learning methods to support future urban forestry studies at unprecedented scales.
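The peak-finding step over the regressed confidence map can be sketched as finding strict local maxima above a threshold in a 2-D grid. This is a minimal stand-in for the paper's localization step; the threshold and 8-neighbourhood are assumptions, and a production version would typically use an efficient maximum filter instead of explicit loops.

```python
def find_peaks(conf, threshold=0.5):
    """Return (row, col) of strict local maxima in a 2-D confidence map
    (list of lists) whose value exceeds threshold, scanning an
    8-connected neighbourhood. threshold=0.5 is an assumed default."""
    h, w = len(conf), len(conf[0])
    peaks = []
    for r in range(h):
        for c in range(w):
            v = conf[r][c]
            if v < threshold:
                continue
            neighbours = [conf[rr][cc]
                          for rr in range(max(0, r - 1), min(h, r + 2))
                          for cc in range(max(0, c - 1), min(w, c + 2))
                          if (rr, cc) != (r, c)]
            if all(v > n for n in neighbours):
                peaks.append((r, c))
    return peaks
```

Each returned peak would then be mapped from pixel to geographic coordinates via the image's georeferencing, yielding one point per detected tree.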
Automatic Large Scale Detection of Red Palm Weevil Infestation using Aerial and Street View Images
The spread of the Red Palm Weevil has dramatically affected date growers, homeowners and governments, forcing them to deal with a constant threat to their palm trees. Early detection of palm tree infestation has proven critical for allowing treatment that may save trees from irreversible damage, and is most commonly performed by local physical access for individual tree monitoring. Here, we present a novel method for surveillance of Red Palm Weevil-infested palm trees utilizing state-of-the-art deep learning algorithms with aerial and street-level imagery data. To detect infested palm trees, we analyzed over 100,000 aerial and street-level images, mapping the location of palm trees in urban areas. Using this procedure, we discovered and verified infested palm trees at various locations.
- β¦