546 research outputs found

    Identification of Log Characteristics in Computed Tomography Images Using Back-Propagation Neural Networks with the Resilient Back-Propagation Training Algorithm and Textural Analysis: Preliminary Results

    Get PDF
    This research addressed the feasibility of identifying internal log characteristics in computed tomography (CT) images of sugar maple and black spruce logs by means of back-propagation (BP) neural networks with a resilient BP training algorithm. Five CT images were randomly sampled from each log. Three of the images were used to develop the corresponding classifier, and the remaining two were used for validation. The image features used in the classifier were gray-level values, textural features, and distance features. The key part of the classifier topology, i.e., the number of hidden nodes, was determined from four performance indicators: overall accuracy, mean square error, training iteration number, and training time. For the training images, the classifiers produced class accuracies for heartwood, sapwood, bark, and knots of 99.3, 100, 96.7, and 97.9%, respectively, for the sugar maple log, and 99.7, 95.3, 98.4, and 93.2%, respectively, for the black spruce log. Overall accuracies were 98.5% for sugar maple and 96.6% for black spruce. High overall accuracies were also achieved with the validation images of both species. The results also suggest that using textural information as input can improve classification accuracy. Moreover, the resilient BP training algorithm made the BP artificial neural networks converge faster than steepest gradient descent with momentum. This study indicates that the developed BP neural networks may be applicable to identifying internal log characteristics in CT images of sugar maple and black spruce logs.
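    The resilient back-propagation (Rprop) rule the abstract credits with faster convergence adapts a per-weight step size from the sign of successive gradients rather than their magnitude. A minimal sketch of one Rprop- update for a single weight; the constants (eta+ = 1.2, eta- = 0.5, step bounds) are the commonly used defaults, not values taken from the paper:

```python
def rprop_update(grad, prev_grad, step,
                 eta_plus=1.2, eta_minus=0.5,
                 step_max=50.0, step_min=1e-6):
    """One Rprop- update for a single weight.

    Returns (weight_delta, new_step, grad_to_remember).
    Only the *sign* of the gradient is used, never its magnitude.
    """
    if grad * prev_grad > 0:            # same direction as last time: accelerate
        step = min(step * eta_plus, step_max)
    elif grad * prev_grad < 0:          # sign flip means we overshot: back off
        step = max(step * eta_minus, step_min)
        return 0.0, step, 0.0           # skip the update, forget the gradient
    if grad > 0:
        return -step, step, grad        # always move against the gradient sign
    if grad < 0:
        return step, step, grad
    return 0.0, step, grad
```

    Two consecutive gradients of the same sign grow the step (0.1 becomes 0.12) and move the weight by that amount; a sign flip shrinks the step and holds the weight in place for one iteration.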

    Early Detection of Bark Beetle Attack Using Remote Sensing and Machine Learning: A Review

    Full text link
    Bark beetle outbreaks can have a devastating impact on forest ecosystem processes, biodiversity, forest structure and function, and economies. Accurate and timely detection of bark beetle infestations is crucial to mitigate further damage, develop proactive forest management activities, and minimize economic losses. Combining remote sensing (RS) data with machine learning (ML) or deep learning (DL) offers a strong alternative to current approaches that rely on aerial and field surveys, which are impractical over vast geographical regions. This paper provides a comprehensive review of past and current advances in the early detection of bark beetle-induced tree mortality from three key perspectives: bark beetle & host interactions, RS, and ML/DL. We parse recent literature according to bark beetle species & attack phases, host trees, study regions, imagery platforms & sensors, spectral/spatial/temporal resolutions, spectral signatures, spectral vegetation indices (SVIs), ML approaches, learning schemes, task categories, models, algorithms, classes/clusters, features, and DL networks & architectures. This review focuses on the challenge of early detection, discussing current obstacles and potential solutions. Our literature survey suggests that the performance of current ML methods is limited (less than 80%) and depends on various factors, including imagery sensors & resolutions, acquisition dates, and the employed features & algorithms/networks. More promising results from DL networks, followed by the random forest (RF) algorithm, highlight the potential to detect subtle changes in the visible, thermal, and short-wave infrared (SWIR) spectral regions.
    Comment: Under review, 33 pages, 5 figures, 8 tables
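    The spectral vegetation indices (SVIs) the review catalogs are ratios of reflectance bands; as one standard illustration (NDVI, not a specific index singled out by the review), declining near-infrared reflectance in stressed canopies pushes the index down:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance.

    Healthy foliage reflects strongly in NIR and absorbs red light,
    so NDVI drops as a canopy loses vigor (e.g., under beetle attack).
    """
    return (nir - red) / (nir + red)

healthy = ndvi(0.50, 0.05)   # vigorous canopy: high NIR, low red
stressed = ndvi(0.30, 0.15)  # illustrative stressed-canopy reflectances
```

    The reflectance values here are illustrative only; real detection thresholds depend on the sensor, resolution, and acquisition date, as the review emphasizes.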

    A Model of Plant Identification System Using GLCM, Lacunarity And Shen Features

    Get PDF
    Recently, many approaches have been introduced to identify plants. Applications of texture, shape, color, and vein features are now common practice. However, many other methods could still be developed to improve the performance of such identification systems, so several experiments were conducted in this research. As a result, a novel approach combining Gray-Level Co-occurrence Matrix (GLCM), lacunarity, and Shen features with a Bayesian classifier gives better results than other plant identification systems. For comparison, this research used two datasets commonly employed to test the performance of plant identification systems. The results show that the system achieves an accuracy of 97.19% on the Flavia dataset and 95.00% on the Foliage dataset, outperforming other approaches.
    Comment: 10 pages
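    The GLCM features in the combination above summarize how often pairs of gray levels co-occur at a fixed pixel offset. A minimal sketch of building a horizontal-offset GLCM and deriving one Haralick-style statistic (contrast); the tiny two-level image is illustrative only, not data from the paper:

```python
def glcm(img, levels, dx=1, dy=0):
    """Count co-occurrences of gray levels at offset (dx, dy)."""
    h, w = len(img), len(img[0])
    m = [[0] * levels for _ in range(levels)]
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                m[img[y][x]][img[y2][x2]] += 1
    return m

def contrast(m):
    """Haralick contrast: large when co-occurring levels differ a lot."""
    total = sum(sum(row) for row in m)
    return sum(m[i][j] * (i - j) ** 2
               for i in range(len(m)) for j in range(len(m))) / total

img = [[0, 0, 1],
       [0, 1, 1],
       [1, 1, 1]]
```

    For this image the horizontal pairs are (0,0) once, (0,1) twice, and (1,1) three times, giving a contrast of 1/3; leaf textures would use many more gray levels and several offsets.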

    A Novel Technique of Error Concealment Method Selection in Texture Images Using ALBP Classifier

    Get PDF
    There are many error concealment techniques in image processing. This paper focuses on the restoration of images with missing blocks or macroblocks. Different methods can be optimal for different kinds of images. In recent years, great attention has been dedicated to textures, and specific methods have been developed for their processing; many of them use texture classification as an integral part. Knowing the texture class is also an advantage when selecting the best restoration technique. In this paper, a selection scheme based on texture classification with advanced local binary patterns (ALBP) and the spatial distribution of dominant patterns is proposed. It is shown that, for classified textures, an optimal error concealment method can be selected from a predefined set, resulting in better restoration. For testing, three methods of extrapolation and texture synthesis were used.
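    The advanced local binary patterns (ALBP) used for classification build on the basic LBP operator, which thresholds each pixel's neighbors against the pixel itself. A minimal sketch of the basic 3x3 LBP code (the ALBP extensions and dominant-pattern statistics from the paper are not reproduced here):

```python
def lbp_code(patch):
    """Basic 3x3 LBP: threshold the 8 neighbours against the centre pixel
    and pack the results, clockwise from the top-left, into an 8-bit code."""
    c = patch[1][1]
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    code = 0
    for bit, v in enumerate(neighbours):
        if v >= c:
            code |= 1 << bit
    return code
```

    A flat patch yields code 255 (all neighbours pass the threshold), an isolated bright centre yields 0, and an edge yields a run of set bits; histograms of such codes over a block are what the texture classifier consumes.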

    Supervised learning with hybrid global optimisation methods

    Get PDF

    Artificial Neural Network Prosperities in Textile Applications

    Get PDF

    Texture recognition by using artificial neural network

    Get PDF
    This thesis describes texture recognition using an Artificial Neural Network (ANN). It is difficult to know in advance how to perform texture recognition on any new set of image data, so to ease the process, an ANN was chosen as the classifier. Thirteen types of Brodatz textures are used as the dataset for this research, with five sets per texture type at different levels of histogram equalization and noise for the training dataset. The back-propagation algorithm is used to train the ANN: after the features are extracted from the dataset, they are trained and classified using back-propagation. All in all, this project shows how the back-propagation classifier helps in texture recognition and how to increase the texture recognition success rate.
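    The training sets above vary the level of histogram equalization applied to the textures. A minimal sketch of plain histogram equalization via the cumulative distribution; this generic formulation is an assumption, since the thesis does not state which variant it applied:

```python
def equalize(img, levels=256):
    """Map each gray level through the image's cumulative distribution,
    spreading the histogram over the full [0, levels-1] range."""
    flat = [v for row in img for v in row]
    n = len(flat)
    hist = [0] * levels
    for v in flat:
        hist[v] += 1
    cdf, running = [0.0] * levels, 0
    for v in range(levels):
        running += hist[v]
        cdf[v] = running / n
    lut = [int(cdf[v] * (levels - 1)) for v in range(levels)]
    return [[lut[v] for v in row] for row in img]

out = equalize([[0, 0], [128, 255]])
```

    Here half the pixels sit at level 0, so they map to 127 (half the range), while the single pixels at 128 and 255 map to 191 and 255, stretching the dark image across the full range.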

    닀쀑 μ„Όμ‹± ν”Œλž«νΌκ³Ό λ”₯λŸ¬λ‹μ„ ν™œμš©ν•œ λ„μ‹œ 규λͺ¨μ˜ 수λͺ© 맡핑 및 μˆ˜μ’… 탐지

    Get PDF
    ν•™μœ„λ…Όλ¬Έ(석사) -- μ„œμšΈλŒ€ν•™κ΅λŒ€ν•™μ› : 농업생λͺ…κ³Όν•™λŒ€ν•™ μƒνƒœμ‘°κ²½Β·μ§€μ—­μ‹œμŠ€ν…œκ³΅ν•™λΆ€(μƒνƒœμ‘°κ²½ν•™), 2023. 2. λ₯˜μ˜λ ¬.Precise estimation of the number of trees and individual tree location with species information all over the city forms solid foundation for enhancing ecosystem service. However, mapping individual trees at the city scale remains challenging due to heterogeneous patterns of urban tree distribution. Here, we present a novel framework for merging multiple sensing platforms with leveraging various deep neural networks to produce a fine-grained urban tree map. We performed mapping trees and detecting species by relying only on RGB images taken by multiple sensing platforms such as airborne, citizens and vehicles, which fueled six deep learning models. We divided the entire process into three steps, since each platform has its own strengths. First, we produced individual tree location maps by converting the central points of the bounding boxes into actual coordinates from airborne imagery. Since many trees were obscured by the shadows of the buildings, we applied Generative Adversarial Network (GAN) to delineate hidden trees from the airborne images. Second, we selected tree bark photos collected by citizen for species mapping in urban parks and forests. Species information of all tree bark photos were automatically classified after non-tree parts of images were segmented. Third, we classified species of roadside trees by using a camera mounted on a car to augment our species mapping framework with street-level tree data. We estimated the distance from a car to street trees from the number of lanes detected from the images. Finally, we assessed our results by comparing it with Light Detection and Ranging (LiDAR), GPS and field data. We estimated over 1.2 million trees existed in the city of 121.04 kmΒ² and generated more accurate individual tree positions, outperforming the conventional field survey methods. 
    Among them, we detected the species of more than 63,000 trees. The most frequently detected species was Prunus yedoensis (21.43%), followed by Ginkgo biloba (19.44%), Zelkova serrata (18.68%), Pinus densiflora (7.55%), and Metasequoia glyptostroboides (5.97%). Comprehensive experimental results demonstrate that tree bark photos and street-level imagery taken by citizens and vehicles are conducive to delivering accurate and quantitative information on the distribution of urban tree species.

    A survey of computer-based vision systems for automatic identification of plant species (Dean Diepeveen)

    Get PDF

    Classification system for rain fed wheat grain cultivars using artificial neural network

    Get PDF
    Artificial neural network (ANN) models have found wide application, including prediction, classification, system modeling, and image processing. Image analysis based on the texture, morphology, and color features of grains is essential for various applications in the wheat grain industry and cultivation. To classify rain-fed wheat cultivars using an artificial neural network with different numbers of neurons in the hidden layers, this study was carried out at the Islamic Azad University, Shahr-e-Rey Branch, during 2010 on 6 main rain-fed wheat cultivars grown in different environments of Iran. First, 6 color features, 11 morphological features, and 4 shape factors were extracted; these candidate features were then fed to a Multilayer Perceptron (MLP) neural network. The topological structure of this MLP model consisted of 21 neurons in the input layer, 6 neurons (Sardari, Sardari 39, Zardak, Azar 2, ABR1, and Ohadi) in the output layer, and two hidden layers with different numbers of neurons (21-30-10-6, 21-30-20-6, and 21-30-30-6). Finally, the average classification accuracy for rain-fed wheat grain cultivars was 86.48%, and after feature selection with the UTA algorithm it increased to 87.22% in the 21-30-20-6 structure. The results indicate that the combination of ANN, image analysis, and the optimum model architecture 21-30-20-6 has excellent potential for cultivar classification.
    Key words: rain-fed wheat, grain, artificial neural networks (ANNs), multilayer perceptron (MLP), feature selection
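    The 21-30-20-6 topology above can be sketched as a forward pass: 21 input features, two hidden layers of 30 and 20 neurons, and 6 outputs, one per cultivar. The zero placeholder weights and the sigmoid/softmax choices are assumptions to exercise the layer shapes, not the trained network from the study:

```python
import math

def dense(x, w, b, act):
    """One fully connected layer: act(w @ x + b), in plain Python lists."""
    return [act(sum(wi * xi for wi, xi in zip(row, x)) + bi)
            for row, bi in zip(w, b)]

def softmax(z):
    """Turn raw scores into probabilities that sum to 1."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def mlp_21_30_20_6(x, weights):
    """Forward pass through the 21-30-20-6 structure described in the paper."""
    sigm = lambda v: 1 / (1 + math.exp(-v))
    h1 = dense(x, weights[0][0], weights[0][1], sigm)           # 21 -> 30
    h2 = dense(h1, weights[1][0], weights[1][1], sigm)          # 30 -> 20
    out = dense(h2, weights[2][0], weights[2][1], lambda v: v)  # 20 -> 6
    return softmax(out)  # one probability per cultivar

# Placeholder zero weights just to check the shapes end to end.
zeros = lambda rows, cols: ([[0.0] * cols for _ in range(rows)], [0.0] * rows)
weights = [zeros(30, 21), zeros(20, 30), zeros(6, 20)]
probs = mlp_21_30_20_6([0.0] * 21, weights)
```

    With all-zero weights every cultivar receives the same probability (1/6); training would replace the placeholders with weights learned by back-propagation on the 21 extracted features.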