
    Mosquito Detection with Neural Networks: The Buzz of Deep Learning

    Many real-world time-series analysis problems are characterised by scarce data. Solutions typically rely on hand-crafted features extracted from the time or frequency domain allied with classification or regression engines which condition on this (often low-dimensional) feature vector. The huge advances enjoyed by many application domains in recent years have been fuelled by the use of deep learning architectures trained on large data sets. This paper presents an application of deep learning for acoustic event detection in a challenging, data-scarce, real-world problem. Our candidate challenge is to accurately detect the presence of a mosquito from its acoustic signature. We develop convolutional neural networks (CNNs) operating on wavelet transformations of audio recordings. Furthermore, we interrogate the network's predictive power by visualising statistics of network-excitatory samples. These visualisations offer a deep insight into the relative informativeness of components in the detection problem. We include comparisons with conventional classifiers, conditioned on both hand-tuned and generic features, to stress the strength of automatic deep feature learning. Detection is achieved with performance metrics significantly surpassing those of existing algorithmic methods, as well as marginally exceeding those attained by individual human experts. Comment: For data and software related to this paper, see http://humbug.ac.uk/kiskin2017/. Submitted as a conference paper to ECML 201
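    As a toy illustration of the wavelet front-end (a hypothetical stand-in, not the transform actually used in the paper), one level of the Haar wavelet splits a signal into pairwise averages (approximation) and pairwise differences (detail); a CNN would then operate on such coefficient maps rather than raw audio:

    ```python
    def haar_level(signal):
        """One level of the Haar wavelet transform: pairwise averages
        (approximation coefficients) and pairwise differences (detail
        coefficients) over non-overlapping pairs of samples."""
        approx = [(signal[i] + signal[i + 1]) / 2
                  for i in range(0, len(signal) - 1, 2)]
        detail = [(signal[i] - signal[i + 1]) / 2
                  for i in range(0, len(signal) - 1, 2)]
        return approx, detail

    # a short example signal; approx -> [3.0, 6.0], detail -> [-1.0, 0.0]
    approx, detail = haar_level([2.0, 4.0, 6.0, 6.0])
    ```

    A multi-level scalogram would apply this recursively to the approximation, giving the 2-D time-scale input a CNN can convolve over.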

    Deep Multibranch Fusion Residual Network for Insect Pest Recognition

    Early insect pest recognition is one of the critical factors for agricultural yield. Thus, an effective method to recognize the category of insect pests has become a significant issue in the agricultural field. In this paper, we proposed a new residual block to learn multi-scale representations. Each block contains three branches: one is parameter-free, and the others contain several successive convolution layers. Moreover, we proposed a module, embedded into the new residual block, that recalibrates the channel-wise feature response and models the relationship among the three branches. By stacking this kind of block, we constructed the Deep Multi-branch Fusion Residual Network (DMF-ResNet). To evaluate model performance, we first tested our model on the CIFAR-10 and CIFAR-100 benchmark datasets. The experimental results show that DMF-ResNet outperforms the baseline models significantly. Then, we constructed DMF-ResNet with different depths for high-resolution image classification tasks and applied it to recognize insect pests. We evaluated model performance on the IP102 dataset, and the experimental results show that DMF-ResNet achieves better accuracy than the baseline models and other state-of-the-art methods. These empirical experiments demonstrate the effectiveness of our approach.
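    The block structure described above can be sketched in miniature (a toy stand-in on plain feature vectors, not the paper's convolutional implementation; the branch functions and recalibration weights here are invented for illustration): three branches, one of them a parameter-free identity, are combined under per-branch weights before the residual sum.

    ```python
    def dmf_block(x, branch_a, branch_b, weights):
        """Toy sketch of a three-branch residual block: one parameter-free
        identity branch plus two transform branches, recalibrated by
        per-branch weights (standing in for the channel-wise module),
        then added back to the input via the residual connection."""
        branches = [x, branch_a(x), branch_b(x)]
        fused = [sum(w * b[i] for w, b in zip(weights, branches))
                 for i in range(len(x))]
        return [f + xi for f, xi in zip(fused, x)]

    out = dmf_block([1.0, 2.0],
                    lambda v: [2 * t for t in v],   # stand-in conv branch
                    lambda v: [t + 1 for t in v],   # stand-in conv branch
                    weights=[0.5, 0.3, 0.2])
    ```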

    FR-ResNets for Insect Pest Recognition

    Insect pests are one of the main threats to commercially important crops. An effective insect pest recognition method can avoid economic losses. In this paper, we proposed a new and simple structure based on the original residual block, named the feature reuse residual block, which combines features from the input signal of a residual block with the residual signal. Each feature reuse residual block enhances representational capacity by learning half of the features and reusing the other half. By stacking feature reuse residual blocks, we obtained the feature reuse residual network (FR-ResNet) and evaluated its performance on the IP102 benchmark dataset. The experimental results showed that FR-ResNet achieves significant performance improvements in insect pest classification. Moreover, to demonstrate the adaptability of our approach, we applied it to various kinds of residual networks, including ResNet, Pre-ResNet, and WRN, and tested the performance on a series of benchmark datasets: CIFAR-10, CIFAR-100, and SVHN. The experimental results showed that performance improves noticeably over the original networks. These experiments on the CIFAR-10, CIFAR-100, SVHN, and IP102 benchmark datasets demonstrate the effectiveness of our approach.
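    The "learn half, reuse half" idea can be sketched as follows (a toy stand-in on plain feature vectors, with an invented transform in place of the block's convolution layers; the paper's actual channel split and fusion may differ):

    ```python
    def feature_reuse_block(x, transform):
        """Toy sketch of a feature reuse residual block: new features are
        learned for half of the channels, the other half is reused directly
        from the input, and the identity signal is added back."""
        half = len(x) // 2
        learned = transform(x)[:half]   # "learned" half of the channels
        reused = x[half:]               # half reused from the input signal
        fused = learned + reused        # channel-wise concatenation
        return [f + xi for f, xi in zip(fused, x)]  # residual connection

    out = feature_reuse_block([1.0, 2.0, 3.0, 4.0],
                              lambda v: [2 * t for t in v])
    # out == [3.0, 6.0, 6.0, 8.0]
    ```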

    A Crop Pests Image Classification Algorithm Based on Deep Convolutional Neural Network

    Conventional pest image classification methods may not be accurate due to complex farmland backgrounds, sunlight, and pest postures. To raise the accuracy, the deep convolutional neural network (DCNN), a concept from deep learning, was used in this study to classify crop pest images. Based on our experiments, in which LeNet-5 and AlexNet were used to classify pest images, we analyzed the effects of both the convolution kernel and the number of layers on the network, and redesigned the structure of the convolutional neural network for crop pests. Furthermore, 82 common pest types were classified, with the accuracy reaching 91%. The comparison to conventional classification methods proves that our method is not only feasible but also superior.
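    The core operation whose kernel size the study varies can be shown in miniature. A minimal valid (no-padding) 2-D convolution, implemented as the unflipped cross-correlation that deep-learning frameworks use, looks like this; the image and kernel values are illustrative only:

    ```python
    def conv2d(image, kernel):
        """Valid (no-padding) 2-D convolution of a single-channel image,
        as computed by the convolutional layers of LeNet-5/AlexNet-style
        networks (cross-correlation: the kernel is not flipped)."""
        kh, kw = len(kernel), len(kernel[0])
        oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
        return [[sum(image[i + u][j + v] * kernel[u][v]
                     for u in range(kh) for v in range(kw))
                 for j in range(ow)]
                for i in range(oh)]

    # a 1x2 horizontal-difference kernel responds to the vertical edge
    edge = conv2d([[0, 0, 1],
                   [0, 0, 1],
                   [0, 0, 1]],
                  [[-1, 1]])
    # edge == [[0, 1], [0, 1], [0, 1]]
    ```

    Larger kernels enlarge the receptive field per layer at a higher parameter cost, which is the trade-off the authors analyze when redesigning the network.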

    DFF-ResNet: An Insect Pest Recognition Model Based on Residual Networks

    Insect pest control is considered a significant factor in the yield of commercial crops. Thus, to avoid economic losses, we need a valid method for insect pest recognition. In this paper, we proposed a feature fusion residual block to perform the insect pest recognition task. Based on the original residual block, we fused the features from a previous layer between the two 1×1 convolution layers in the residual signal branch to improve the capacity of the block. Furthermore, we explored the contribution of each residual group to model performance. We found that adding residual blocks to earlier residual groups promotes model performance significantly, which improves the generalization capacity of the model. By stacking the feature fusion residual block, we constructed the Deep Feature Fusion Residual Network (DFF-ResNet). To prove the validity and adaptivity of our approach, we constructed it with two common residual networks (Pre-ResNet and the Wide Residual Network (WRN)) and validated these models on the Canadian Institute For Advanced Research (CIFAR) and Street View House Numbers (SVHN) benchmark datasets. The experimental results indicate that our models have a lower test error than the baseline models. We then applied our models to recognize insect pests and validated them on the IP102 benchmark dataset. The experimental results show that our models outperform the original ResNet and other state-of-the-art methods.
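    The fusion point described above, between the two 1×1 convolutions of the residual branch, can be sketched in miniature (a toy stand-in on plain feature vectors; the stand-in "convolutions", the element-wise fusion, and all values are invented for illustration):

    ```python
    def dff_block(x, prev_feature, conv1, conv2):
        """Toy sketch of a feature fusion residual block: in the residual
        branch, a previous layer's features are fused (added element-wise
        here) between the two 1x1-convolution stand-ins, and the identity
        signal is then added back."""
        r = conv1(x)
        r = [a + b for a, b in zip(r, prev_feature)]  # feature fusion
        r = conv2(r)
        return [a + b for a, b in zip(r, x)]          # residual connection

    out = dff_block([1.0, 2.0], [0.5, 0.5],
                    lambda v: [2 * t for t in v],
                    lambda v: [t + 1 for t in v])
    # out == [4.5, 7.5]
    ```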

    An Efficient Deep Learning-based approach for Recognizing Agricultural Pests in the Wild

    One of the biggest challenges that farmers face is fighting insect pests during agricultural production. The problem can be solved easily, and economic losses avoided, by taking timely preventive measures. This requires identifying insect pests in an easy and effective manner. Most insect species have strong similarities between them, so without proper help from agricultural experts it is very challenging for farmers to identify crop pests accurately. To address this issue, we have done extensive experiments considering different methods to find the best among them. This paper presents a detailed overview of the experiments done mainly on a robust dataset named IP102, including transfer learning with fine-tuning, attention mechanisms, and custom architectures. Examples from another dataset, D0, are also included to show the robustness of the experimented techniques.
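    Transfer learning with fine-tuning, as used in these experiments, typically freezes the early, generic layers of a pre-trained backbone and retrains the rest on the target data. A minimal stdlib sketch of that bookkeeping (the `Layer` class and layer names are invented; real frameworks expose an equivalent trainable flag per layer):

    ```python
    class Layer:
        """Minimal stand-in for a network layer with a trainable flag."""
        def __init__(self, name):
            self.name = name
            self.trainable = True

    def fine_tune(layers, n_frozen):
        """Freeze the first n_frozen (generic, pre-trained) layers and
        leave the remaining layers trainable for fine-tuning on the
        pest dataset."""
        for i, layer in enumerate(layers):
            layer.trainable = i >= n_frozen
        return layers

    backbone = [Layer(f"block{i}") for i in range(5)]
    fine_tune(backbone, n_frozen=3)
    trainable = [l.name for l in backbone if l.trainable]
    # trainable == ["block3", "block4"]
    ```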

    Unifying and Merging Well-trained Deep Neural Networks for Inference Stage

    We propose a novel method to merge convolutional neural networks for the inference stage. Given two well-trained networks that may have different architectures and handle different tasks, our method aligns the layers of the original networks and merges them into a unified model by sharing the representative codes of weights. The shared weights are further re-trained to fine-tune the performance of the merged model. The proposed method effectively produces a compact model that may run the original tasks simultaneously on resource-limited devices. As it preserves the general architectures and leverages the co-used weights of well-trained networks, a substantial training overhead can be reduced, shortening system development time. Experimental results demonstrate satisfactory performance and validate the effectiveness of the method. Comment: To appear in the 27th International Joint Conference on Artificial Intelligence and the 23rd European Conference on Artificial Intelligence, 2018 (IJCAI-ECAI 2018).
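    The weight-sharing idea, replacing many individual weights with a small set of representative codes shared across networks, can be sketched roughly as follows. This is only the quantize-to-shared-codes step under an invented fixed-grid codebook; the paper's actual method aligns layers, derives the codes from the trained weights, and re-trains them afterwards:

    ```python
    def build_codebook(weights, step=0.5):
        """Build a small set of representative codes by snapping each
        weight to a fixed grid (a crude stand-in for clustering the
        pooled weights of the two networks)."""
        return sorted({round(w / step) * step for w in weights})

    def share_weights(weights, codebook):
        """Replace each weight with its nearest shared code, so both
        networks index into one compact codebook."""
        return [min(codebook, key=lambda c: abs(c - w)) for w in weights]

    # weights pooled from two well-trained networks (illustrative values)
    pooled = [0.12, 0.48, 0.51, -0.27, 1.02]
    codes = build_codebook(pooled)
    shared = share_weights(pooled, codes)
    # shared == [0.0, 0.5, 0.5, -0.5, 1.0]
    ```

    Storing only code indices plus the codebook is what makes the merged model compact enough for resource-limited devices.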

    Comparison of CNN Models With Transfer Learning in the Classification of Insect Pests

    Insect pests are an important problem to overcome in agriculture. The purpose of this research is to classify insect pests in the IP102 dataset using several pre-trained CNN models and to choose the best model for classifying insect pest data. The method used is transfer learning with a fine-tuning approach. Transfer learning was chosen because this technique can reuse the features and weights obtained during previous training, so computation time can be reduced and accuracy can be increased. The models used include Xception, MobileNetV3L, MobileNetV2, DenseNet-201, and InceptionV3. Fine-tuning and layer-freezing techniques are also used to improve the quality of the resulting model, making it more accurate and better suited to the problem at hand. This study uses 75,222 images with 102 classes. With fine-tuning, DenseNet-201 produces an accuracy of 70%, MobileNetV2 66%, MobileNetV3L 68%, InceptionV3 67%, and Xception 69%. The conclusion of this study is that the transfer learning method with the fine-tuning approach produces the highest accuracy, 70%, with the DenseNet-201 model.
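    The model-selection step reduces to picking the highest reported accuracy. Using the fine-tuned figures quoted in the abstract:

    ```python
    # fine-tuned top-1 accuracies on IP102 as reported in the study
    accuracy = {
        "DenseNet-201": 0.70,
        "Xception": 0.69,
        "MobileNetV3L": 0.68,
        "InceptionV3": 0.67,
        "MobileNetV2": 0.66,
    }

    # choose the model with the highest accuracy
    best = max(accuracy, key=accuracy.get)
    # best == "DenseNet-201"
    ```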