60 research outputs found
Weed Mapping with UAS Imagery and a Bag of Visual Words Based Image Classifier
Weed detection in aerial images is a considerable challenge when generating field maps for site-specific plant protection. The requirements can be met with low-altitude flights of unmanned aerial vehicles (UAVs), which provide ground resolutions adequate for accurately differentiating even single weeds. The following study proposed and tested an image classifier based on a Bag of Visual Words (BoVW) framework for mapping weed species, using a small unmanned aircraft system (UAS) with a commercial camera on board at low flying altitudes. The image classifier was trained with support vector machines after building a visual dictionary of local features from many collected UAS images. Window-based processing of the models was used to map weed occurrences in the UAS imagery. The UAS flight campaign was carried out over a weed-infested wheat field, and images were acquired at flight altitudes between 1 and 6 m. From the UAS images, 25,452 weed plants were annotated at species level, along with wheat and soil as background classes, for training and validation of the models. The results showed that the BoVW model allowed the discrimination of single plants with high accuracy for Matricaria recutita L. (88.60%), Papaver rhoeas L. (89.08%), Viola arvensis M. (87.93%), and winter wheat (94.09%) within the generated maps. Regarding site-specific weed control, the classified UAS images would enable the selection of the right herbicide based on the distribution of the predicted weed species. © 2018 by the authors
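The BoVW pipeline outlined in this abstract (local features, a clustered visual dictionary, histogram encoding, an SVM) can be illustrated in a few lines. This is a minimal sketch on synthetic descriptors, not the authors' implementation; the 16-D features, 32-word dictionary and RBF kernel are assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for local image descriptors (e.g. SIFT-like vectors):
# each "image" yields a variable number of 16-D local features.
def make_image_features(class_mean, n_feats):
    return class_mean + rng.normal(scale=0.5, size=(n_feats, 16))

class_means = [np.full(16, c) for c in (0.0, 2.0, 4.0)]  # 3 classes
images, labels = [], []
for label, mean in enumerate(class_means):
    for _ in range(20):
        images.append(make_image_features(mean, rng.integers(30, 60)))
        labels.append(label)

# 1) Build the visual dictionary by clustering all local features.
kmeans = KMeans(n_clusters=32, n_init=5, random_state=0)
kmeans.fit(np.vstack(images))

# 2) Encode each image as a normalized histogram of visual words.
def bovw_histogram(feats):
    words = kmeans.predict(feats)
    hist = np.bincount(words, minlength=32).astype(float)
    return hist / hist.sum()

X = np.array([bovw_histogram(f) for f in images])
y = np.array(labels)

# 3) Train the SVM classifier on the histograms.
clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
train_acc = clf.score(X, y)
```

In a real system, the synthetic descriptors would be replaced by local features extracted from the UAS image tiles, and accuracy would be measured on held-out data.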
Rapid and low-cost insect detection for analysing species trapped on yellow sticky traps
While insect monitoring is a prerequisite for precise decision-making in integrated pest management (IPM), it is time- and cost-intensive. Low-cost, time-saving and easy-to-operate tools for automated monitoring will therefore play a key role in the increased acceptance and application of IPM in practice. In this study, we tested the differentiation of two whitefly species and their natural enemies trapped on yellow sticky traps (YSTs) via image processing approaches under practical conditions. Using the bag of visual words (BoVW) algorithm, accurate differentiation between the natural enemies and the species Trialeurodes vaporariorum and Bemisia tabaci was possible, whereas B. tabaci could not be reliably differentiated from T. vaporariorum. The decay of trapped specimens was taken into account by using fresh and aged catches of all species on the YSTs, and different pooling scenarios were applied to enhance model performance. The best performance was reached when fresh and aged individuals were used together and the whitefly species were pooled into one category for model training. With an independent dataset of photos from YSTs placed in greenhouses, and consequently with a naturally occurring species mixture as the background, a differentiation rate of more than 85% was reached for natural enemies and whiteflies
Growth Height Determination of Tree Walls for Precise Monitoring in Apple Fruit Production Using UAV Photogrammetry
In apple cultivation, spatial information about phenotypic characteristics of tree walls would be beneficial for precise orchard management. Unmanned aerial vehicles (UAVs) can collect 3D structural information of ground surface objects at high resolution in a cost-effective and versatile way by using photogrammetry. The aim of this study is to delineate tree wall height information in an apple orchard applying a low-altitude flight pattern specifically designed for UAVs. This flight pattern implies small distances between the camera sensor and the tree walls when the camera is positioned in an oblique view toward the trees. In this way, it is assured that the depicted tree crown wall area will be largely covered with a larger ground sampling distance than that recorded from a nadir perspective, especially regarding the lower crown sections. Overlapping oblique view images were used to estimate 3D point cloud models by applying structure-from-motion (SfM) methods to calculate tree wall heights from them. The resulting height models were compared with ground-based light detection and ranging (LiDAR) data as reference. It was shown that the tree wall profiles from the UAV point clouds were strongly correlated with the LiDAR point clouds of two years (2018: R2 = 0.83; 2019: R2 = 0.88). However, underestimation of tree wall heights was detected with mean deviations of −0.11 m and −0.18 m for 2018 and 2019, respectively. This is attributed to the weaknesses of the UAV point clouds in resolving the very fine shoots of apple trees. Therefore, the shown approach is suitable for precise orchard management, but it underestimated vertical tree wall expanses, and widened tree gaps need to be accounted for
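The profile comparison reported above can be illustrated with a small sketch: rasterize each point cloud into along-row bins, take the maximum height per bin as the tree wall height, then correlate the two profiles and compute the mean deviation. The synthetic clouds, bin size, and bias magnitude below are assumptions, not the study's processing chain:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for UAV-SfM and LiDAR point clouds of one tree row:
# columns are (x_along_row_m, z_height_m).
x = rng.uniform(0.0, 30.0, size=20_000)
true_wall = 2.5 + 0.5 * np.sin(x / 3.0)          # "true" crown height profile
lidar = np.column_stack([x, true_wall * rng.uniform(0.7, 1.0, x.size)])
# UAV cloud misses fine shoots -> slightly lower tops (systematic bias).
uav = np.column_stack([x, (true_wall - 0.15) * rng.uniform(0.7, 1.0, x.size)])

def wall_height_profile(cloud, bin_m=0.5, length_m=30.0):
    """Maximum point height per along-row bin."""
    edges = np.arange(0.0, length_m + bin_m, bin_m)
    idx = np.digitize(cloud[:, 0], edges) - 1
    return np.array([cloud[idx == i, 1].max() for i in range(len(edges) - 1)])

h_uav = wall_height_profile(uav)
h_lidar = wall_height_profile(lidar)

r2 = np.corrcoef(h_uav, h_lidar)[0, 1] ** 2   # agreement of the two profiles
bias = (h_uav - h_lidar).mean()               # mean deviation (underestimation)
```

With dense bins, the per-bin maximum approaches the true crown top, so the simulated UAV bias re-emerges as a negative mean deviation, mirroring the underestimation the study attributes to unresolved fine shoots.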
Spectral shift as advanced index for fruit chlorophyll breakdown
The decline of fruit chlorophyll is a valuable indicator of fruit ripeness. Fruit chlorophyll content can be nondestructively estimated by UV/VIS spectroscopy at fixed wavelengths. However, this approach cannot explain the complex changes in chlorophyll catabolism during fruit ripening. We introduce the apparent peak position of the red-band chlorophyll absorption as a new qualitative spectral indicator. Climacteric fruit (apple: n = 24, mango: n = 38, tomato: n = 48) were analysed at different ripeness stages. The peak position and corresponding intensity values were determined between 650 and 690 nm in nondestructively measured fruit spectra as well as in corresponding spectra of fruit extracts. In the extracts, the individual contents of chlorophyll a, chlorophyll b, pheophytin a and carotenoids were analysed photometrically, using an established iterative multiple linear regression approach. Nondestructively measured peak positions shifted unimodally in all three fruit species, with significant shifts between fruit ripeness classes of at most 2.00 ± 0.27 nm (mean ± standard error) in tomato and 0.57 ± 0.11 nm in apple. Peak positions in extract spectra were related to varying pigment ratios (Rmax = −0.91), considering individual pigments in the pool. The peak intensities in both spectral readings, nondestructive and fruit extracts, were correlated with absolute chlorophyll contents with Rmax = −0.84 and Rmax = 1.00, respectively. The introduced spectral marker of the apparent peak position of chlorophyll absorbance bears the potential for an advanced information gain from nondestructive spectra for the determination of fruit ripeness
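Estimating the apparent peak position from a sampled spectrum amounts to locating the extremum between 650 and 690 nm; a three-point parabolic fit around the discrete maximum recovers sub-nanometre shifts from 1 nm sampling. A sketch with a synthetic Gaussian absorption band (all band parameters are assumptions, not the measured fruit spectra):

```python
import numpy as np

wavelengths = np.arange(600.0, 750.0, 1.0)  # nm, 1 nm sampling

def absorption_band(center_nm, amplitude=1.0, width_nm=12.0):
    """Synthetic chlorophyll red-band absorption spectrum."""
    return amplitude * np.exp(-0.5 * ((wavelengths - center_nm) / width_nm) ** 2)

def apparent_peak(spectrum, lo=650.0, hi=690.0):
    """Peak position (nm) in [lo, hi], refined by a 3-point parabolic fit."""
    mask = (wavelengths >= lo) & (wavelengths <= hi)
    wl, sp = wavelengths[mask], spectrum[mask]
    i = int(np.argmax(sp))
    if 0 < i < len(sp) - 1:
        y0, y1, y2 = sp[i - 1], sp[i], sp[i + 1]
        i = i + 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)  # sub-sample refinement
    return float(np.interp(i, np.arange(len(wl)), wl))

# A ripeness-related shift of 1.4 nm in the band centre is recovered
# even though the spectrum is sampled only every 1 nm:
shift = apparent_peak(absorption_band(677.4)) - apparent_peak(absorption_band(676.0))
```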
Automated Weed Detection in the Field - Possibilities and Limits
Unmanned aerial vehicles (UAVs) have become ubiquitous tools for generating high-resolution spatial data of agricultural cropland. Their use in proximal remote sensing of vegetation opens up a range of possibilities that will play an essential role in future site-specific crop protection. In an increasingly precise agriculture, interest in innovative technologies is growing, and the focus is gradually shifting from agricultural research to practical application. This requires a rapid and thorough evaluation of the possibilities and limits of UAV-based methods. While spectral sensors on UAVs are already used to map the nutrient or water demand of a crop per unit area, image-based weed detection from the air is far more complex and therefore not yet relevant in practice. On the one hand, the spectral differences between weeds and crop plants are too small for reliable discrimination, and object-based approaches neither separate plants at species level nor are they sufficiently adapted to the morphological changes across growth stages. On the other hand, reliable solutions for stable small-scale navigation are lacking, as required for optically adequate imaging at a constant low flight altitude over varying terrain profiles. To assess the possibilities and limits of automated weed detection with respect to the required imaging performance, measurement flights were carried out with a hexacopter at an altitude of 5 m over fields with varying weed infestation. The flight altitude was maintained by a GPS-controlled autopilot. Aerial images were captured over georeferenced points at which weeds were simultaneously assessed manually. The required optical ground resolution (mm/pixel) was estimated by manually counting weeds in the images on a PC and comparing the counts with the field assessment data
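The optical ground resolution required for such low-altitude flights follows from the pinhole relation GSD = altitude × pixel pitch / focal length. A quick sketch (the camera parameters are illustrative assumptions, not the hexacopter's actual payload):

```python
def ground_sampling_distance_mm(altitude_m, pixel_pitch_um, focal_length_mm):
    """Ground footprint of one pixel (mm/pixel) for a nadir-looking camera."""
    # Similar triangles: GSD = H * p / f, with units made consistent (mm).
    return altitude_m * 1000.0 * (pixel_pitch_um / 1000.0) / focal_length_mm

# Example: 5 m altitude, 4.8 um pixel pitch, 16 mm lens -> 1.5 mm/pixel,
# i.e. a seedling of 15 mm diameter spans about 10 pixels.
gsd = ground_sampling_distance_mm(5.0, 4.8, 16.0)
```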
Optimized Deep Learning Model as a Basis for Fast UAV Mapping of Weed Species in Winter Wheat Crops
Weed maps should be available quickly, reliably, and with high detail to be useful for site-specific management in crop protection and to promote more sustainable agriculture by reducing pesticide use. Here, the optimization of a deep residual convolutional neural network (ResNet-18) for the classification of weed and crop plants in UAV imagery is proposed. The target was to reach sufficient performance on an embedded system by maintaining the same features of the ResNet-18 model as a basis for fast UAV mapping. This would enable online recognition and subsequent mapping of weeds during UAV flying operation. Optimization was achieved mainly by avoiding the redundant computations that arise when a classification model is applied on overlapping tiles in a larger input image. The model was trained and tested with imagery obtained from a UAV flight campaign at low altitude over a winter wheat field, and classification was performed at species level with the weed species Matricaria chamomilla L., Papaver rhoeas L., Veronica hederifolia L., and Viola arvensis ssp. arvensis observed in that field. The ResNet-18 model with the optimized image-level prediction pipeline reached a performance of 2.2 frames per second with an NVIDIA Jetson AGX Xavier on the full-resolution UAV image, which would amount to an area output of about 1.78 ha h−1 for continuous field mapping. The overall accuracy for determining crop, soil, and weed species was 94%. There were some limitations in the detection of species unknown to the model. When shifting from 16-bit to 32-bit model precision, no improvement in classification accuracy was observed, but there was a strong decline in speed performance, especially when a higher number of filters was used in the ResNet-18 model. Future work should be directed towards integrating the mapping process on UAV platforms, guiding UAVs autonomously for mapping purposes, and ensuring the transferability of the models to other crop fields
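The reported area output is simply the product of frame rate and ground footprint processed per frame. A back-of-the-envelope check (the per-frame footprint of about 2.25 m² is inferred here from the reported 2.2 fps and 1.78 ha/h; it is not stated in the abstract):

```python
def mapping_rate_ha_per_h(frames_per_s, frame_footprint_m2):
    """Continuous mapping capacity in hectares per hour."""
    # m^2 per second -> m^2 per hour -> hectares (1 ha = 10,000 m^2).
    return frames_per_s * frame_footprint_m2 * 3600.0 / 10_000.0

# 2.2 frames/s over an assumed ~2.25 m^2 footprint per frame:
rate = mapping_rate_ha_per_h(2.2, 2.25)
```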
Drone-based weed monitoring with an image feature classifier
Site-specific weed management requires information on the number and distribution of different weed species per unit area. If this prerequisite is met, herbicide application can be adapted in dose and product choice to the spatially variable weed situations of agricultural fields. In addition to online detection on tractors or field sprayers, autonomously flying sensor platforms will be used in the future, whose high-resolution aerial images form the basis for generating species-specific weed maps. This would provide sufficient information to determine the exact amounts of plant protection products before application and to reduce leftover quantities. For weed recognition itself, machine learning methods are increasingly being adopted, which further advance object-based classification using distinctive features of many weed species. While spectral-optical classifiers are already used intensively to spatially resolve variable nutrient and water deficits, object-based classification has not yet reached its full potential for the species-specific discrimination of key weeds. In this study, a new object-based approach to weed recognition was tested. Different plant species were classified with a bag-of-visual-words (BoVW) framework on the basis of high-resolution aerial images from unmanned aerial vehicles (UAVs). BoVW is an object-based image classifier that has recently gained interest in agricultural research; in our trials it was applied in laboratory tests and field trials for automatic weed sampling with digital cameras. The results showed that the BoVW model enables species-specific discrimination between Matricaria recutita L. and Papaver rhoeas L. with good accuracy when the crop (Triticum aestivum L.) and soil are classified in parallel. To generate practice-relevant weed maps as a basis for future site-specific herbicide application, further weed species must be integrated into the classifier, and the robustness of the classifier must be evaluated with more crops and weed species, acknowledging the natural plant variability observed in the fields; corresponding investigations are ongoing
Detection of the Cherry Fruit Fly (Rhagoletis cerasi L.) in Images from Yellow Sticky Traps Using Deep Learning Methods
Insect populations appear with high spatial, temporal and species-specific diversity in orchards. One of the many monitoring tools for pest management is the manual assessment of sticky traps. However, this type of assessment is laborious and time-consuming, so that only a few locations in an orchard can be monitored. The aim of this study is to test state-of-the-art object detection algorithms from deep learning to automatically detect cherry fruit flies (Rhagoletis cerasi), a common insect pest in cherry plantations, in images from yellow sticky traps. An image annotation database was built with images taken from yellow sticky traps containing more than 1600 annotated cherry fruit flies. For better handling in the computational algorithms, the images were augmented into smaller ones with the standard image preparation methods "flipping" and "cropping" before performing deep learning. Five deep learning image recognition models were tested, including Faster Region-based Convolutional Neural Network (R-CNN) with two different methods of pretraining, Single Shot Detector (SSD), RetinaNet, and You Only Look Once version 5 (YOLOv5). The Faster R-CNN and RetinaNet models outperformed the others with a detection average precision of 0.9. The results indicate that deep learning can act as an integral component of an automated system for high-throughput assessment of pest insects in orchards. This can reduce the time spent on repetitive and laborious trap assessment and also increase the number of sticky traps that can be observed
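Scoring such detectors relies on matching predicted and annotated boxes by intersection over union (IoU) before average precision is computed. A minimal IoU function, assuming the common (x1, y1, x2, y2) corner convention:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap extents, clamped to zero when the boxes do not intersect.
    ix = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    iy = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = ix * iy
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

# Two unit squares overlapping by half share a third of their union:
score = iou((0, 0, 1, 1), (0.5, 0, 1.5, 1))
```

A detection typically counts as a true positive when its IoU with an annotated fly exceeds a threshold such as 0.5; average precision then summarizes precision over recall.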
Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery
Monitoring the dynamics in wheat crops requires near-term observations with high spatial resolution due to the complex factors influencing wheat growth variability. We studied the prospects for monitoring the biophysical parameters and nitrogen status in wheat crops with low-cost imagery acquired from unmanned aerial vehicles (UAV) over an 11 ha field. Flight missions were conducted at approximately 50 m in altitude with a commercial copter and camera system; three missions were performed between booting and maturing of the wheat plants and one mission after tillage. Ultra-high resolution orthoimages of 1.2 cm·px−1 and surface models were generated for each mission from the standard red, green and blue (RGB) aerial images. The image variables were extracted from image tone and surface models, e.g., RGB ratios, crop coverage and plant height. During each mission, 20 plots within the wheat canopy with 1 × 1 m² sample support were selected in the field, and the leaf area index, plant height, fresh and dry biomass and nitrogen concentrations were measured. From the generated UAV imagery, we were able to follow the changes in early senescence at the individual plant level in the wheat crops. Changes in the pattern of the wheat canopy varied drastically from one mission to the next, which supported the need for instantaneous observations, as delivered by UAV imagery. The correlations between the biophysical parameters and image variables were highly significant during each mission, and the regression models calculated with the principal components of the image variables yielded R2 values between 0.70 and 0.97. In contrast, the models of the nitrogen concentrations yielded low R2 values with the best model obtained at flowering (R2 = 0.65). The nitrogen nutrition index was calculated with an accuracy of 0.10 to 0.11 NNI for each mission. For all models, information about the surface models and image tone was important. 
We conclude that low-cost RGB UAV imagery will strongly aid farmers in observing biophysical characteristics, but it is limited for observing the nitrogen status within wheat crops
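The regression on principal components of correlated image variables, as used for the biophysical models above, can be sketched as follows; the synthetic data, the two components and the 20 plots mirror the setup only loosely and are assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)

# Synthetic "image variables" per plot: six correlated predictors
# (e.g. RGB ratios, coverage, plant height) driven by a 2-D canopy state.
n_plots = 20
latent = rng.normal(size=(n_plots, 2))                 # underlying canopy state
X = latent @ rng.normal(size=(2, 6)) + 0.1 * rng.normal(size=(n_plots, 6))
# Synthetic biophysical target (e.g. biomass), linear in the canopy state.
y = 1.5 * latent[:, 0] - 0.8 * latent[:, 1] + 0.1 * rng.normal(size=n_plots)

# PCA compresses the correlated predictors; regression runs on the components.
model = make_pipeline(PCA(n_components=2), LinearRegression())
model.fit(X, y)
r2 = model.score(X, y)  # coefficient of determination over the plots
```

Using principal components instead of the raw image variables avoids the collinearity among RGB ratios and height metrics that would otherwise destabilize the regression coefficients.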