An All-In-One, Low-Cost Photogrammetry Rig for 3D Plant Modelling and Phenotyping

Abstract

Photogrammetry, the science of generating 3D models of objects from photographs, offers a comprehensive method for acquiring, studying, and analyzing detailed information about object structure. Using the cost-effective Structure from Motion (SfM) technique, 3D models can be generated from numerous 2D images taken from various angles. Point clouds are a standard format for 3D data produced by depth sensors such as LiDAR and RGB-D cameras. Despite their utility, high-quality 3D scanners, costing upwards of $70,000, remain prohibitively expensive for many researchers and practitioners in the agricultural sector. In response, we have developed a low-cost, close-range photogrammetry rig, priced at $2,600, to support agronomists, plant scientists, and breeders. This work outlines the development of our device, which integrates a multi-camera system built around the Arducam 64MP Autofocus Quad-Camera Kit, a rotary table from Ortery, and a Raspberry Pi for control and processing. Our scanner efficiently captures detailed 3D plant data, offering a valuable tool for non-destructive, automatic, and robust 3D phenotyping. The device can be used across various applications, including growth monitoring and the extraction of plant traits. Specifically, we have used the device to measure the canopy volume of different wheat genotypes by computing the convex hull of the 3D data. Furthermore, through our photogrammetry rig, we have developed a high-throughput, quantitative trait index for wheat to distinguish planophile and erectophile canopy architectures.

Funding: Mitacs
Degree: Master of Science in Applied Computer Science
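The canopy-volume measurement described above (computing the convex hull of the reconstructed 3D data) can be sketched as follows. This is a minimal illustration using SciPy's `ConvexHull`; the synthetic random point cloud is a hypothetical stand-in for real scanner output, and the thesis's actual pipeline and point-cloud format are not specified here.

```python
import numpy as np
from scipy.spatial import ConvexHull  # Qhull-based convex hull

# Hypothetical stand-in for a reconstructed plant point cloud:
# an (N, 3) array of x/y/z coordinates (metres assumed).
rng = np.random.default_rng(42)
points = rng.random((500, 3))

# The convex hull encloses the entire point cloud; its volume serves
# as a simple, rotation-invariant proxy for canopy volume.
hull = ConvexHull(points)
canopy_volume = hull.volume   # cubic metres, given metre-scaled input
canopy_surface = hull.area    # hull surface area, sometimes also reported
```

Note that the convex hull deliberately smooths over gaps between leaves, so it measures the envelope a canopy occupies rather than the volume of plant tissue itself, which is what makes it a useful architecture-level trait.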


This paper was published in WinnSpace Repository.


Licence: info:eu-repo/semantics/openAccess