Fully-automated root image analysis (faRIA)
Authors
Thomas Altmann
Evgeny Gladilin
Michael Henke
Astrid Junker
Narendra Narisetti
Jörn Ostermann
Christiane Seiler
Publication date
1 January 2021
Publisher
London: Nature Publishing Group
Abstract
High-throughput root phenotyping in soil has become an indispensable quantitative tool for assessing the effects of climatic factors and molecular perturbation on plant root morphology, development and function. To efficiently analyse a large number of structurally complex soil-root images, advanced methods for automated image segmentation are required. Due to the often unavoidable overlap between the intensities of foreground and background regions, simple thresholding methods are generally not suitable for the segmentation of root regions. Higher-level cognitive models such as convolutional neural networks (CNN) provide capabilities for segmenting roots from heterogeneous and noisy background structures; however, they require a representative set of manually segmented (ground truth) images. Here, we present a GUI-based tool for fully automated quantitative analysis of root images using a pre-trained CNN model, which relies on an extension of the U-Net architecture. The developed CNN framework was designed to efficiently segment root structures of different size, shape and optical contrast using low-budget hardware systems. The CNN model was trained on a set of 6465 masks derived from 182 manually segmented near-infrared (NIR) maize root images. Our experimental results show that the proposed approach achieves a Dice coefficient of 0.87 and outperforms existing tools (e.g., SegRoot, with a Dice coefficient of 0.67), not only on NIR images but also on other imaging modalities and plant species, such as barley and arabidopsis soil-root images from LED-rhizotron and UV imaging systems, respectively. In summary, the developed software framework enables users to efficiently analyse soil-root images in an automated manner (i.e. without manual interaction with data and/or parameter tuning), providing quantitative plant scientists with a powerful analytical tool. © 2021, The Author(s)
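The Dice coefficient reported in the abstract (0.87 for the proposed tool vs. 0.67 for SegRoot) measures the overlap between a predicted segmentation mask and a ground-truth mask: twice the intersection divided by the total foreground pixel count. As a minimal sketch (not the paper's implementation), it can be computed over flattened binary masks like this:

```python
def dice_coefficient(pred, truth):
    """Dice coefficient between two flat binary masks: 2*|A ∩ B| / (|A| + |B|).

    `pred` and `truth` are equal-length sequences of 0/1 pixel labels.
    Returns 1.0 for perfect overlap, 0.0 for no overlap.
    """
    if len(pred) != len(truth):
        raise ValueError("masks must have the same number of pixels")
    # Pixels labelled foreground in both masks.
    intersection = sum(1 for p, t in zip(pred, truth) if p and t)
    total = sum(pred) + sum(truth)
    if total == 0:
        return 1.0  # both masks empty: treated as perfect agreement by convention
    return 2.0 * intersection / total
```

For example, `dice_coefficient([1, 1, 0, 0], [1, 0, 0, 0])` gives 2·1 / (2 + 1) ≈ 0.667. Unlike raw pixel accuracy, the Dice score is insensitive to the large background regions that dominate soil-root images, which is why it is the standard metric for this kind of segmentation benchmark.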
Available Versions
Hochschulbibliothekszentrum des Landes Nordrhein-Westfalen (hbz): oai:frl.publisso.de:frl:643826... (last updated 09/12/2022)
Institutionelles Repositorium der Leibniz Universität Hannover: oai:www.repo.uni-hannover.de:1... (last updated 01/11/2022)
Directory of Open Access Journals: oai:doaj.org/article:a4060d5db... (last updated 16/11/2021)