New phenotyping approach analyzes crop traits at the 3D level
Researchers devise a novel method for the accurate quantification of crop traits using multi-source data and time-series images
Peer-Reviewed Publication

The steady decline in cultivable land, driven by a rapidly growing global population, has made efficient plant breeding methods essential for improving agricultural yields. However, genetic methods alone are not enough; breeders also need ways to monitor and improve complex crop traits. To this end, plant scientists use various cutting-edge imaging techniques to quantify crop traits such as height, leaf shape, and leaf color. Traditional imaging methods, however, are tedious, destructive, and unsustainable. Moreover, since plants occupy three-dimensional (3D) space, accurate trait estimation is difficult from two-dimensional (2D) images.
High-throughput phenotyping (HTP) platforms enable field data collection on a regular basis, capturing images with RGB (red, green, and blue) cameras and Light Detection and Ranging (LiDAR) sensors. The RGB camera produces high-resolution images from which traits such as canopy structure and the number and appearance of specific plant organs can be extracted. Unlike RGB cameras, however, LiDAR is unaffected by lighting conditions, which is why it is widely used in self-driving vehicles for mapping and navigation. So, can LiDAR provide detailed descriptions of crop features as well?
To answer this question, scientists from China have now developed a rail-based field phenotyping technique that uses LiDAR to quantify plant traits. The study, led by Professor Xinyu Guo from the National Engineering Research Center for Information Technology in Agriculture, China, was published in Plant Phenomics on 28 March 2023. Prof. Guo explains, “It is difficult to align the point cloud data and extract accurate phenotypic traits of plant populations. In this study, high-throughput, time-series raw data of field maize populations were collected using a field rail-based phenotyping platform with LiDAR and an RGB camera.”
The research team incorporated LiDAR into the design of the rail-based field phenotyping platform. To process the collected imagery, they used orthorectification, a process that converts raw field images into a usable form by removing sensor-, motion-, and terrain-related distortions. The rectified images were then processed algorithmically to accurately quantify various crop traits.
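To make the idea of orthorectification concrete, here is a minimal sketch, not the authors' pipeline: for a flat scene, perspective distortion can be removed by mapping pixel coordinates to ground-plane coordinates with a homography. The matrix `H` below is a hypothetical calibration; a real workflow would also correct lens distortion and terrain relief.

```python
import numpy as np

def ortho_project(pixels: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Map Nx2 pixel coordinates to Nx2 ground-plane coordinates via homography H."""
    homogeneous = np.hstack([pixels, np.ones((len(pixels), 1))])  # lift to homogeneous coords
    mapped = homogeneous @ H.T                                    # apply the 3x3 homography
    return mapped[:, :2] / mapped[:, 2:3]                         # divide out the scale

# Hypothetical homography with a small perspective term in the last row
H = np.array([[1.0, 0.0,   0.0],
              [0.0, 1.0,   0.0],
              [0.0, 0.001, 1.0]])
pts = np.array([[100.0, 200.0], [300.0, 400.0]])
ground = ortho_project(pts, H)
```

In practice such a matrix would be estimated from ground control points or the platform's known camera pose rather than written by hand.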
Next, the team used time-series-based high-throughput plant phenotyping to determine plant height in a maize field. This method entails studying field images captured at regular time intervals to perform non-destructive analysis of the desired plant traits (in this case, plant height).
“Alignment errors in time-series point cloud data were minimized by coupling field orthorectified images and point clouds. The proposed method integrates point cloud data acquisition, alignment, filtering, and segmentation algorithms,” adds Prof. Guo.
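The final step of such a pipeline, deriving plant height from a filtered, segmented point cloud, can be sketched as follows. This is an illustrative simplification under stated assumptions (a known ground level and a fixed 5 cm near-ground threshold), not the published algorithm; using a high percentile of the canopy z-values rather than the maximum makes the estimate robust to stray returns.

```python
import numpy as np

def plant_height(points: np.ndarray, ground_z: float = 0.0,
                 percentile: float = 99.0) -> float:
    """points: Nx3 array (x, y, z) for one plot; returns canopy height above ground."""
    z = points[:, 2]
    canopy = z[z > ground_z + 0.05]  # drop near-ground returns (assumed 5 cm threshold)
    return float(np.percentile(canopy, percentile) - ground_z)

# Synthetic cloud: ground noise near z=0 plus a plant reaching about 2 m
rng = np.random.default_rng(0)
ground_pts = rng.uniform([-1, -1, 0.0], [1, 1, 0.02], size=(500, 3))
plant_pts = rng.uniform([-0.2, -0.2, 0.1], [0.2, 0.2, 2.0], size=(500, 3))
cloud = np.vstack([ground_pts, plant_pts])
h = plant_height(cloud)  # close to 2 m for this synthetic plant
```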
The results were impressive and reassuring: the plant heights of 13 maize cultivars obtained using the aforementioned technique correlated strongly with manual measurements. In other words, plant height determined using the rail-based field phenotyping platform agreed with the height measured using established manual techniques. The team also noted that measurement accuracy increased when multi-source data replaced single-source data.
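Agreement of this kind is typically quantified with the Pearson correlation coefficient between the two sets of measurements. The numbers below are synthetic, not the study's data; they simply show how the check would be computed.

```python
import numpy as np

# Hypothetical paired measurements in metres (not the study's data)
manual = np.array([1.92, 2.05, 1.88, 2.10, 1.97, 2.01])  # tape-measured heights
lidar  = np.array([1.90, 2.07, 1.85, 2.12, 1.96, 2.03])  # platform-derived heights

# Pearson correlation: r close to 1 indicates strong agreement
r = np.corrcoef(manual, lidar)[0, 1]
```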
Although effective, the technique has a few drawbacks. For example, during image acquisition, leaf crossing, shading, and overlapping result in partial data loss. The team is working to resolve these issues.
“The method can also be used to compare growth rates between cultivars or estimate botanical traits, which are features of interest to crop modelers and breeders. Therefore, this research can provide data supporting modern breeding,” concludes Prof. Guo.
###
Reference
Authors
Yinglun Li (1,2), Weiliang Wen (1,2), Jiangchuan Fan (1,2), Wenbo Gou (1,2), Shenghao Gu (1,2), Xianju Lu (1,2), Zetao Yu (2), Xiaodong Wang (2), and Xinyu Guo (1,2)
Affiliations
1. Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China.
2. Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China.
JOURNAL
Plant Phenomics
METHOD OF RESEARCH
Computational simulation/modeling
SUBJECT OF RESEARCH
Not applicable
ARTICLE TITLE
Multi-Source Data Fusion Improves Time-Series Phenotype Accuracy in Maize under a Field High-Throughput Phenotyping Platform
ARTICLE PUBLICATION DATE
20-Apr-2023