Tuesday, December 16, 2025

Dual-UAV system boosts color accuracy in crop remote sensing


Nanjing Agricultural University, The Academy of Science

Tests showed that the dual-UAV color correction method improved color fidelity by over 70%, aligning image colors with ground-truth measurements and raising the accuracy of rice maturity prediction from R² = 0.28 to R² = 0.67. The technology offers a scalable pathway toward reliable high-throughput phenotyping and breeding applications in precision agriculture.

Plant color reflects physiological status, nutrient levels, and biotic or abiotic stress, making it a key trait in breeding and field management. Traditional visual inspection is subjective and inefficient, while satellite imagery lacks resolution for fine phenotyping. UAV-based remote sensing fills this gap, but outdoor imaging suffers from inconsistent illumination, camera variability, and flight-to-flight differences. Past solutions—image feature transfer, reference image selection, deep learning enhancement—improved color consistency to some extent, but accuracy was limited and performance depended strongly on reference quality. ColorChecker chart correction is widely used in photography and printing, yet covering each UAV view is impractical in field missions. These limitations highlighted the need for a real-time, mobile, and scalable color standardization method for agricultural remote sensing.
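ColorChecker-based correction typically works by fitting a transform that maps the chart's measured patch colors onto their known reference values, then applying that transform to every pixel. As a hypothetical illustration only (the patch values below are invented, and the paper's actual pipeline may differ), a linear 3×3 correction matrix can be fit by least squares:

```python
import numpy as np

# Measured ColorChecker patch colors from a UAV image (rows = patches,
# columns = R, G, B in [0, 1]). Values are illustrative, not from the study.
captured = np.array([
    [0.42, 0.31, 0.25],
    [0.70, 0.58, 0.50],
    [0.28, 0.35, 0.52],
    [0.30, 0.42, 0.26],
])
# Known reference values of the same patches.
reference = np.array([
    [0.45, 0.32, 0.27],
    [0.76, 0.59, 0.51],
    [0.26, 0.34, 0.56],
    [0.34, 0.44, 0.26],
])

# Solve captured @ M ≈ reference for a 3x3 correction matrix M.
M, *_ = np.linalg.lstsq(captured, reference, rcond=None)

def correct(image):
    # Apply the correction to any (..., 3) RGB array and clip to valid range.
    return np.clip(image @ M, 0.0, 1.0)

corrected = correct(captured)  # patch colors move closer to the reference
```

Because the matrix is estimated once per chart observation, the same transform can then be applied to whole field images captured under the same conditions.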

A study (DOI: 10.1016/j.plaphe.2025.100101) published in Plant Phenomics on 5 September 2025 by Haiyan Cen's team at Zhejiang University demonstrates a reliable dual-UAV color correction method that significantly enhances the accuracy and consistency of field-scale RGB imagery, enabling more precise crop phenotyping and agricultural decision-making.

An evaluation was performed to verify the performance of the CoF-CC color correction method, beginning with tests across six different camera models in which images containing both the ColorChecker chart and rice fields were captured and corrected using a color correction matrix. The method reduced brightness and color discrepancies among the original images, lowering the ΔE values of ColorChecker patches from 5.8–18.0 to 3.4–5.0, a 66.1% improvement in color accuracy. Color distributions plotted in 3D LAB space formed tighter clusters after correction, with the intracluster distance decreasing from 13.2 to 3.9, a 70.2% gain in color consistency.

Further accuracy assessment using large-scale field images confirmed these outcomes: ΔE values across variable image sets dropped from 8.0–25.9 to 2.3–8.2, a 73.6% improvement. When artificially color-cast images were corrected, the distortions were reduced to an average ΔE of 7.1, and comparison with ground-truth measurements at 30 sampling points showed the mean ΔE falling from 18.2 to 5.0, a 72.7% improvement. The corrected images also made ripening differences easier to visualize.

To evaluate segmentation effectiveness, ground-truth leaf colors were mapped onto both the original and corrected images. Segmentation of the original images matched actual leaf shapes poorly, whereas the corrected images aligned well with manual labeling, increasing the overlap from 15.1% to 64.3% and demonstrating markedly better target identification. Finally, the corrected orthomosaic allowed canopy color values to be extracted for each plot, effectively differentiating breeding materials: while colors from the original images correlated weakly with rice maturity (R² = 0.28), CoF-CC–derived colors achieved a strong linear relationship (R² = 0.67), confirming the method's value for phenotyping and maturity prediction.
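The ΔE values reported above are color differences in CIELAB space; the classic ΔE*ab (CIE76) variant is simply the Euclidean distance between two LAB coordinates. A minimal sketch, with illustrative LAB values not taken from the study:

```python
import numpy as np

def delta_e_cie76(lab1, lab2):
    """ΔE*ab (CIE76): Euclidean distance between two CIELAB colors."""
    return float(np.linalg.norm(np.asarray(lab1, float) - np.asarray(lab2, float)))

ground_truth = [52.0, -18.0, 30.0]  # L*, a*, b* of a leaf measured on the ground
uncorrected  = [60.0,  -4.0, 20.0]  # same leaf in a raw UAV image (hypothetical)
corrected    = [53.0, -16.5, 29.0]  # after chart-based correction (hypothetical)

print(delta_e_cie76(ground_truth, uncorrected))  # ≈ 19.0
print(delta_e_cie76(ground_truth, corrected))    # ≈ 2.1
```

A ΔE around 2 is generally considered near the threshold of what a trained observer can distinguish, which is why the drop from 18.2 to 5.0 reported in the study represents a substantial practical gain.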

Accurate field-scale color measurement plays a vital role in agricultural decision-making, supporting genotype evaluation in breeding nurseries, crop stress and nutrient assessment, harvest timing estimation, digital phenotyping, and large-scale monitoring for precision agriculture. By leveraging the relationship between canopy color, chlorophyll content, and maturity, the CoF-CC pipeline reliably extracts color-based signals that are often masked by illumination variation. It marks an important step toward standardized RGB datasets, improved data sharing across experiments, and scalable high-throughput phenotyping and automation in modern agriculture.

###

References

DOI

10.1016/j.plaphe.2025.100101

Original Source URL

https://doi.org/10.1016/j.plaphe.2025.100101

Funding information

This work was funded by the International S&T Cooperation Program of China (2024YFE0115000), the National Key R&D Program of China (2021YFD2000104), the National Natural Science Foundation of China (32371985), and the Fundamental Research Funds for Central Universities (226-2022-00217).

About Plant Phenomics

Plant Phenomics is dedicated to publishing novel research that advances all aspects of plant phenotyping, from the cell to the plant-population level, using innovative combinations of sensor systems and data analytics. The journal also aims to connect phenomics to other science domains, such as genomics, genetics, physiology, molecular biology, bioinformatics, statistics, mathematics, and computer science. Plant Phenomics thus contributes to advancing plant sciences and agriculture, forestry, and horticulture by addressing key scientific challenges in plant phenomics.
