Phenology

PhenoSR: Enhancing organ-level phenotyping with super-resolution RGB UAV imagery for large-scale field experiments

Abstract

Organ-level phenotyping is critical for crop breeding and precision farming because it provides information directly associated with yield and quality. Unmanned aerial vehicles (UAVs) are widely used in large-scale field experiments for their versatile image collection capabilities. However, RGB images captured at high altitudes often lack the resolution required for accurate organ-level phenotyping, because collection efficiency is prioritized. Deep learning-based image super-resolution (SR) methods can enhance image resolution, but they usually fail to address the challenge of obtaining paired low-resolution (LR) and high-resolution (HR) training data under field conditions. Moreover, the varying importance of different UAV image regions for organ-level phenotyping is often neglected, which slows reconstruction. To overcome these challenges, a degradation model and a multiscale scaling strategy were proposed to generate paired datasets. A semantic score was then introduced to quantify the importance of each image region for organ-level phenotyping. Finally, an SR algorithm (PhenoSR) based on a coarse-refined architecture was proposed to recover organ textures. PhenoSR recovered wheat organ textures in UAV images collected at flight heights of 10 to 40 m. Compared with LR images, the natural image quality evaluator (NIQE) and Fréchet inception distance (FID) metrics decreased by 71.37% and 21.53%, respectively, while hyperIQA improved by 39.36%. PhenoSR outperformed eight SR algorithms, achieving an average 12.31% reduction in FID and a 25.53% improvement in hyperIQA. Moreover, PhenoSR enhanced organ-level wheat phenotyping tasks such as plot segmentation, spike counting, flowering spike detection, and awn morphology identification, and can be extended to other crops and multispectral imagery.
This study presents an innovative and universal technology for enhancing organ-level phenotyping accuracy and efficiency with UAV platforms, thereby accelerating the identification and utilization of crop germplasm resources.
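The abstract does not detail the degradation model used to synthesize paired LR/HR data. A minimal sketch of the classical blur–downsample–noise degradation commonly used for such pairing is shown below; the function names and parameter values are illustrative assumptions, not taken from PhenoSR:

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Normalized 2D Gaussian blur kernel."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def degrade(hr, scale=4, sigma=1.0, noise_std=2.0, rng=None):
    """Synthesize an LR image from an HR one: blur -> downsample -> add noise."""
    rng = rng or np.random.default_rng(0)
    k = gaussian_kernel(sigma=sigma)
    pad = k.shape[0] // 2
    padded = np.pad(hr.astype(np.float64), pad, mode="reflect")
    blurred = np.empty(hr.shape, dtype=np.float64)
    for i in range(hr.shape[0]):          # plain convolution, kept simple
        for j in range(hr.shape[1]):
            blurred[i, j] = (padded[i:i + k.shape[0], j:j + k.shape[1]] * k).sum()
    lr = blurred[::scale, ::scale]        # strided downsampling by `scale`
    lr = lr + rng.normal(0.0, noise_std, lr.shape)
    return np.clip(lr, 0, 255).astype(np.uint8)
```

Applied to each HR training patch, this yields the (LR, HR) pairs that a supervised SR network can be trained on when true field-condition LR/HR pairs are unavailable.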

OSNet: an oriented instance segmentation network of breeding plot extraction from UAV RGB imagery

Abstract

Drones have enabled large-scale breeding and cultivation experiments. However, extracting individual breeding plots from aerial images is a key prerequisite, and an urgent need, for extracting variety-level traits. The main difficulties in plot extraction include the irregular rotation angles of the plots, ambiguous gaps both within and between plots, and variable color contrast between vegetation and background. To address these challenges, a novel oriented instance segmentation network (OSNet) is proposed, leveraging a global context transformer (GCT) and an oriented region proposal network (RPN). Performance was assessed on a well-labeled dataset of 960 plots covering 160 wheat varieties across two years. Results show that OSNet achieved an AP@0.5 of 0.917, F1-score of 0.959, Accuracy of 0.966, IoU of 0.912, Recall of 0.934, and Plot-a of 0.999. OSNet outperformed five state-of-the-art (SOTA) networks with average improvements of 3.08%, 1.42%, 1.19%, 1.70%, 1.79%, and 0.04% in AP@0.5, F1-score, Accuracy, IoU, Recall, and Plot-a, respectively. A sensitivity analysis showed that OSNet consistently maintained stable segmentation accuracy across different rotation angles and growth stages. Ablation analysis further showed that OSNet benefits from the oriented proposals and global information. Furthermore, OSNet can be transferred to new datasets spanning different years, crops, and data dimensions, supporting typical phenotyping tasks such as 2D wheat spike detection (r = 0.91) and 3D canopy height measurement (r = 0.89). This innovative methodology will be a fundamental tool for processing drone imagery, accelerating phenotypic trait extraction across varieties and thereby expediting the breeding process.
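An oriented proposal is conventionally parameterized as (cx, cy, w, h, θ); decoding it into polygon corners is the basic geometric step behind any oriented RPN. A minimal sketch of that decoding follows, as a generic illustration rather than OSNet's actual implementation:

```python
import math

def obb_corners(cx, cy, w, h, angle_deg):
    """Return the four corners of an oriented bounding box.

    The box is given by its center (cx, cy), width w, height h, and a
    counterclockwise rotation in degrees about the center.
    """
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    # Corner offsets of the axis-aligned box, before rotation.
    half = [(-w / 2, -h / 2), (w / 2, -h / 2), (w / 2, h / 2), (-w / 2, h / 2)]
    # Rotate each offset and translate back to the center.
    return [(cx + dx * cos_a - dy * sin_a,
             cy + dx * sin_a + dy * cos_a) for dx, dy in half]
```

With θ = 0 this reduces to an ordinary axis-aligned box, which is why the oriented formulation strictly generalizes the horizontal proposals used by standard RPNs.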

PhenoNet: A two-stage lightweight deep learning framework for real-time wheat phenophase classification

Abstract

The real-time monitoring of wheat phenology variations among different varieties, and of their adaptive responses to environmental conditions, is essential for advancing breeding efforts and improving cultivation management. Many remote sensing efforts have been made to address the challenges of key phenophase detection. However, existing solutions are not accurate enough to discriminate adjacent phenophases with subtle organ changes, and many are not real-time; vegetation index curve-based methods, for example, rely on data from the entire growth period and are only applicable after the experiment has finished. Improving the efficiency, scalability, and availability of phenological studies is therefore essential. This study proposes a two-stage deep learning framework, PhenoNet, for the accurate, efficient, and real-time classification of key wheat phenophases. PhenoNet comprises a lightweight encoder module (PhenoViT) and a long short-term memory (LSTM) module. The performance of PhenoNet was assessed on a well-labeled, multi-variety, large-volume dataset (WheatPheno). The results show that PhenoNet achieved an overall accuracy (OA) of 0.945, a kappa coefficient (Kappa) of 0.928, and an F1-score (F1) of 0.941. The network parameters (Params), number of multiply-add operations (MAdds), and graphics processing unit memory required for classification (Memory) were 0.889 million (M), 0.093 giga (G), and 8.0 megabytes (MB), respectively. PhenoNet outperformed eleven state-of-the-art deep learning networks, achieving average improvements of 3.7% in OA, 5.1% in Kappa, and 4.1% in F1, while reducing average Params, MAdds, and Memory by 78.4%, 85.0%, and 75.1%, respectively. Feature visualization and ablation analysis showed that PhenoNet mainly benefits from time-series information and its lightweight modules. Furthermore, PhenoNet can be effectively transferred across years, achieving a high OA of 0.981 with a two-stage transfer learning strategy.
Finally, an extensible web platform integrating WheatPheno and PhenoNet has been developed (https://phenonet.org/), ensuring that the work done in this study is accessible, interoperable, and reusable.
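Of the metrics reported above, the kappa coefficient is the one that corrects overall accuracy for chance agreement. A minimal sketch of computing Cohen's kappa from a confusion matrix, independent of PhenoNet's own evaluation code, could look like:

```python
import numpy as np

def cohen_kappa(cm):
    """Cohen's kappa from a square confusion matrix (rows: true, cols: predicted)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                       # observed agreement (= OA)
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
    return (po - pe) / (1 - pe)
```

Because kappa discounts agreement expected by chance, it is a stricter measure than overall accuracy when class frequencies are imbalanced across phenophases.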