SenNet: A dual-branch image semantic segmentation network for wheat senescence evaluation and high-yielding variety screening

Authors: Jiaqi Yao, Shichao Jin, Jingrong Zang, Ruinan Zhang, Yu Wang, Yanjun Su, Qinghua Guo, Yanfeng Ding, Dong Jiang

Journal: Computers and Electronics in Agriculture

Impact Factor: 7.7

Year: 2025

Type: Journal Paper

Tags: Deep Learning, Computer Vision, Agriculture, Phenotyping

Abstract

Wheat is one of the three primary staple crops globally, and the senescence of its leaves has a direct effect on yield. However, conventional senescence evaluation is mainly based on visual scoring, which is subjective, time-consuming, and hampers investigation of the mechanisms linking the senescence process to yield formation. High-throughput image-based plant phenotyping techniques offer a promising alternative. However, extracting senescence-related semantic information from images presents challenges, including blurred edge segmentation, inadequate characterization of senescence features, and interference from complex field environments. Therefore, this study proposes a dual-branch image senescence segmentation model (SenNet), which integrates edge priors and local–global attention mechanisms, including local–global hierarchical attention, gated convolution, and positional encoding modules. First, a wheat senescence dynamics image dataset (19,530 images) was constructed, comprising 509 wheat varieties from a two-year, two-replicate field experiment. Then, the SenNet model achieved senescence image segmentation for various wheat varieties, enabling senescence dynamics analysis and high-yielding variety screening. The results showed that: 1) The mean Intersection over Union (mIoU) of the SenNet model was 95.41 %, a 4.01 % improvement over the average mIoU of seven state-of-the-art models. 2) The contributions of the local–global hierarchical attention mechanism, gated convolution, and positional encoding module to the accuracy improvement of SenNet were 3.15 %, 1.62 %, and 1.03 %, respectively. 3) SenNet can be transferred across years and locations: across locations it achieved an mIoU of 96.01 %, and the model trained on 2023 data transferred to 2022 and 2024 with mIoU accuracies of 93.75 % and 93.27 %, respectively.
4) High-yielding varieties typically exhibit a later onset of senescence and faster senescence in later stages. Based on these senescence patterns, this study further constructed new dynamic senescence traits (e.g., AreaUnderCurve). Leveraging random forest-based yield prediction (R² = 0.68) from the dynamic traits, high-yielding varieties were screened with an average precision, recall, F1 score, and accuracy of 81 %, 79 %, 80 %, and 87 %, respectively. This study provides an efficient method for monitoring senescence dynamics and predicting yield, offering new insights into the screening of high-yielding varieties.
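The headline segmentation metric above is mean Intersection over Union. As a minimal illustration (not the paper's implementation), mIoU averages the per-class ratio of overlapping to union pixels between a predicted and a reference mask; the three-class labeling below (background, green canopy, senescent canopy) is an assumption for the sketch.

```python
def mean_iou(pred, target, num_classes):
    """Mean Intersection over Union across classes.

    pred, target: flat lists of integer class labels, one per pixel.
    Classes absent from both masks are skipped so they do not
    artificially inflate or deflate the average.
    """
    ious = []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        if union:
            ious.append(inter / union)
    return sum(ious) / len(ious)

# Toy 8-pixel masks: 0 = background, 1 = green canopy, 2 = senescent canopy
pred   = [0, 0, 1, 1, 2, 2, 2, 0]
target = [0, 0, 1, 1, 2, 2, 1, 0]
print(mean_iou(pred, target, 3))  # (1 + 2/3 + 2/3) / 3 ≈ 0.778
```

Real evaluations run this over full-resolution masks (typically with array libraries rather than Python loops), but the per-class intersection/union logic is the same.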

Contributions

  • To design a wheat senescence segmentation model by incorporating an edge-prior branch and a local–global attention mechanism.
  • To verify the model’s accuracy and transferability across data from different years (2023 and 2024) and locations.
  • To uncover the senescence patterns of high-yielding varieties and enable high-yielding variety screening. These technical and scientific innovations in senescence studies provide essential theoretical and technical support for understanding the physiological mechanisms of yield formation and screening high-yielding wheat varieties.
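The dynamic traits mentioned above include an AreaUnderCurve summary of the senescence time series. As a hedged sketch (the paper does not specify its exact formulation), the trait can be computed with the trapezoidal rule over per-date senescent-canopy fractions extracted from the segmentation masks; the sampling dates and fractions below are hypothetical.

```python
def area_under_curve(days, senescent_fraction):
    """Trapezoidal area under a senescence-dynamics curve.

    days: sampling dates (e.g., days after sowing), increasing.
    senescent_fraction: fraction of senescent canopy at each date,
    as derived from segmented images.
    """
    auc = 0.0
    for i in range(1, len(days)):
        auc += (senescent_fraction[i - 1] + senescent_fraction[i]) / 2 \
               * (days[i] - days[i - 1])
    return auc

# Hypothetical series for one variety: later onset, fast late-stage senescence
days = [0, 10, 20, 30]
frac = [0.0, 0.05, 0.3, 0.9]
print(area_under_curve(days, frac))  # 0.25 + 1.75 + 6.0 = 8.0
```

A per-variety trait table built this way (AUC, onset date, late-stage rate, and similar) is the kind of input a random forest regressor can take for the yield prediction step described in the abstract.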

Citation

@article{yao2025sennet,
  title={SenNet: A dual-branch image semantic segmentation network for wheat senescence evaluation and high-yielding variety screening},
  author={Yao, Jiaqi and Jin, Shichao and Zang, Jingrong and Zhang, Ruinan and Wang, Yu and Su, Yanjun and Guo, Qinghua and Ding, Yanfeng and Jiang, Dong},
  journal={Computers and Electronics in Agriculture},
  volume={237},
  pages={110632},
  year={2025},
  publisher={Elsevier}
}