Classification of maize leaf diseases with deep learning: Performance evaluation of the proposed model and use of explicable artificial intelligence


Alpsalaz F., Özüpak Y., Aslan E., Uzel H.

CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, vol.262, pp.1-14, 2025 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 262
  • Publication Date: 2025
  • DOI: 10.1016/j.chemolab.2025.105412
  • Journal Name: CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, Applied Science & Technology Source, Biotechnology Research Abstracts, Chemical Abstracts Core, Chimica, Computer & Applied Sciences, EMBASE, INSPEC
  • Page Numbers: 1-14
  • Open Archive Collection: AVESIS Open Access Collection
  • Yozgat Bozok University Affiliated: Yes

Abstract

Maize leaf diseases pose significant threats to global agricultural productivity, yet traditional diagnostic methods are slow, subjective, and resource-intensive. This study proposes a lightweight and interpretable convolutional neural network (CNN) model for accurate and efficient classification of maize leaf diseases. Using the ‘Corn or Maize Leaf Disease Dataset’, the model classifies four categories (Healthy, Gray Leaf Spot, Common Rust, and Northern Leaf Blight) with 94.97% accuracy and a micro-average AUC of 0.99. With only 1.22 million parameters, the model supports real-time inference on mobile devices, making it well suited to field applications. Data augmentation and transfer learning were applied to ensure robust generalization. To enhance transparency and user trust, Explainable Artificial Intelligence (XAI) methods, including LIME and SHAP, were employed to identify disease-relevant features such as lesions and pustules, with SHAP achieving an IoU of 0.82. The proposed model outperformed benchmark models such as ResNet50, MobileNetV2, and EfficientNetB0 in both accuracy and computational efficiency. Robustness tests under simulated environmental challenges confirmed its adaptability, with only a 2.82% performance drop under extreme conditions. Comparative analyses confirmed the statistical significance of the improvements and the model's practical advantages. The model offers a reliable, fast, and explainable solution for precision agriculture, especially in resource-constrained environments. Future work will include multi-angle imaging, multimodal inputs, and extended datasets to improve adaptability and scalability in real-world conditions.
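
The abstract does not describe the implementation, but the pipeline it summarizes can be illustrated with a short, hypothetical sketch. The Python example below outlines a small Keras-style four-class CNN with on-the-fly augmentation, plus a simple IoU check between a thresholded attribution map (such as one produced by SHAP) and a binary lesion mask. The function names (build_maize_cnn, attribution_iou), input size (224x224), layer widths, and the 90th-percentile threshold are assumptions chosen for illustration; they are not the authors' configuration, and the sketch is not tuned to match the paper's reported 1.22 million parameters.

# Hypothetical sketch: lightweight maize-leaf CNN and an IoU check for
# attribution maps. Architecture and thresholds are illustrative assumptions,
# not the configuration reported in the paper.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

CLASSES = ["Healthy", "Gray Leaf Spot", "Common Rust", "Northern Leaf Blight"]

def build_maize_cnn(input_shape=(224, 224, 3), num_classes=len(CLASSES)):
    """Small CNN for four-class leaf classification (illustrative only)."""
    # Light augmentation; these layers are active only during training.
    augment = tf.keras.Sequential([
        layers.RandomFlip("horizontal"),
        layers.RandomRotation(0.1),
        layers.RandomZoom(0.1),
    ], name="augmentation")

    model = tf.keras.Sequential([
        layers.Input(shape=input_shape),
        augment,
        layers.Rescaling(1.0 / 255),
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.3),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def attribution_iou(attr_map, lesion_mask, quantile=0.90):
    """IoU between the top-attribution region and a binary lesion mask.

    attr_map is a 2-D saliency/SHAP map; lesion_mask is a boolean array of the
    same shape. Thresholding at a fixed quantile is an assumed convention.
    """
    threshold = np.quantile(attr_map, quantile)
    predicted = attr_map >= threshold
    union = np.logical_or(predicted, lesion_mask).sum()
    if union == 0:
        return 0.0
    return float(np.logical_and(predicted, lesion_mask).sum() / union)

if __name__ == "__main__":
    model = build_maize_cnn()
    print(f"Trainable parameters: {model.count_params():,}")

Global average pooling followed by a small dense head keeps the parameter count low, which is the general design direction that makes real-time inference on mobile devices plausible; the published model's exact architecture and training setup are described in the full paper.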