ARTFEED — Contemporary Art Intelligence

AgriKD: Cross-Architecture Knowledge Distillation for Leaf Disease Classification

other · 2026-05-06

AgriKD is a cross-architecture knowledge distillation framework that compresses a Vision Transformer (ViT) teacher into a compact convolutional student for leaf disease classification on edge devices. ViTs offer strong representations but are computationally expensive for resource-limited field deployments; AgriKD addresses this by combining distillation objectives at the output, feature, and relational levels to bridge the representational gap between Transformer and CNN architectures. The paper is available on arXiv under ID 2605.01355.

Key facts

  • AgriKD is a cross-architecture knowledge distillation framework.
  • It transfers knowledge from a Vision Transformer (ViT) teacher to a compact convolutional student model.
  • The framework targets efficient edge deployment for leaf disease classification.
  • Multiple distillation objectives are used at output, feature, and relational levels.
  • The approach bridges the representational gap between Transformer and CNN architectures.
  • Vision Transformers provide strong representation but have high computational cost.
  • The paper is available on arXiv with ID 2605.01355.
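The three distillation levels named above can be illustrated with a minimal sketch. The paper's exact loss formulations and weightings are not given here, so the functions below are common stand-ins, not AgriKD's actual objectives: temperature-softened KL divergence for the output level, a projected mean-squared error for the feature level (the projection matrix maps the CNN student's feature dimension into the ViT teacher's), and a pairwise cosine-similarity match for the relational level. All names and hyperparameters are illustrative assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Numerically stable temperature-softened softmax.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def output_kd_loss(student_logits, teacher_logits, temperature=4.0):
    # Output-level KD: KL(teacher || student) on softened class
    # distributions, scaled by T^2 as in standard logit distillation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * temperature ** 2)

def feature_kd_loss(student_feat, teacher_feat, projection):
    # Feature-level KD: project CNN features into the ViT's embedding
    # dimension, then penalize the mean-squared difference.
    return float(((student_feat @ projection - teacher_feat) ** 2).mean())

def relational_kd_loss(student_feat, teacher_feat):
    # Relational KD: match the pairwise cosine-similarity structure
    # of the batch between student and teacher feature spaces.
    def sim(f):
        f = f / (np.linalg.norm(f, axis=1, keepdims=True) + 1e-12)
        return f @ f.T
    return float(((sim(student_feat) - sim(teacher_feat)) ** 2).mean())

def total_kd_loss(s_logits, t_logits, s_feat, t_feat, projection,
                  w_out=1.0, w_feat=0.5, w_rel=0.5):
    # Hypothetical weighted combination of the three objectives.
    return (w_out * output_kd_loss(s_logits, t_logits)
            + w_feat * feature_kd_loss(s_feat, t_feat, projection)
            + w_rel * relational_kd_loss(s_feat, t_feat))
```

The projection step is the key cross-architecture ingredient: because the student and teacher produce features of different shapes and statistics, a learned alignment layer is typically needed before any direct feature comparison is meaningful.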

Entities

Institutions

  • arXiv
