
Knowledge Distillation Framework for Student Misconception Classification

other · 2026-05-16

A novel two-stage knowledge distillation framework tackles the core obstacles in identifying student misconceptions: scarce data, long-tail class distributions, ambiguous error boundaries, and noisy annotations. Rather than synthesizing data at scale, the method mines high-value samples from the data already available. The first stage performs conventional distillation to transfer task capability from teacher to student. The second stage introduces a dual-layer marginal selection mechanism that uses cognitive uncertainty to identify four critical sample types, drawing on the teacher model's uncertainty and variations in its confidence. This design sidesteps the deployment paradox in which large models inherit pretraining biases and are too heavy for edge devices, while small models tend to overfit annotation noise.
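
The summary does not spell out the selection criteria, so the sketch below is one plausible reading: cognitive uncertainty approximated by the teacher's predictive entropy, and confidence variation by the gap between its top two class probabilities, crossed into a 2x2 partition yielding four sample types. The function name, thresholds, and category labels are illustrative assumptions, not details from the paper.

```python
import numpy as np

def select_samples(teacher_probs, tau_u=0.5, tau_m=0.2):
    """Partition samples into four types from teacher uncertainty and margin.

    teacher_probs: (N, C) array of teacher softmax outputs.
    tau_u: entropy threshold splitting low/high uncertainty (assumed).
    tau_m: top-2 margin threshold splitting small/large confidence gap (assumed).
    """
    # Layer 1: cognitive uncertainty, approximated by normalized entropy.
    entropy = -np.sum(teacher_probs * np.log(teacher_probs + 1e-12), axis=1)
    entropy /= np.log(teacher_probs.shape[1])  # scale to [0, 1]

    # Layer 2: confidence variation, approximated by the top-2 probability gap.
    top2 = np.sort(teacher_probs, axis=1)[:, -2:]
    margin = top2[:, 1] - top2[:, 0]

    high_u, high_m = entropy > tau_u, margin > tau_m
    return {
        "confident":  np.where(~high_u & high_m)[0],   # easy, reliable labels
        "ambiguous":  np.where(high_u & ~high_m)[0],   # near error boundaries
        "conflicted": np.where(~high_u & ~high_m)[0],  # low entropy, tiny margin
        "uncertain":  np.where(high_u & high_m)[0],    # possible label noise
    }
```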

Key facts

  • arXiv:2605.14752
  • Two-stage knowledge distillation framework (a stage-one loss sketch follows this list)
  • Addresses data scarcity and long-tail distribution
  • Dual-layer marginal selection mechanism
  • Cognitive uncertainty guides sample selection
  • Avoids large-scale data synthesis
  • Deployment paradox between large and small models
  • Focuses on authentic student reasoning
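
For the first stage, a minimal sketch of conventional distillation, assuming the standard temperature-scaled formulation (Hinton et al., 2015): a KL term against the teacher's softened outputs plus cross-entropy on ground-truth labels. The function name and the values of T and alpha are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

def stage1_distill_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soften both distributions with temperature T; the T**2 factor keeps
    # gradient magnitudes comparable across temperatures.
    soft_kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    # Supervised term on the hard labels keeps the student anchored to the task.
    hard_ce = F.cross_entropy(student_logits, labels)
    return alpha * soft_kl + (1 - alpha) * hard_ce
```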
