Knowledge Distillation and Distribution Calibration for Few-Shot Object Detection

Authors

  • Qinghua Yang China University of Mining and Technology-Beijing
  • Yan Tian China University of Mining and Technology-Beijing
  • Tingting Xu China University of Mining and Technology
  • Zehua Wang The University of British Columbia
  • Jing Sun Beijing Polytechnic College
  • Fangyuan He Beijing Union University

DOI:

https://doi.org/10.5755/j01.itc.55.1.42926

Keywords:

Object detection, few-shot, knowledge distillation, distribution calibration

Abstract

Few-shot object detection (FSOD) aims to recognize and localize novel categories using only a limited number of annotated samples. Existing transfer learning–based approaches have attracted considerable attention for their structural simplicity and computational efficiency. However, merely fine-tuning the pretrained model parameters is insufficient to capture inter-class and intra-class relationships, thereby limiting the exploitation of transferable knowledge and further performance improvement. Therefore, we propose a prototype-guided semantic learning framework. By incorporating knowledge distillation, the method explicitly models transferable knowledge among classes for both classification and localization tasks. Specifically, a queue-based memory mechanism constructs dynamic class prototypes and distribution statistics in the feature space, enabling the modeling of class relationships. Classification knowledge transfer is achieved via Kullback-Leibler divergence, while localization knowledge transfer is guided through regression reweighting. Furthermore, to alleviate distribution bias from the scarcity of novel class samples, an adaptive distribution correction and augmentation strategy based on optimal transport is introduced to enhance novel class classification. Experimental results demonstrate that, compared with baseline methods, the proposed approach achieves 6% and 5% improvements in novel class mAP under the 1-shot setting on the VOC and COCO datasets, respectively. 
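The abstract describes a queue-based memory that maintains dynamic class prototypes and a Kullback-Leibler distillation loss that transfers inter-class structure to the classifier. The sketch below illustrates that general idea only; all names (`PrototypeMemory`, `kl_distillation_loss`), the cosine-similarity soft targets, and the queue capacity are hypothetical choices, not the paper's actual implementation.

```python
from collections import deque
import numpy as np

def softmax(x, t=1.0):
    # Temperature-scaled, numerically stable softmax.
    z = np.exp((np.asarray(x, dtype=float) - np.max(x)) / t)
    return z / z.sum()

class PrototypeMemory:
    """Queue-based memory: keeps the most recent features per class
    and derives a running class prototype (mean feature)."""
    def __init__(self, num_classes, dim, capacity=64):
        self.queues = [deque(maxlen=capacity) for _ in range(num_classes)]
        self.dim = dim

    def update(self, feature, label):
        self.queues[label].append(np.asarray(feature, dtype=float))

    def prototypes(self):
        # Empty queues fall back to a zero prototype.
        return np.stack([
            np.mean(list(q), axis=0) if q else np.zeros(self.dim)
            for q in self.queues
        ])

def kl_distillation_loss(student_logits, feature, memory, temperature=2.0):
    """Build a soft target from cosine similarity between the sample feature
    and the class prototypes, then compute KL(teacher || student); the soft
    target encodes inter-class relationships the hard label discards."""
    protos = memory.prototypes()
    norms = np.linalg.norm(protos, axis=1) * np.linalg.norm(feature) + 1e-8
    sims = protos @ np.asarray(feature, dtype=float) / norms
    teacher = softmax(sims, temperature)
    student = softmax(student_logits, temperature)
    return float(np.sum(teacher * (np.log(teacher + 1e-8)
                                   - np.log(student + 1e-8))))
```

A student whose logits already mirror the prototype similarities incurs near-zero loss, while a student that ranks the classes differently is penalized, which is the intended knowledge-transfer effect.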

Published

2026-04-03

Issue

Section

Articles