Knowledge Distillation and Distribution Calibration for Few-Shot Object Detection
DOI: https://doi.org/10.5755/j01.itc.55.1.42926
Keywords: Object detection, few-shot, knowledge distillation, distribution calibration
Abstract
Few-shot object detection (FSOD) aims to recognize and localize novel categories from only a limited number of annotated samples. Existing transfer-learning-based approaches have attracted considerable attention for their structural simplicity and computational efficiency. However, merely fine-tuning pretrained model parameters is insufficient to capture inter-class and intra-class relationships, which limits the exploitation of transferable knowledge and hinders further performance gains. We therefore propose a prototype-guided semantic learning framework that incorporates knowledge distillation to explicitly model transferable knowledge among classes for both the classification and localization tasks. Specifically, a queue-based memory mechanism constructs dynamic class prototypes and distribution statistics in the feature space, enabling class relationships to be modeled. Classification knowledge is transferred via Kullback-Leibler divergence, while localization knowledge is transferred through regression reweighting. Furthermore, to alleviate the distribution bias caused by the scarcity of novel-class samples, an adaptive distribution calibration and augmentation strategy based on optimal transport is introduced to strengthen novel-class classification. Experimental results show that, compared with baseline methods, the proposed approach improves novel-class mAP by 6% and 5% under the 1-shot setting on the VOC and COCO datasets, respectively.
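To make the distillation mechanism concrete, the following is a minimal PyTorch-style sketch of a queue-based prototype memory, a KL-divergence classification transfer, and a similarity-based regression reweighting of the kind the abstract describes. All names (PrototypeQueue, classification_distill_loss, tau, and so on) and design details are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumes PyTorch). Hypothetical names throughout; the
# paper's exact architecture and hyperparameters are not reproduced here.
import torch
import torch.nn.functional as F

class PrototypeQueue:
    """FIFO memory of RoI features per class; exposes dynamic class
    prototypes (per-class feature means), in the spirit of the
    queue-based memory mechanism described in the abstract."""
    def __init__(self, num_classes, feat_dim, queue_len=64):
        self.queues = [torch.zeros(0, feat_dim) for _ in range(num_classes)]
        self.queue_len = queue_len

    def enqueue(self, feats, labels):
        # Append new features and discard the oldest beyond queue_len.
        for f, y in zip(feats.detach(), labels.tolist()):
            q = torch.cat([self.queues[y], f.unsqueeze(0)], dim=0)
            self.queues[y] = q[-self.queue_len:]

    def prototypes(self):
        # Dynamic prototype = mean of the queued features per class.
        return torch.stack([q.mean(dim=0) if q.shape[0] > 0
                            else torch.zeros(q.shape[1]) for q in self.queues])

def classification_distill_loss(student_logits, feats, prototypes, tau=4.0):
    """KL divergence between the detector's class posterior and a soft
    'teacher' distribution induced by feature-prototype similarity."""
    teacher_logits = feats @ prototypes.t()          # similarity to each class
    p_teacher = F.softmax(teacher_logits / tau, dim=1)
    log_p_student = F.log_softmax(student_logits / tau, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * tau ** 2

def reweighted_reg_loss(pred_deltas, target_deltas, feats, prototypes, labels):
    """Smooth-L1 box regression reweighted by each RoI's cosine similarity
    to its class prototype (one plausible reading of 'regression
    reweighting'; the paper's exact scheme may differ)."""
    sim = F.cosine_similarity(feats, prototypes[labels], dim=1).clamp(min=0)
    per_roi = F.smooth_l1_loss(pred_deltas, target_deltas,
                               reduction="none").sum(dim=1)
    return (sim * per_roi).sum() / sim.sum().clamp(min=1e-6)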
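The distribution calibration step can likewise be sketched with entropy-regularized optimal transport: base-class statistics are transported toward each novel class to correct its mean and variance, and extra features are sampled to augment classifier training. The sinkhorn solver, the mixing weight alpha, and the Gaussian sampling below are assumptions made for illustration only.

# Minimal sketch of OT-based distribution calibration (assumes PyTorch).
# sinkhorn(), alpha, eps, and n_iters are illustrative assumptions.
import torch

def sinkhorn(cost, eps=0.1, n_iters=50):
    """Entropy-regularized optimal transport with uniform marginals."""
    K = torch.exp(-cost / eps)
    u = torch.full((cost.shape[0],), 1.0 / cost.shape[0])
    v = torch.full((cost.shape[1],), 1.0 / cost.shape[1])
    a, b = torch.ones_like(u), torch.ones_like(v)
    for _ in range(n_iters):
        a = u / (K @ b)
        b = v / (K.t() @ a)
    return a.unsqueeze(1) * K * b.unsqueeze(0)   # transport plan

def calibrate_and_sample(novel_mean, base_means, base_vars,
                         alpha=0.5, n_samples=10):
    """Correct a novel class's statistics with base-class statistics
    weighted by the transport plan, then sample augmented features."""
    cost = torch.cdist(novel_mean.unsqueeze(0), base_means) ** 2   # 1 x B
    w = sinkhorn(cost).squeeze(0)                                  # B weights
    w = w / w.sum()
    calib_mean = alpha * novel_mean + (1 - alpha) * (w @ base_means)
    calib_var = w @ base_vars       # borrow spread from similar base classes
    noise = torch.randn(n_samples, novel_mean.shape[0])
    return calib_mean + noise * calib_var.sqrt()  # augmented novel features

In such a scheme, the sampled features would be mixed with the few real novel-class features when training the classification head, which is one way the calibration could enhance novel-class classification as the abstract states.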