Few-Shot Learning on Edge Devices Using CLIP: A Resource-Efficient Approach for Image Classification

Authors

  • Jin Lu, Guangdong Key Laboratory of Big Data Intelligence for Vocational Education, Shenzhen Polytechnic University, Shenzhen 518055, Guangdong, China

DOI:

https://doi.org/10.5755/j01.itc.53.3.36943

Keywords:

Few-shot learning, CLIP model, image classification, edge devices, deep learning

Abstract

In deep learning, traditional image classification typically requires extensive annotated datasets and complex model training, which poses significant challenges for deployment on resource-constrained edge devices. To address these challenges, this study introduces a few-shot learning method based on OpenAI's CLIP model that significantly reduces computational demands by eliminating the text encoder at the inference stage. By pre-computing the embedding centers of the classification text from a small set of image-text data, our approach classifies images directly with CLIP's image encoder and the pre-calculated text embeddings. This adaptation enables high-precision classification on edge devices with limited computing capabilities while achieving accuracy and recall rates that closely approximate those of a pre-trained ResNet with far less data. Furthermore, by avoiding the text encoder during inference, our method halves the memory usage compared to other large-scale visual models of similar capacity, making it particularly suitable for low-resource environments. These results demonstrate both competitive accuracy and practical viability in resource-limited settings, highlight the potential of the CLIP model in few-shot learning scenarios, and pave a new path for efficient, low-resource deep learning applications in edge computing environments.
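The inference pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the embeddings here are stand-in NumPy vectors rather than real CLIP encoder outputs, and the function names `compute_class_centers` and `classify` are illustrative assumptions. The key idea it demonstrates is that the text encoder runs only offline, so the device stores just one unit vector per class and performs a cosine-similarity lookup at inference time.

```python
import numpy as np

def l2_normalize(v, axis=-1):
    # Scale vectors to unit length so a dot product equals cosine similarity.
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v, axis=axis, keepdims=True)

def compute_class_centers(embeddings_per_class):
    # Offline step (run once, off-device): average the normalized
    # embeddings of each class's few-shot examples, then re-normalize,
    # yielding one pre-computed center vector per class.
    return l2_normalize(np.stack([
        l2_normalize(np.asarray(e)).mean(axis=0)
        for e in embeddings_per_class
    ]))

def classify(image_embedding, centers):
    # Online step (on-device): only the image encoder would run here;
    # the class centers are loaded from storage. The predicted class is
    # the center with the highest cosine similarity to the image.
    sims = centers @ l2_normalize(image_embedding)
    return int(np.argmax(sims))
```

Because the centers are fixed ahead of time, inference costs one image-encoder forward pass plus a single matrix-vector product, which is what makes the approach feasible on edge hardware.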

Published

2024-09-25

Issue

Section

Articles