Six-Degree-of-Freedom Pose Estimation of Class-Level Objects Based on P2T-Net
DOI:
https://doi.org/10.5755/j01.itc.53.3.36121

Keywords:
Transformer, pyramid pool, efficient self-attention mechanism, cross-modal fusion, 6DoF pose estimation

Abstract
6D pose estimation of objects is widely used in augmented reality, robot manipulation, and autonomous driving. Because real application scenarios are complex and variable, the task must cope with interference such as illumination changes, distance changes, sensor noise, and mutual occlusion among cluttered objects. Implementing methods that combine low hardware cost with high accuracy and low time cost in such scenarios remains difficult: recognizing the class of an object, locating its region in the image, and estimating its 6D pose are still challenging problems. In this paper, we propose a conceptually simple and data-efficient category-level 6 Degree-of-Freedom pose estimation network that uses the Pyramid Pooling Transformer as its backbone to improve accuracy in image classification, semantic segmentation, object detection, and instance segmentation under low-hardware-cost conditions. In the cross-modal fusion stage, an implicit depth recovery technique improves the RGB-D feature representation capability, and a compact pyramid refinement operation fuses multi-level features efficiently, with high speed and few parameters. Compared with traditional methods, the proposed method is more robust to occlusion: its mAP at 10° 2cm and 10° 5cm reaches 81.4% and 87.1%, and its mAP at 5° 2cm and 5° 5cm reaches 69.2% and 72.9%, ahead of NOCS and SPD in comparison tests on the public CAMERA and REAL datasets. The advantage is especially clear in settings where large hardware and large databases are not feasible.
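The n° m cm accuracy figures quoted above count a predicted pose as correct when its rotation error is within n degrees and its translation error is within m centimeters of the ground truth. The sketch below illustrates that per-threshold accuracy with NumPy; it is a minimal, simplified version (the full NOCS-style mAP protocol additionally averages over categories and threshold sweeps), and all function names here are illustrative, not from the paper.

```python
import numpy as np

def rotation_error_deg(R_pred, R_gt):
    """Geodesic angle (degrees) between two 3x3 rotation matrices."""
    cos_theta = (np.trace(R_gt.T @ R_pred) - 1.0) / 2.0
    cos_theta = np.clip(cos_theta, -1.0, 1.0)  # guard against numerical drift
    return np.degrees(np.arccos(cos_theta))

def pose_correct(R_pred, t_pred, R_gt, t_gt, deg_thresh, cm_thresh):
    """True if the pose is within both thresholds (translations in cm)."""
    rot_ok = rotation_error_deg(R_pred, R_gt) <= deg_thresh
    trans_ok = np.linalg.norm(t_pred - t_gt) <= cm_thresh
    return bool(rot_ok and trans_ok)

def pose_accuracy(preds, gts, deg_thresh, cm_thresh):
    """Fraction of predicted poses within the (deg, cm) thresholds."""
    hits = [pose_correct(Rp, tp, Rg, tg, deg_thresh, cm_thresh)
            for (Rp, tp), (Rg, tg) in zip(preds, gts)]
    return sum(hits) / len(hits)
```

For example, a prediction rotated 4° about the z-axis and shifted 1 cm counts as correct under the 5° 2cm threshold but not under 3° 2cm.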
License
Copyright terms are indicated in the Republic of Lithuania Law on Copyright and Related Rights, Articles 4-37.