Article

3CROSSNet: Cross-Level Cross-Scale Cross-Attention Network for Point Cloud Representation

Journal

IEEE ROBOTICS AND AUTOMATION LETTERS
Volume 7, Issue 2, Pages 3718-3725

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/LRA.2022.3147907

Keywords

Point cloud compression; Three-dimensional displays; Feature extraction; Representation learning; Correlation; Task analysis; Deep learning; Classification; point cloud; segmentation; self attention

Funding

  1. National Natural Science Foundation of China [62002299]
  2. Natural Science Foundation of Chongqing, China [cstc2020jcyj-msxmX0126]
  3. Fundamental Research Funds for the Central Universities [SWU120005]

Abstract

The self-attention mechanism has recently achieved impressive advances in Natural Language Processing (NLP) and image processing. Its permutation-invariance property makes it ideally suited to point cloud processing. Inspired by this remarkable success, we propose an end-to-end architecture, dubbed Cross-Level Cross-Scale Cross-Attention Network (3CROSSNet), for point cloud representation learning. First, a point-wise feature pyramid module hierarchically extracts features at different scales and resolutions. Then, a cross-level cross-attention module models long-range inter-level and intra-level dependencies. Finally, a cross-scale cross-attention module captures interactions between and within scales to enhance the representation. Comprehensive experimental evaluation shows that our network obtains competitive performance against state-of-the-art approaches on challenging 3D object classification and point cloud segmentation tasks. The source code and trained models are available online.(1)
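The paper's specific modules are not detailed in this abstract, but all three build on scaled dot-product cross-attention, where one set of point features (queries) attends to another set (keys/values), e.g. one pyramid level attending to another. A minimal, hypothetical sketch (names and shapes are illustrative assumptions, not the authors' implementation) might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values):
    """Scaled dot-product cross-attention (illustrative sketch).

    `queries`: (N, d) point features of one level/scale.
    `keys_values`: (M, d) point features of another level/scale,
    used here as both keys and values for simplicity (a real module
    would typically apply learned Q/K/V projections).
    Returns (N, d): each query feature re-expressed as a weighted
    mixture of the key/value features.
    """
    d = queries.shape[-1]
    scores = queries @ keys_values.T / np.sqrt(d)  # (N, M) similarities
    weights = softmax(scores, axis=-1)             # each row sums to 1
    return weights @ keys_values                   # (N, d) attended output

rng = np.random.default_rng(0)
level_a = rng.standard_normal((128, 64))  # fine level: 128 points, 64-dim
level_b = rng.standard_normal((32, 64))   # coarse level: 32 points, 64-dim
out = cross_attention(level_a, level_b)
print(out.shape)  # (128, 64)
```

Note that the output is unchanged if the rows of `keys_values` are permuted, since attention treats the keys as an unordered set; this is the permutation-invariance property the abstract cites as making self-attention a natural fit for point clouds.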

