Journal
COMPUTERS & GRAPHICS-UK
Volume 115, Pages 69-80
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.cag.2023.07.004
Keywords
Graph Neural Networks; Graph Attention Networks; Relief pattern classification; Features learning
This paper proposes a novel approach based on Graph Neural Networks for 3D mesh relief pattern classification. The approach captures local and global features of the relief patterns by constructing local and global mesh structures. Experimental results on SHREC'17 and SHREC'20 relief patterns track datasets demonstrate the superior performance of the proposed approach.
Relief patterns represent a surface characteristic that can be seen as the 3D counterpart of the texture concept in 2D images. This characteristic is quite distinct from the 3D object shape, yet it is a rich source of information for recognizing the object itself. The majority of state-of-the-art techniques for 2D images rely on convolution-based filtering, so the idea of extending such techniques to the mesh manifold domain is as intriguing as it is challenging. In this paper, we propose a novel approach based on Graph Neural Networks for 3D mesh relief pattern classification. To this end, we designed a bi-level architecture that learns on data structures computed with a mesh resampling algorithm, which allows us to represent local surface patches uniformly while keeping a consistent point order. The local mesh structures are represented by SpiderPatches, which aim to capture local features of the 3D mesh surface, providing a fine-grained, rich representation of the relief patterns; global structures are instead captured by MeshGraphs, whose nodes are SpiderPatches, representing the mesh at a macroscopic level. We tested our architecture using SpiderPatches and MeshGraphs on the original meshes of the SHREC'17 and SHREC'20 relief patterns track datasets, showing superior performance to that reported in the literature under a comparable experimental setting. © 2023 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
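The bi-level structure described above (local SpiderPatch graphs nested as nodes of a global MeshGraph) can be illustrated with a minimal sketch. All class and field names here are illustrative assumptions, not the authors' actual code or API:

```python
from dataclasses import dataclass, field

@dataclass
class SpiderPatch:
    """Hypothetical local graph over one resampled surface patch:
    nodes are the uniformly resampled points (one feature vector each),
    edges connect points following the consistent sampling order."""
    vertex_features: list  # per-point descriptors, e.g. curvature values
    edges: list            # (i, j) index pairs within the patch

@dataclass
class MeshGraph:
    """Hypothetical global graph whose nodes are SpiderPatches,
    representing the mesh at a macroscopic level."""
    patches: list = field(default_factory=list)  # SpiderPatch nodes
    edges: list = field(default_factory=list)    # links between patches

    def add_patch(self, patch: SpiderPatch) -> int:
        """Insert a SpiderPatch node and return its node index."""
        self.patches.append(patch)
        return len(self.patches) - 1

# Toy example: a MeshGraph with two single-edge patches linked together.
g = MeshGraph()
a = g.add_patch(SpiderPatch(vertex_features=[[0.1], [0.2]], edges=[(0, 1)]))
b = g.add_patch(SpiderPatch(vertex_features=[[0.3], [0.4]], edges=[(0, 1)]))
g.edges.append((a, b))
print(len(g.patches), g.edges)  # 2 [(0, 1)]
```

In this reading, a GNN would first aggregate features within each SpiderPatch (fine-grained relief detail) and then message-pass over the MeshGraph (macroscopic context); the actual feature descriptors and connectivity rules are those defined in the paper.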