Chen Zhuhan, Huang Hui. Global Structure Guided Parametric Primitive Detection of Man-Made Objects[J]. Journal of Computer-Aided Design & Computer Graphics, 2023, 35(11): 1769-1779. DOI: 10.3724/SP.J.1089.2023.19754

Global Structure Guided Parametric Primitive Detection of Man-Made Objects

  • Point clouds are a common representation of 3D data. Extracting geometric primitives from point clouds helps people quickly understand and process scene information, and also facilitates subsequent tasks. To better exploit the global structural relationships prevalent in man-made objects, this paper proposes RelationNet, which strengthens their positive guidance during primitive detection. RelationNet consists of two sub-modules. First, to better encode the structural relationship between 3D points and their primitives, RelationNet includes a spatial offset prediction module, which predicts the offset vector from each 3D point to the center of the primitive it belongs to. This location awareness of primitives also provides more effective features for the subsequent segmentation task. Second, the primitives of man-made objects often exhibit structural relationships such as parallelism, perpendicularity, and axis alignment. To exploit these relationships to improve geometric primitive detection, RelationNet also includes a global structural relationship extraction module, which uses the parameters obtained from primitive fitting to determine the structural relationships between primitives and sets corresponding loss functions to supervise the extracted results. The method is evaluated on the large-scale ABC dataset and compared with mainstream methods such as SPFN (supervised primitive fitting network) and ParseNet. The experimental results show that RelationNet reaches an mIoU of 85.32% in primitive segmentation and 90.10% in primitive classification, significantly outperforming state-of-the-art methods.
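The two supervision signals described in the abstract can be sketched in a few lines. The following is an illustrative NumPy sketch under assumptions, not the paper's implementation: all function and variable names are hypothetical, and the exact loss forms used by RelationNet may differ.

```python
# Illustrative sketch (NOT the paper's code) of the two supervision signals:
# (1) an offset loss from each 3D point to the center of its primitive, and
# (2) a structural-relationship loss on fitted primitive axes.
import numpy as np

def offset_loss(points, pred_offsets, labels, centers):
    """Mean L2 error between predicted offsets and the ground-truth vectors
    from each point to the center of the primitive it belongs to.

    points:       (N, 3) input 3D points
    pred_offsets: (N, 3) network-predicted offset vectors
    labels:       (N,)   primitive index of each point
    centers:      (K, 3) centers of the K primitives
    """
    gt_offsets = centers[labels] - points              # (N, 3) ground truth
    return np.mean(np.linalg.norm(pred_offsets - gt_offsets, axis=1))

def relation_loss(axes, parallel_pairs, perpendicular_pairs):
    """Penalize fitted primitive axes that violate detected relationships:
    |1 - |cos theta|| for parallel pairs and |cos theta| for perpendicular
    pairs, where theta is the angle between the two axis directions.

    axes: (K, 3) axis direction of each fitted primitive
    """
    axes = axes / np.linalg.norm(axes, axis=1, keepdims=True)  # unit vectors
    loss = 0.0
    for i, j in parallel_pairs:
        loss += abs(1.0 - abs(axes[i] @ axes[j]))      # 0 when parallel
    for i, j in perpendicular_pairs:
        loss += abs(axes[i] @ axes[j])                 # 0 when perpendicular
    n_pairs = len(parallel_pairs) + len(perpendicular_pairs)
    return loss / max(n_pairs, 1)
```

Both terms are zero when the predictions are consistent with the structure, so they can be added (with weights) to a segmentation loss during training.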
