Gao-Yang Zhang, Ying-Xi Chen, Han-Chao Li, Xin-Guo Liu. Efficient and Structure-Aware 3D Reconstruction via Differentiable Primitive Abstraction[J]. Journal of Computer Science and Technology. DOI: 10.1007/s11390-025-5239-9

Efficient and Structure-Aware 3D Reconstruction via Differentiable Primitive Abstraction

Reconstructing detailed 3D models from multi-view images often involves a trade-off between efficiency and fidelity. Existing methods based on volumetric representations or dense meshes can be computationally expensive, while primitive-based methods struggle to capture fine geometric details. We propose a novel method that addresses this challenge by combining differentiable primitive abstraction with adaptive mesh refinement. Our method first abstracts the scene into a set of cuboid primitives represented by analytical signed distance functions (SDFs), enabling part separability. This stage leverages differentiable volume rendering to efficiently optimize the primitives' poses and sizes. Subsequently, an automatic coarse-to-fine refinement procedure, guided by rendering loss, restores fine geometric details. Our approach yields high-quality, part-separable meshes of manageable complexity, suitable for applications requiring part manipulation and efficient rendering. We demonstrate the effectiveness of our method on the DTU, BlendedMVS, and Tanks&Temples datasets, achieving a better balance between mesh complexity and reconstruction fidelity compared to existing techniques.
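The abstract refers to cuboid primitives represented by analytical SDFs, with parts kept separable. A minimal sketch of what such a representation can look like is below; this assumes the standard analytical box SDF and a min-union over primitives, and is an illustration rather than the authors' implementation (the function names `cuboid_sdf` and `scene_sdf` are hypothetical):

```python
import numpy as np

def cuboid_sdf(p, half_extents):
    """Signed distance from point(s) p to an axis-aligned cuboid
    centered at the origin with the given half-extents.
    Negative inside, positive outside (standard box SDF)."""
    q = np.abs(p) - half_extents
    outside = np.linalg.norm(np.maximum(q, 0.0), axis=-1)
    inside = np.minimum(np.max(q, axis=-1), 0.0)
    return outside + inside

def scene_sdf(p, half_extents_list):
    """Min-union of several cuboid SDFs. Returning the argmin index
    alongside the distance is one way to keep parts separable."""
    d = np.stack([cuboid_sdf(p, h) for h in half_extents_list], axis=-1)
    return d.min(axis=-1), d.argmin(axis=-1)
```

Because each primitive's distance is an analytical function of its size (and, with a rigid transform of `p` into the primitive's frame, of its pose), gradients of a rendering loss can flow back to those parameters, which is what makes the abstraction stage differentiable.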
