Novel View Synthesis for Unbounded Scenes Based on Near-Distant Separation of Gaussian Point Clouds
Graphical Abstract
Abstract
Because efficient scene representation methods for unbounded scenes are currently lacking, the distant parts of such scenes are reconstructed at low quality, and novel view synthesis easily produces artifacts and floaters. To represent unbounded scenes efficiently and improve the quality of novel view synthesis, we propose DoubleGS, a novel view synthesis algorithm for unbounded scenes based on near-distant separation of Gaussian point clouds. First, we propose a Gaussian-point-cloud representation that separates the near and distant parts of the scene, improving reconstruction quality. Second, we initialize the distant point cloud by pre-training and prune the near point cloud, which raises the overall reconstruction quality and reduces artifacts and floaters in synthesized views. Finally, building on the point cloud splatting algorithm, we design and implement a two-stage differentiable Gaussian point cloud rendering pipeline that improves rendering performance. Experiments on several datasets of multi-view images of unbounded scenes show that DoubleGS achieves excellent PSNR, SSIM, and LPIPS scores while maintaining real-time rendering; compared with 3DGS, its average rendering time in unbounded scenes is reduced by 23%.
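To make the near-distant separation and two-stage rendering concrete, the following is a minimal sketch, not the authors' implementation: it partitions a point cloud by distance from an assumed scene center and composites a near render (with alpha) over an already-rendered distant background. The distance threshold, the random stand-in data, and all function names here are assumptions for illustration only.

```python
import numpy as np

def split_near_far(points: np.ndarray, center: np.ndarray, radius: float):
    """Partition points into a near set (inside a sphere around the scene
    center) and a distant set (outside it). The radius threshold is a
    hypothetical criterion; the paper's actual split may differ."""
    dist = np.linalg.norm(points - center, axis=1)
    near_mask = dist <= radius
    return points[near_mask], points[~near_mask]

def composite_over(near_rgba: np.ndarray, far_rgb: np.ndarray) -> np.ndarray:
    """Two-stage composite: blend the near render (per-pixel alpha)
    over the previously rendered distant background."""
    alpha = near_rgba[..., 3:4]
    return near_rgba[..., :3] * alpha + far_rgb * (1.0 - alpha)

# Illustrative usage with random data standing in for Gaussian centers
# and for the outputs of a (hypothetical) splatting rasterizer.
points = np.random.randn(1000, 3) * 10.0
near_pts, far_pts = split_near_far(points, center=np.zeros(3), radius=5.0)
far_rgb = np.random.rand(64, 64, 3)    # stage 1: distant point cloud render
near_rgba = np.random.rand(64, 64, 4)  # stage 2: near point cloud render
image = composite_over(near_rgba, far_rgb)
print(near_pts.shape, far_pts.shape, image.shape)
```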