no code implementations • CVPR 2023 • Jiabao Lei, Jiapeng Tang, Kui Jia
More specifically, we maintain an intermediate surface mesh used for rendering new RGBD views, which is subsequently completed by an inpainting network; each rendered RGBD view is then back-projected as a partial surface and merged into the intermediate mesh.
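The back-projection step can be sketched as follows, assuming a standard pinhole camera model; the function name and conventions here are illustrative, not the paper's actual implementation:

```python
import numpy as np

def backproject_depth(depth, K):
    """Back-project a depth map to a 3D point cloud in the camera frame.

    depth: (H, W) array of depths along the camera z-axis.
    K:     (3, 3) pinhole intrinsics [[fx, 0, cx], [0, fy, cy], [0, 0, 1]].
    Returns an (H*W, 3) array of points; zero-depth pixels map to the origin.
    """
    H, W = depth.shape
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(W), np.arange(H))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)
```

In the pipeline described above, points obtained this way would form the partial surface that gets supplemented into the intermediate mesh.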
1 code implementation • 20 Oct 2022 • Yongwei Chen, Rui Chen, Jiabao Lei, Yabin Zhang, Kui Jia
Creation of 3D content by stylization is a promising yet challenging problem in computer vision and graphics research.
1 code implementation • 18 Jun 2021 • Jiabao Lei, Kui Jia, Yi Ma
More specifically, we identify, among the linear regions partitioned by an MLP-based implicit function, the analytic cells and analytic faces associated with the function's zero-level isosurface.
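The key observation can be illustrated with a toy example: within any linear region of a ReLU MLP, the network reduces to a single affine map, so its zero-level set in that region is a plane (an "analytic face"). The weights below are random, for illustration only; this is a minimal sketch, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny ReLU MLP f: R^3 -> R with one hidden layer (hypothetical weights).
W1, b1 = rng.normal(size=(8, 3)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def mlp(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def local_affine(x):
    """Within the linear region containing x, the ReLU MLP equals an
    affine map w.y + c; derive (w, c) from the activation pattern at x."""
    pattern = (W1 @ x + b1 > 0).astype(float)  # which ReLUs are active
    W_eff = W2 @ (pattern[:, None] * W1)       # effective linear weights
    b_eff = W2 @ (pattern * b1) + b2           # effective bias
    return W_eff.squeeze(0), b_eff

x = rng.normal(size=3)
w, c = local_affine(x)
# The analytic face through this region is the plane {y : w.y + c = 0}.
assert np.allclose(mlp(x), w @ x + c)
```

Enumerating these per-region planes and clipping them against the region boundaries is, roughly, what allows a mesh of the isosurface to be recovered analytically rather than by sampling.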
1 code implementation • ICCV 2021 • Jiapeng Tang, Jiabao Lei, Dan Xu, Feiying Ma, Kui Jia, Lei Zhang
To this end, we propose to learn implicit surface reconstruction by sign-agnostic optimization of convolutional occupancy networks, to simultaneously achieve advanced scalability to large-scale scenes, generality to novel shapes, and applicability to raw scans in a unified framework.
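Sign-agnostic optimization fits an implicit function to raw, unoriented scans by matching the magnitude |f(x)| to an unsigned distance, so no ground-truth inside/outside labels are needed. A minimal sketch of such a loss, in the spirit of sign-agnostic learning (the toy setup is illustrative, not the paper's implementation):

```python
import numpy as np

def sign_agnostic_loss(f_vals, unsigned_dist):
    """Compare |f(x)| with the unsigned distance to the raw scan, so the
    loss is indifferent to the (unknown) orientation of the surface."""
    return np.mean(np.abs(np.abs(f_vals) - unsigned_dist))

# Two candidate fields differing only in global sign incur the same loss.
d = np.array([0.1, 0.0, 0.3])
f = np.array([0.1, 0.0, -0.3])
assert sign_agnostic_loss(f, d) == sign_agnostic_loss(-f, d)
```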
no code implementations • CVPR 2021 • Wenbin Zhao, Jiabao Lei, Yuxin Wen, JianGuo Zhang, Kui Jia
Motivated by the universal phenomenon that self-similar shape patterns of local surface patches repeat across the entire surface of an object, we push data-driven strategies forward and propose to learn a local implicit surface network for a shared, adaptive modeling of the entire surface, enabling direct surface reconstruction from a raw point cloud. We further exploit surface self-similarities by improving correlations among the optimized latent codes of individual surface patches.
1 code implementation • ICML 2020 • Jiabao Lei, Kui Jia
This paper studies the problem of learning surface meshes via implicit functions in the emerging field of deep-learning-based surface reconstruction, where implicit functions are popularly implemented as multi-layer perceptrons (MLPs) with rectified linear units (ReLU).