
Deep Marching Cubes: Learning Explicit Surface Representations

2018

Conference Paper



Existing learning-based solutions to 3D surface prediction cannot be trained end-to-end, as they operate on intermediate representations (e.g., TSDFs) from which 3D surface meshes must be extracted in a post-processing step (e.g., via the marching cubes algorithm). In this paper, we investigate the problem of end-to-end 3D surface prediction. We first demonstrate that the marching cubes algorithm is not differentiable and propose an alternative differentiable formulation which we insert as a final layer into a 3D convolutional neural network. We further propose a set of loss functions which allow for training our model with sparse point supervision. Our experiments demonstrate that the model allows for predicting sub-voxel accurate 3D shapes of arbitrary topology. Additionally, it learns to complete shapes and to separate an object's inside from its outside even in the presence of sparse and incomplete ground truth. We investigate the benefits of our approach on the task of inferring shapes from 3D point clouds. Our model is flexible and can be combined with a variety of shape encoder and shape inference techniques.
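To illustrate the non-differentiability argument from the abstract, here is a minimal sketch (not the paper's actual formulation; the function names are hypothetical): in classic marching cubes, the vertex position along a cube edge depends smoothly on the implicit values, but the choice of triangulation is a discrete table lookup on thresholded corner signs.

```python
# Minimal sketch of the two ingredients of classic marching cubes,
# showing which part blocks gradient flow (hypothetical helper names).

def edge_vertex_t(sdf_a, sdf_b, iso=0.0):
    """Interpolation parameter of the iso-surface crossing along an edge.
    Smooth in the input signed-distance values, so gradients can flow."""
    return (iso - sdf_a) / (sdf_b - sdf_a)

def cube_topology(sdf_corners, iso=0.0):
    """Marching cubes selects one of 256 triangulations by thresholding
    the eight corner signs: a discrete lookup that is piecewise constant,
    hence has zero gradient almost everywhere."""
    bits = [int(v < iso) for v in sdf_corners]
    return sum(b << i for i, b in enumerate(bits))  # table index in [0, 255]

# The vertex position responds smoothly to a small input perturbation ...
t0 = edge_vertex_t(-1.0, 1.0)           # exactly 0.5
t1 = edge_vertex_t(-1.0 + 1e-3, 1.0)    # shifts slightly
# ... but the topology index does not change at all (zero gradient).
idx0 = cube_topology([-1.0] * 4 + [1.0] * 4)
idx1 = cube_topology([-1.0 + 1e-3] * 4 + [1.0] * 4)
```

The paper's differentiable layer replaces this hard, per-cube topology decision with a formulation that admits gradients; the sketch above only shows why the classic algorithm needs that replacement.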

Author(s): Yiyi Liao and Simon Donné and Andreas Geiger
Book Title: IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Year: 2018
Publisher: IEEE Computer Society

Department(s): Autonomous Vision
Research Project(s): Deep Marching Cubes
Bibtex Type: Conference Paper (inproceedings)
Paper Type: Conference

Event Name: IEEE International Conference on Computer Vision and Pattern Recognition (CVPR) 2018
Event Place: Salt Lake City, USA

Links: pdf | suppmat | Video | Project Page

BibTex

@inproceedings{Liao2018CVPR,
  title = {Deep Marching Cubes: Learning Explicit Surface Representations},
  author = {Liao, Yiyi and Donn{\'e}, Simon and Geiger, Andreas},
  booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  publisher = {IEEE Computer Society},
  year = {2018}
}