This document summarizes the PU-Net architecture for point cloud upsampling. PU-Net takes sparse point clouds as input and uses a patch extraction network and feature embedding based on PointNet++ to learn point features at multiple scales. It then expands the features and reconstructs coordinates to generate upsampled point clouds. The network is evaluated on various 3D shape datasets, demonstrating state-of-the-art performance in upsampling accuracy and robustness to noise compared to other methods.
PU-Net: Point Cloud Upsampling Network (CVPR 2018)

Introduction
• Analogous to the super-resolution problem for 2D images.
• Inspired by PointNet++.
Architecture
• Patch Extraction: randomly select seed points and grow surface patches around them.
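A minimal sketch of the patch extraction step. The paper grows patches on the object surface; this illustrative version (function name and parameters are assumptions, not from the paper) uses Euclidean k-nearest neighbors around random seeds as a stand-in:

```python
import numpy as np

def extract_patches(points, num_patches=4, patch_size=64, seed=0):
    """Grow fixed-size patches around randomly chosen seed points.

    Illustrative sketch: seeds are drawn at random and each patch is
    the seed's k nearest neighbors. (The paper grows geodesic patches
    on the surface; Euclidean k-NN is a simplification.)
    """
    rng = np.random.default_rng(seed)
    seeds = rng.choice(len(points), size=num_patches, replace=False)
    patches = []
    for s in seeds:
        # Distance from the seed to every point; keep the closest k.
        d = np.linalg.norm(points - points[s], axis=1)
        patches.append(points[np.argsort(d)[:patch_size]])
    return np.stack(patches)  # (num_patches, patch_size, 3)

cloud = np.random.default_rng(1).normal(size=(5000, 3))
patches = extract_patches(cloud)
print(patches.shape)  # (4, 64, 3)
```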
• Point Feature Embedding: hierarchical feature learning as in PointNet++, but with smaller grouping radii.
• Multi-level Feature Aggregation: interpolate features from all levels (as in PointNet++) and concatenate them.
• Feature Expansion: expand the features r-fold with separate convolution branches, then reshape to rN points.
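The feature expansion step can be sketched as follows. This is a shape-level illustration only: the function name and dimensions are assumptions, and random weight matrices stand in for the paper's learned 1×1 convolutions:

```python
import numpy as np

def expand_features(features, r=4, out_dim=32, seed=0):
    """Sketch of r-fold feature expansion.

    Each of the r branches applies its own per-point linear map
    (random weights here as placeholders for learned 1x1 convolutions);
    branch outputs are stacked and reshaped from (r, N, out_dim)
    to (r*N, out_dim), i.e. rN expanded point features.
    """
    rng = np.random.default_rng(seed)
    n, c = features.shape
    branches = [features @ rng.normal(size=(c, out_dim)) for _ in range(r)]
    return np.stack(branches).reshape(r * n, out_dim)

feats = np.random.default_rng(1).normal(size=(256, 64))  # N=256 point features
expanded = expand_features(feats, r=4)
print(expanded.shape)  # (1024, 32)
```

Downstream, a per-point regressor reduces each expanded feature to a 3D coordinate, yielding the rN×3 output.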
• Coordinate Reconstruction: a series of fully connected layers reduces the features to rN×3 coordinates.

Datasets
• 60 collected models: 40 for training, with 100 patches cropped per model, and 20 for testing, with 5,000 points randomly chosen as input.
• SHREC15: 1,200 models from 50 categories; one model per category is chosen for testing.

Experiments
• NUC (normalized uniformity coefficient) measures point-distribution uniformity; lower is better.
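A simplified uniformity score in the spirit of NUC. The paper counts points falling in equal-area disks placed on the object surface; this sketch (function name and parameters are illustrative) approximates that with Euclidean balls around randomly chosen points and reports the spread of the per-ball counts:

```python
import numpy as np

def uniformity(points, num_disks=32, radius=0.2, seed=0):
    """Euclidean-ball approximation of a NUC-style score (lower = more uniform).

    Counts the points inside `num_disks` balls centered on randomly
    chosen cloud points and returns the standard deviation of the
    per-ball point fractions.
    """
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), num_disks, replace=False)]
    counts = [np.sum(np.linalg.norm(points - c, axis=1) < radius)
              for c in centers]
    return float(np.std(np.array(counts, dtype=float) / len(points)))

# A uniform planar grid vs. the same points with half collapsed into a cluster.
g = np.linspace(0.0, 1.0, 20)
grid = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)
uniform_pts = np.concatenate([grid, np.zeros((400, 1))], axis=1)
clustered = np.concatenate([uniform_pts[:200] * 0.2, uniform_pts[200:]])
print(uniformity(uniform_pts), uniformity(clustered))
```

The clustered cloud should score higher (worse), matching the intuition that NUC penalizes non-uniform point distributions.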
• Comparison with non-learning-based methods.
• Comparison with learning-based methods.
• Surface reconstruction.
• Noisy inputs.