ABC: Geometric Shape Segmentation - CVPR 2020

Organized by artonson


ABC Geometry Challenge: Geometric Shape Segmentation

In the geometric shape segmentation challenge, participants must predict which points of a 3D point cloud lie close to the feature lines of the underlying shape. In this dataset and challenge, feature lines are defined as surface curves along which the surface normal changes by at least 18°. We provide point clouds of CAD models, each obtained by randomly sampling 4K points from the surface. For the training set, ground truth binary segmentation masks derived from the CAD surface descriptions are given for all points; for the validation and test sets, these masks have to be estimated.
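To illustrate the 18° criterion, here is a minimal sketch (an assumed check for illustration, not the dataset-generation code) that flags an edge between two surface patches as part of a feature line when their unit normals differ by at least 18°:

    import numpy as np

    SHARP_ANGLE_DEG = 18.0  # threshold from the challenge description

    def is_feature_edge(n1, n2):
        """Return True if the angle between two unit surface normals
        meets the 18-degree feature-line threshold."""
        cos_angle = np.clip(np.dot(n1, n2), -1.0, 1.0)
        return np.degrees(np.arccos(cos_angle)) >= SHARP_ANGLE_DEG

    # Two faces meeting at a crease of about 20 degrees: a feature edge.
    n1 = np.array([0.0, 0.0, 1.0])
    n2 = np.array([np.sin(np.radians(20.0)), 0.0, np.cos(np.radians(20.0))])
    print(is_feature_edge(n1, n2))  # True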

For details about the other ABC Geometry Challenges and the workshop visit:

https://sites.google.com/view/dlgc-workshop-cvpr2020/home

Please refer to the following paper if you participate in this challenge or use the dataset for your approach:

@InProceedings{Koch_2019_CVPR,
author = {Koch, Sebastian and Matveev, Albert and Jiang, Zhongshi and Williams, Francis and Artemov, Alexey and Burnaev, Evgeny and Alexa, Marc and Zorin, Denis and Panozzo, Daniele},
title = {ABC: A Big CAD Model Dataset For Geometric Deep Learning},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2019}
}

Evaluation

We supply three datasets:

  • a training dataset with ground truth segmentation values for training your model;
  • a validation dataset without ground truth segmentation values, for which you can upload your estimations and receive feedback on your performance;
  • a testing dataset without ground truth segmentation values, for which you can submit your estimations without immediate feedback.

The evaluation on the testing dataset will result in the final score.

One binary (zero or one) segmentation value has to be estimated per point, and all estimated segmentation values for a point cloud have to be written into a text file that has the same index as the point cloud file, is suffixed by "_target", and lists the values in the same order as the points (see the example submission in the starting kit).
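As a sketch, one submission file could be written as follows (the index "00000000" and the point count are illustrative; follow the example submission in the starting kit for the exact naming and counts):

    import numpy as np

    # One binary value per point, in the same order as the points in the
    # corresponding point cloud file (placeholder estimates shown here).
    predictions = np.random.randint(0, 2, size=4096)

    # The output file carries the point cloud's index plus a "_target" suffix;
    # "00000000" is an illustrative index, not a real file name.
    np.savetxt("00000000_target.txt", predictions, fmt="%d")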

For evaluation, the estimated per-point segmentation values are compared to the ground truth segmentations with two measures: the balanced accuracy score and the intersection-over-union (IoU) score. See our starting kit for the Python evaluation code.
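The starting kit contains the authoritative evaluation code; as a rough sketch, the two measures can be reproduced with scikit-learn:

    import numpy as np
    from sklearn.metrics import balanced_accuracy_score, jaccard_score

    # Ground truth mask and estimated mask for one point cloud (toy values).
    y_true = np.array([0, 0, 1, 1, 0, 1])
    y_pred = np.array([0, 1, 1, 1, 0, 0])

    acc = balanced_accuracy_score(y_true, y_pred)  # balanced accuracy
    iou = jaccard_score(y_true, y_pred)            # intersection-over-union
    print("balanced accuracy: %.3f, IoU: %.3f" % (acc, iou))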

The final reported scores are the mean balanced accuracy and mean IoU over all point clouds, computed separately for each resolution (HighRes, MedRes, and LowRes). In total, six scores will be computed (a sketch of the aggregation follows the note below):

  • HighRes Accuracy (high_res_balanced_accuracy_score)
  • MedRes Accuracy (med_res_balanced_accuracy_score)
  • LowRes Accuracy (low_res_balanced_accuracy_score)
  • HighRes mIoU (high_res_iou_score)
  • MedRes mIoU (med_res_iou_score)
  • LowRes mIoU (low_res_iou_score)

Note: you can submit results separately for different resolutions (e.g., HighRes, MedRes, or LowRes only) to avoid evaluation timeouts.
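A minimal sketch of this aggregation, assuming per-cloud scores have already been computed (the score values below are placeholders, and this is not the official scoring code):

    import numpy as np

    # Per-cloud scores grouped by resolution (placeholder values).
    per_cloud = {
        "high_res": {"balanced_accuracy": [0.91, 0.88], "iou": [0.74, 0.69]},
        "med_res":  {"balanced_accuracy": [0.86, 0.84], "iou": [0.63, 0.61]},
        "low_res":  {"balanced_accuracy": [0.79, 0.77], "iou": [0.52, 0.49]},
    }

    # Each of the six reported scores is the mean over all point clouds.
    for res, metrics in per_cloud.items():
        for name, values in metrics.items():
            print("%s_%s_score: %.3f" % (res, name, np.mean(values)))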

Terms and Conditions

Participants can train or optimize their approach on the supplied training dataset and validate their performance on the validation dataset. To participate in the challenge, they submit the estimated results for the testing dataset before the submission end date. The final evaluation will be run on the estimated segmentation values for the testing data.

Download        Size (MB)   Phase
Starting Kit    0.025       #1 Development
Public Data     0.001       #1 Development
Public Data     0.001       #2 Final

Development

Start: June 1, 2020, midnight UTC

Description: Development phase: submit results for evaluation, with feedback provided on the validation set only.

Final

Start: Dec. 1, 2020, midnight UTC

Description: Final phase: submissions from the previous phase are automatically cloned and used to compute the final score, with feedback provided on the full test set.

Competition Ends

Dec. 31, 2020, 11:59 p.m. UTC
