COCO 2018 Stuff Segmentation Challenge

Organized by nightrome



The COCO Stuff Segmentation Task is designed to push the state of the art in semantic segmentation of stuff classes. Whereas the object detection task addresses thing classes (person, car, elephant), this task focuses on stuff classes (grass, wall, sky). For full details of the stuff segmentation task, please see the stuff evaluation page. Note: the newly introduced panoptic segmentation task addresses recognition of both thing and stuff classes simultaneously.

Things are objects with a specific size and shape that are often composed of parts. Stuff classes are background materials defined by homogeneous or repetitive patterns of fine-scale properties, but with no specific or distinctive spatial extent or shape. Why the focus on stuff? Stuff covers about 66% of the pixels in COCO. It lets us explain important aspects of an image, including the scene type, which thing classes are likely to be present and where, and the geometric properties of the scene. The COCO Stuff Segmentation Task builds on the COCO-Stuff project, as described on this website and in this research paper, and includes and extends the original dataset release. Please note that, in order to scale annotation, stuff segmentations were collected on superpixel segmentations of each image.

The stuff segmentation task will not be featured at the Joint COCO and Mapillary Recognition Challenge Workshop at ECCV 2018. Researchers may continue to submit to test-dev, but not to test-challenge, and results will not be presented at the workshop. Instead, we encourage researchers to participate in the new panoptic segmentation task which subsumes both stuff and instance segmentation and presents new challenges for the community.

From 2018 onward, this task includes all 164K COCO images (train 118K, val 5K, test-dev 20K, test-challenge 20K) with annotations for 91 stuff classes and 1 'other' class. Annotations for train and val are now available for download, while test set annotations will remain private. We provide annotations in the COCO JSON format as well as PNG pixel maps.
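In the COCO JSON format, each stuff region's segmentation is stored as a run-length encoding (RLE). As an illustration of the idea, here is a minimal sketch that decodes the *uncompressed* RLE variant with NumPy; the compressed RLE strings in the actual downloads should be handled with the COCO Stuff API or pycocotools rather than by hand, and the toy values below are invented for illustration.

```python
import numpy as np

def decode_uncompressed_rle(rle):
    """Decode a COCO-style uncompressed RLE dict into a binary mask.

    The counts alternate runs of 0s and 1s, starting with 0s, laid out in
    column-major (Fortran) order over an array of shape rle["size"] = [h, w].
    """
    h, w = rle["size"]
    flat = np.zeros(h * w, dtype=np.uint8)
    pos, val = 0, 0
    for run in rle["counts"]:
        flat[pos:pos + run] = val
        pos += run
        val = 1 - val                 # runs alternate between 0 and 1
    return flat.reshape((h, w), order="F")

# Toy 2x3 mask: runs of 1 zero, 2 ones, 3 zeros in column-major order.
rle = {"size": [2, 3], "counts": [1, 2, 3]}
mask = decode_uncompressed_rle(rle)
# mask is [[0, 1, 0], [1, 0, 0]]
```

The same column-major convention is used by the PNG pixel maps' companion tooling, which is why the reshape uses `order="F"`.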

This CodaLab evaluation server provides a platform to measure performance on the val, test-dev and test-challenge sets. The COCO Stuff API is provided to compute several performance metrics to evaluate semantic segmentation.

To participate, you can find instructions on the COCO website. In particular, please see the overview, challenge description, download, guidelines, evaluate, and leaderboard pages for more details.

The COCO Stuff API is used to evaluate results of the Stuff Segmentation Challenge. For an overview of the relevant files, see this page. The software compares candidate and reference segmentations and applies the following evaluation metrics to leaf categories and supercategories: mean intersection-over-union (IoU), frequency-weighted IoU, mean accuracy, and pixel accuracy. More details can be found on the challenge homepage.
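All four metrics can be derived from a single confusion matrix over classes. The following is a from-scratch sketch of the standard definitions, not the official COCO Stuff API implementation; the function name and the toy 2-class matrix are invented for illustration.

```python
import numpy as np

def stuff_metrics(conf):
    """Compute segmentation metrics from a confusion matrix.

    conf[i, j] counts pixels with ground-truth class i predicted as class j.
    """
    conf = conf.astype(np.float64)
    total = conf.sum()
    tp = np.diag(conf)            # correctly classified pixels per class
    gt = conf.sum(axis=1)         # ground-truth pixels per class
    pred = conf.sum(axis=0)       # predicted pixels per class
    iou = tp / (gt + pred - tp)   # per-class intersection over union
    return {
        "pixel_accuracy": tp.sum() / total,
        "mean_accuracy": (tp / gt).mean(),
        "mean_iou": iou.mean(),
        "fw_iou": (gt / total * iou).sum(),  # frequency-weighted IoU
    }

# Toy 2-class example: 3 correct and 1 wrong pixel for each class.
conf = np.array([[3, 1],
                 [1, 3]])
m = stuff_metrics(conf)
# pixel accuracy 0.75, mean accuracy 0.75, mean IoU 0.6, FW IoU 0.6
```

Note that this sketch assumes every class appears in the ground truth; absent classes would need masking to avoid division by zero.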

Please refer to the COCO Terms of Use.

examples

Start: June 30, 2018, midnight

Description: The two example images provided in the COCO Stuff API repository. The results file can be created by running the pngToCocoResultDemo script. Evaluation should take approximately 2 minutes.

val2017

Start: June 30, 2018, midnight

Description: The val evaluation server for stuff segmentation. Submissions must include annotations for exactly 5,000 images. Evaluation should take approximately 7 minutes.
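Since a submission must cover every image exactly, it can be worth sanity-checking a results file before uploading. A minimal sketch: the helper name and toy data below are hypothetical, but result entries do carry an `image_id` field in the COCO result format.

```python
def check_submission(results, expected_ids):
    """Compare the image ids covered by a results list against the expected set.

    Returns (missing, extra): ids with no result, and ids that should not
    be there. Both sets should be empty for a valid submission.
    """
    got = {ann["image_id"] for ann in results}
    return expected_ids - got, got - expected_ids

# Toy check: results cover images 1 and 2, but image 3 is expected too.
toy_results = [{"image_id": 1}, {"image_id": 2}]
missing, extra = check_submission(toy_results, {1, 2, 3})
# missing == {3}, extra == set()
```

For val2017, `expected_ids` would be the 5,000 val image ids taken from the downloaded annotations.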

test-dev2017

Start: June 30, 2018, midnight

Description: The test-dev evaluation server for stuff segmentation with 20,288 images. Evaluation should take less than 20 minutes.

Competition Ends

Jan. 1, 2050, midnight

# | Username            | Score
1 | Stuff17-ResNeXt-FPN | 0.5693
2 | Stuff17-G-RMI       | 0.5326
3 | Stuff17-Oxford-AVL  | 0.5193