BRATS 2012 - Multimodal Brain Tumor Segmentation Challenge

Organized by irjudson


Background

Because of their unpredictable appearance and shape, segmenting brain tumors from multi-modal imaging data is one of the most challenging tasks in medical image analysis. Although many different segmentation strategies have been proposed in the literature, it is hard to compare existing methods because the validation datasets that are used differ widely in terms of input data (structural MR contrasts; perfusion or diffusion data; ...), the type of lesion (primary or secondary tumors; solid or infiltratively growing), and the state of the disease (pre- or post-treatment).

In order to gauge the current state-of-the-art in automated brain tumor segmentation and compare between different methods, we are organizing a Multimodal Brain Tumor Segmentation (BRATS) challenge in conjunction with the MICCAI 2012 conference. For this purpose, we are making available a large dataset of brain tumor MR scans in which the tumor and edema regions have been manually delineated. In addition, we also provide realistically generated synthetic brain tumor datasets for which the ground truth segmentation is known.

Challenge Format

Teams wishing to participate in the challenge should download the training data for algorithmic tweaking and tuning. The teams should then evaluate their segmentation performance on the training data, and submit a short paper describing the results and the segmentation method that was used. On the day of the challenge itself, an independent set of test scans will be made available and analyzed on the spot by each team, after which the methods will be ranked according to their performance. The challenge day will conclude with a round-table discussion of the obtained results as well as invited talks by clinical experts.

In the weeks following the challenge, participating teams will be invited to contribute to a joint paper describing and summarizing the challenge outcome, which we will then submit to a high-impact journal in the field.

A FAQ with more details can be found here.

An online tool for comparing automated segmentations with reference ones is available from either the Virtual Skeleton Database evaluation page or from Kitware/MIDAS. The tool computes the following segmentation performance metrics for both edema and active tumor:

  • Dice
  • Jaccard
  • Sensitivity
  • Specificity
  • Average closest distance
  • Hausdorff distance
  • Cohen's kappa
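The overlap-based metrics above can be computed directly from two label volumes. The following NumPy sketch is illustrative only (it is not the official evaluation tool, which additionally reports the distance-based metrics and Cohen's kappa); the toy arrays use the BRATS label convention of 1 for edema and 2 for active tumor:

```python
import numpy as np

def overlap_metrics(seg, truth, label):
    """Dice, Jaccard, sensitivity, and specificity for one label."""
    s = (seg == label)
    t = (truth == label)
    tp = np.logical_and(s, t).sum()   # true positives
    fp = np.logical_and(s, ~t).sum()  # false positives
    fn = np.logical_and(~s, t).sum()  # false negatives
    tn = np.logical_and(~s, ~t).sum() # true negatives
    dice = 2 * tp / (2 * tp + fp + fn)
    jaccard = tp / (tp + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return dice, jaccard, sensitivity, specificity

# toy 1-D example: 0 = background, 1 = edema, 2 = active tumor
truth = np.array([0, 1, 1, 2, 2, 0])
seg   = np.array([0, 1, 2, 2, 2, 0])
print(overlap_metrics(seg, truth, 2))
```

The same function applies unchanged to full 3-D volumes, since NumPy's boolean operations are elementwise regardless of array shape.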

Detailed instructions on how to use the evaluation tools are provided below.

Using the on-line evaluation tool at the Virtual Skeleton Database

Automated segmentations should be uploaded directly to the evaluation page. A user account is required - if you don't have one, you can create one from the BRATS2012 starting page. The following guidelines should be followed:

  • Use the MHA file type to store your segmentations (not MHD).
  • Keep the same labels as the provided truth.mha: 1 for edema, 2 for active tumor, and 0 for everything else.
  • To match your segmentations with the correct ground truth, name your segmentations according to this template: VSD.your_description.###.mha, where ### is the ID of the corresponding FLAIR scan in the Virtual Skeleton Database. If you downloaded the training data from the Virtual Skeleton Database, this ID appears in the FLAIR file name in the subdirectory corresponding to the subject in question (e.g., the file name VSD.Brain.XX.O.MR_Flair.687.mha corresponds to ID 687). Otherwise, please refer to the following table to match the subject (last column) with the correct FLAIR ID (second column): OpenDocument Spreadsheet file format, Microsoft Excel file format, or PDF.
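Since the VSD ID is the number immediately before the .mha extension of the FLAIR file name, deriving the required submission name can be scripted. A small sketch (the description string "my_method" is a placeholder; ID 687 is the example from the guideline above):

```python
def submission_name(flair_filename, description):
    """Derive the required submission file name from a FLAIR scan's name.

    The VSD ID is the number just before the .mha extension, e.g.
    VSD.Brain.XX.O.MR_Flair.687.mha -> ID 687.
    """
    parts = flair_filename.split(".")
    assert parts[-1] == "mha", "segmentations must be stored as MHA"
    vsd_id = parts[-2]
    return "VSD.{}.{}.mha".format(description, vsd_id)

print(submission_name("VSD.Brain.XX.O.MR_Flair.687.mha", "my_method"))
# VSD.my_method.687.mha
```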

Using the on-line evaluation tool at Kitware/MIDAS

Please follow the detailed instructions at the Kitware/MIDAS site.

Submission Requirements

Teams wishing to participate in the challenge on 1 October 2012 should submit a short paper describing their segmentation method, as well as the results obtained on the training data, before 3 July 2012. Teams accepted to the on-site challenge will then be notified by the organizers before 6 July 2012.

Each team's paper should be formatted in the LNCS style and be 2-4 pages long. It should describe the method used, indicate whether or not human interaction is required, and include an estimate of the running time for a single patient (note that slow methods can also participate in the on-site challenge - see the FAQ for details). In addition, the paper should also include a table summarizing the segmentation performance on the training data, obtained using the online evaluation tool provided by the organizers. Since the reference segmentations of the training cases are made publicly available, it is the authors' own responsibility to design a cross-validation experiment (i.e., to test their method on cases other than the ones used to tweak and tune algorithmic settings), and to describe their experiments and results accordingly. The final ranking of the competing algorithms will be based only on the performance on the independent set of test scans that will be made available on-site!
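One simple way to keep tuning and testing cases separate is a subject-level k-fold split, so that no subject contributes to both sets of any fold. A minimal sketch (the subject IDs, fold count, and seed are illustrative, not prescribed by the challenge):

```python
import random

def kfold_subject_splits(subject_ids, k=5, seed=0):
    """Yield k (train, test) splits with no subject in both halves."""
    ids = list(subject_ids)
    random.Random(seed).shuffle(ids)  # fixed seed for reproducibility
    folds = [ids[i::k] for i in range(k)]  # k roughly equal-sized folds
    for i in range(k):
        test = folds[i]
        train = [s for s in ids if s not in test]
        yield train, test

# illustrative: 10 hypothetical subject IDs, 5 folds
for train, test in kfold_subject_splits(range(10), k=5):
    print("train:", sorted(train), "test:", sorted(test))
```

Reporting the training-data table as the average over such held-out folds gives a fairer estimate of how a method will behave on the unseen on-site test scans.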

Data License

The BRATS training and testing data are made freely available through the Creative Commons Attribution-NonCommercial 3.0 license. Please include the following language in any work using the BRATS data:

"Brain tumor image data used in this work were obtained from the MICCAI 2012 Challenge on Multimodal Brain Tumor Segmentation (http://www.imm.dtu.dk/projects/BRATS2012) organized by B. Menze, A. Jakab, S. Bauer, M. Reyes, M. Prastawa, and K. Van Leemput. The challenge database contains fully anonymized images from the following institutions: ETH Zurich, University of Bern, University of Debrecen, and University of Utah."

Please see the FAQ for additional details.

Schedule

  • Testing starts: May 1, 2012, 7 a.m. UTC
  • Training starts: Aug. 1, 2012, 7 a.m. UTC
  • Challenge starts: Oct. 1, 2012, 7 a.m. UTC
  • Competition ends: Oct. 2, 2012, 7 a.m. UTC
