NeurIPS 2020 Hide-and-Seek Privacy Challenge

Organized by tavianator


The clinical time-series setting poses a unique combination of challenges to data modeling and sharing. Due to the high dimensionality of clinical time series, adequate de-identification to preserve privacy while retaining data utility is difficult to achieve using common de-identification techniques. An innovative approach to this problem is synthetic data generation. From a technical perspective, a good generative model for time-series data should preserve temporal dynamics, in the sense that new sequences respect the original relationships between high-dimensional variables across time. From the privacy perspective, the model should prevent patient re-identification by limiting vulnerability to membership inference attacks. The NeurIPS 2020 Hide-and-Seek Privacy Challenge is a novel two-tracked competition to simultaneously accelerate progress in tackling both problems. In our head-to-head format, participants in the synthetic data generation track (i.e. “hiders”) and the patient re-identification track (i.e. “seekers”) are directly pitted against each other by way of a new, high-quality intensive care time-series dataset: the AmsterdamUMCdb dataset. Ultimately, we seek to advance generative techniques for dense and high-dimensional temporal data streams that are (1) clinically meaningful in terms of fidelity and predictivity, as well as (2) capable of minimizing membership privacy risks in terms of the concrete notion of patient re-identification.
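To illustrate the membership inference threat model that seekers exploit, here is a minimal sketch of a distance-based attack: records used to train a generator tend to lie closer to its synthetic output than records that were not. This is only an illustrative heuristic under assumed array shapes (candidates as `(n, time, features)` arrays), not the challenge's baseline or evaluation code:

```python
import numpy as np

def nn_membership_scores(synthetic, candidates):
    """Score each candidate record by its Euclidean distance to the
    nearest synthetic record. Lower score = more likely the candidate
    was in the generator's training set."""
    # Flatten each (time, features) series into a single vector.
    syn = synthetic.reshape(len(synthetic), -1)
    cand = candidates.reshape(len(candidates), -1)
    # Pairwise distances: one row per candidate, one column per synthetic record.
    dists = np.linalg.norm(cand[:, None, :] - syn[None, :, :], axis=-1)
    return dists.min(axis=1)

def predict_members(synthetic, candidates, n_members):
    """Guess that the n_members candidates closest to the synthetic
    data were members of the training set."""
    scores = nn_membership_scores(synthetic, candidates)
    return np.argsort(scores)[:n_members]
```

A hider's goal is precisely to make such attacks uninformative: the synthetic data should be no closer to training records than to other plausible records drawn from the same population.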

For evaluation details, see the paper introducing the challenge: https://arxiv.org/abs/2007.12087

  • Employees of the van der Schaar Lab, Microsoft Research Cambridge, and Amsterdam UMC may participate but are ineligible for prizes.
  • Participants who already have access to the AmsterdamUMCdb dataset will be required to declare this and will be ineligible for prizes.
  • Participants who are ineligible for prizes will not contribute to the scores of other teams.
  • To be eligible for the final scoring, participants are required to release the code of their submissions as open source.
  • If a submission does not run successfully, it is the participant’s responsibility to debug it. Participants will be allowed to attempt to submit at most once per day.
  • Generation algorithms may only use the public data to define and tune hyper-parameters of their algorithm but may not use the public data to initialise/pre-train a model.
  • Each generative algorithm will be required to run within a specific time on a given GPU.
  • Each re-identification algorithm will be required to run within a specific time on a given GPU.

Development

Start: June 15, 2020, midnight UTC

Description: Submissions for the generation (“hider”) and re-identification (“seeker”) tracks.

Evaluation

Start: Nov. 15, 2020, midnight UTC

Description: Final phase.

Competition Ends

Nov. 30, 2020, midnight UTC

Leaderboard

#  Username                       Score
1  lumip                          0.50
2  yingjialin                     9.99
3  hns_baseline_binary_predictor  9.99