Feature Selection Challenge

Organized by lukasz.romaszko

Current phase: test (started Feb. 1, 2015, midnight UTC)

Competition ends: never
Overview

This challenge mimics the NIPS 2003 Feature Selection Challenge: it allows you to make post-challenge submissions on test data to benchmark new methods. The original challenge ended on December 12, 2003.

Go here for more details (an archive of the original website):
http://web.archive.org/web/20130512034606/http://www.nipsfsc.ecs.soton.ac.uk/datasets

The Challenge

The aim of the feature selection challenge is to find feature selection algorithms that significantly outperform methods using all features, on ALL five benchmark datasets. To make it easy to enter results for all five datasets, every task is a two-class classification problem. You can download the datasets in the Participate section.

Dataset    Size     Type            Features  Training Examples  Validation Examples  Test Examples
Arcene     8.7 MB   Dense           10000     100                100                  700
Gisette    22.5 MB  Dense           5000      6000               1000                 6500
Dexter     0.9 MB   Sparse integer  20000     300                300                  2000
Dorothea   4.7 MB   Sparse binary   100000    800                350                  800
Madelon    2.9 MB   Dense           500       2000               600                  1800

The overall score is the average of the AUC scores across the five datasets.
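The averaging can be sketched as follows; this is a minimal pure-Python illustration (not the official scoring program), using the rank-based Mann-Whitney formulation of AUC and made-up labels and confidences for two of the datasets:

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney statistic: the fraction of
    (positive, negative) pairs ranked correctly, ties counting 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == -1]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative +1/-1 labels and confidence values; the overall score
# averages the per-dataset AUCs (here over two datasets, in reality five).
per_dataset = {
    "arcene":  auc([1, 1, -1, -1], [0.9, 0.4, 0.3, -0.8]),
    "madelon": auc([1, -1, 1, -1], [0.9, 0.95, 0.3, -0.8]),
}
overall = sum(per_dataset.values()) / len(per_dataset)
```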

Submission Description

New format:

You can submit a single .predict file per dataset instead of the .resu and .conf pair. Each line of a .predict file should contain the confidence that the corresponding label is positive.
The BER metric will also be calculated, thresholding the confidences at 0.
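The BER (balanced error rate) averages the error rate on each class, so it is insensitive to class imbalance. A rough sketch of the computation, assuming +1/-1 labels and treating a confidence of exactly 0 as negative (an assumption, since the page does not specify tie handling):

```python
def ber(labels, confidences, threshold=0.0):
    """Balanced Error Rate: mean of the per-class error rates after
    thresholding the confidences into hard +1/-1 predictions."""
    preds = [1 if c > threshold else -1 for c in confidences]
    pos_err = [p != y for y, p in zip(labels, preds) if y == 1]
    neg_err = [p != y for y, p in zip(labels, preds) if y == -1]
    return 0.5 * (sum(pos_err) / len(pos_err) + sum(neg_err) / len(neg_err))
```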

The submission should consist of 5 .predict files (e.g.: arcene_test.predict) in a zip archive, without extra directories. You can optionally include .feat files. This sample submission contains random results.
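Packing a new-format submission can be scripted. A sketch using Python's zipfile, writing the five .predict files at the archive root with one confidence value per line (the random values here are placeholders, like the sample submission):

```python
import random
import zipfile

# Test-set sizes from the dataset table above.
test_sizes = {"arcene": 700, "gisette": 6500, "dexter": 2000,
              "dorothea": 800, "madelon": 1800}

# Write datasetname_test.predict entries directly at the archive root
# (no extra directories), one confidence per line.
with zipfile.ZipFile("submission.zip", "w") as zf:
    for name, size in test_sizes.items():
        lines = "\n".join(str(random.uniform(-1, 1)) for _ in range(size))
        zf.writestr(f"{name}_test.predict", lines + "\n")
```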

Old format:

Submission in a format of the original contest should consist of 3 x 5 files in a zip archive:
datasetname_test.resu
datasetname_test.conf
datasetname.feat

See the sample submission in the original format.


The AUC scores appear on the leaderboard; you can view the BER scores and the feature statistics of your submission by clicking 'View scoring output log'.


Challenge Rules

  • General Terms: This challenge is governed by the General ChaLearn Contest Rule Terms, the Codalab Terms and Conditions, and the specific rules set forth below.
  • Announcements: To receive announcements and be informed of any changes to the rules, participants must provide a valid email address.
  • Conditions of participation: Participation requires complying with the rules of the challenge. Prize eligibility is restricted by US government export regulations, see the General ChaLearn Contest Rule Terms. The organizers, sponsors, their students, close family members (parents, sibling, spouse or children) and household members, as well as any person having had access to the truth values or to any information about the data or the challenge design giving him (or her) an unfair advantage, are excluded from participation. A disqualified person may submit one or several entries in the challenge and request to have them evaluated, provided that they notify the organizers of their conflict of interest. If a disqualified person submits an entry, this entry will not be part of the final ranking and does not qualify for prizes. The participants should be aware that ChaLearn and the organizers reserve the right to evaluate for scientific purposes any entry made in the challenge, whether or not it qualifies for prizes.

This challenge is brought to you by ChaLearn. Contact the organizers.

