2018 Visual Domain Adaptation (VisDA2018) Open-Set Classification Challenge

Organized by nkaushik


It is well known that the success of machine learning methods on visual recognition tasks is highly dependent on access to large labeled datasets. Unfortunately, performance often drops significantly when the model is presented with data from a new deployment domain which it did not see in training, a problem known as dataset shift. The VisDA challenge aims to test domain adaptation methods’ ability to transfer source knowledge and adapt it to novel target domains.

For details and instructions on how to participate, please visit the VisDA challenge website, where you can download the datasets and development kits. This challenge includes two tracks:

Participants are welcome to enter in one or both tracks.

For evaluation metrics and instructions on how to format submissions, please see the challenge ReadMe.
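The authoritative metric is defined in the challenge ReadMe; as a rough illustration only, open-set benchmarks of this kind are commonly scored by mean per-class accuracy, where the "unknown" (open-set) category counts as one additional class and every class contributes equally regardless of its frequency. A minimal sketch, assuming that metric and hypothetical class labels:

```python
# Illustrative only: mean per-class accuracy with an explicit "unknown" class.
# The official metric and label set are specified in the challenge ReadMe.
from collections import defaultdict

def mean_per_class_accuracy(y_true, y_pred):
    """Average of per-class accuracies; each class (including "unknown")
    is weighted equally, so rare classes matter as much as common ones."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        correct[t] += int(t == p)
    return sum(correct[c] / total[c] for c in total) / len(total)

# Tiny hypothetical example: two known classes plus the open-set label.
y_true = ["car", "car", "plane", "unknown", "unknown"]
y_pred = ["car", "plane", "plane", "unknown", "car"]
print(round(mean_per_class_accuracy(y_true, y_pred), 3))  # 0.667
```

Here "car" is 1/2 correct, "plane" 1/1, and "unknown" 1/2, so the mean is 2/3 even though overall accuracy is 3/5; this is why a model that simply ignores the unknown class is penalized.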

You will have the option to make your results private or public after submission. The leaderboard will show your CodaLab username and your team name. You may submit from one or multiple accounts for a single team, as long as your team affiliation is clear when submitting results. Consult CodaLab's instructions for forming user teams, either at the profile or competition level. Your team as a whole must adhere to the daily and total submission limits specified in the "Participate" section.

The main leaderboard shows the results of adapted models and will be used to determine the final team ranks. The expanded leaderboard additionally shows each team's source-only models, i.e., models trained only on the source domain without any adaptation. These results are useful for estimating how much a method improves on its source-only baseline, but they will not be used to determine team ranks.

Training and Validation Data Released

Start: May 19, 2018, midnight UTC

Testing Data Released

Start: Aug. 1, 2018, midnight UTC

Competition Ends

Aug. 28, 2018, 5 a.m. UTC
