AIM 2020 Image Extreme Inpainting Challenge - Track 2: Semantic Guidance - Forum


> Final test phase, submission of results, ranking - guidelines [challenge ends July 17, 23:59 UTC]

Dear AIM 2020 challenge participant,

Here are the guidelines for the final test phase.
Please carefully check the "Learn the details" and "Participate -> Get Data" sections of your specific competition and track, as well as the provided templates and instructions.

You are considered a participant in the final test phase if you have a valid entry on the online test server.
The test server reports flat results for all entries (all metrics are set to 0) and the test leaderboard is hidden; the final results will be computed offline by the organizers and released to the test phase participants after the end of the challenge.
The validation data and/or the validation server are meant only for validation and not for training.

For your solution to be ranked and included in the challenge report, and for its authors to be invited to co-author that report, you need to ensure the reproducibility of the results and provide, by email, the full test results, the code/executables with instructions, and a detailed description of your solution in a factsheet.

The final submission should be emailed to ALL organizers of the respective challenge (see the "Learn the details -> Organizers" tab).

The title of the mail should follow the pattern: [COMPETITION_NAME] - [TEAM_NAME]
For example, for
COMPETITION_NAME = AIM 2020 Learned Smartphone ISP Challenge - Track 1
TEAM_NAME = Superman
the email title is: [AIM 2020 Learned Smartphone ISP Challenge - Track 1] - [Superman]

The body of the email shall include the following (a hypothetical example is given after the list):

a) the challenge full name
b) team name
c) team leader's name and email address (team's contact)
d) rest of the team members
e) team members with AIM 2020 sponsors
f) team name and user names on the AIM 2020 CodaLab competitions
g) executable/source code attached or download links
h) factsheet attached (you should use the updated factsheet of the competition!)
i) download link(s) to the FULL results for ALL of the test frames/images (corresponding to the last entry on the online CodaLab test server and to the provided code/executable)
(NOTE that for some competitions it is critical to get the FULL image results as the entry on the online test server is limited to a subset of (cropped) image results)
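For illustration only, here is a hypothetical email body reusing the example team from above, with placeholder names, contact details, links and user names (it is not an official template):

a) AIM 2020 Learned Smartphone ISP Challenge - Track 1
b) Superman
c) Clark Kent, clark.kent@example.com
d) Lois Lane, Jimmy Olsen
e) none
f) Superman; CodaLab user names: ckent, llane
g) code: https://example.com/superman_code.zip (or attached)
h) factsheet attached: Superman_factsheet.pdf and Superman_factsheet_tex.zip
i) FULL test results: https://example.com/superman_results.zip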

The executable/source code should include the trained models and/or necessary parameters so that we can run it and reproduce the results. There should be a README or description that explains how to execute the code. The factsheet must be provided as a compiled PDF file together with a zip of the corresponding .tex source files; this greatly helps us in compiling the final report. Please provide a detailed explanation of your solution in the factsheet.
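As a rough sketch only (the exact layout is up to you, as long as the README makes the steps to reproduce the results clear; all names below are placeholders), a submission package could be organized as:

Superman_submission/
    README.txt                 - environment/dependencies and the exact commands to reproduce the test results
    code/                      - source code or executable, including the trained models and parameters
    Superman_factsheet.pdf     - compiled factsheet
    Superman_factsheet_tex.zip - the corresponding .tex source files
    results_link.txt           - download link(s) to the FULL test results, if they are too large to attach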

Note that you can still submit to the validation server for those competitions in which the validation ground truth images were not released.

Please check the competition pages carefully (including "Participate -> Get Data") and make sure that you are following the protocol, using the right templates, and providing results for the correct test images.

Please use the limited number of submissions to the CodaLab test server carefully. Only the last entry will be considered for the ranking.

As a participant in the challenge, you are also invited to submit a paper related to the challenge (by topic and/or using the challenge data/results) to the AIM 2020 workshop. Please follow the instructions at https://data.vision.ee.ethz.ch/cvl/aim20/ ; the deadline is July 29, 2020.
Note that writing a paper is not mandatory for being ranked and invited to co-author the challenge report. Ensuring the reproducibility of the results and providing a detailed factsheet for the proposed solution is, however, a must.

Success,
Challenge organizers

Posted by: Radu @ July 11, 2020, 5:16 p.m.