Dear AIM 2019 challenge participant,
Here are the guidelines for the final test phase.
Due to delays in providing you with the necessary details/data for the final test phase, we have postponed the deadline for submitting test results and descriptions/factsheets to September 11, 23:59 UTC.
You are considered a participant in the final test phase if you have a valid entry on the online test server.
The test server shows placeholder results for all entries (all metrics are set to 0); the final results will be computed offline by the organizers and released to the test-phase participants after the end of the challenge.
The validation data and/or the validation server are meant only for validation and not for training.
For your solution to be ranked and included in the challenge report, and for its authors to be invited to co-author the report, you need to ensure the reproducibility of your results and provide full test results, code/executables with instructions, and a detailed description of your solution in a factsheet sent by email.
The final submission should be emailed to all organizers of the respective challenge (see the "Learn the details" -> "Organizers" tab).
The title of the mail should follow the pattern: [COMPETITION_NAME] - [TEAM_NAME]
For example, for
COMPETITION_NAME = AIM 2019 Extreme Super-Resolution Challenge
TEAM_NAME = Superman
the email title is: [AIM 2019 Extreme Super-Resolution Challenge] - [Superman]
The body of the email shall include the following:
a) the challenge full name
b) team name
c) team leader's name and email address
d) rest of the team members
e) team members affiliated with AIM2019 sponsors
f) team name and user names on AIM2019 CodaLab competitions
g) executable/source code attached or download links
h) factsheet attached (you should use the updated factsheet of the competition!)
i) download link(s) to the FULL results of ALL of the test frames (corresponding to the last entry in the online Codalab test server and provided codes/exe)
The executable/source code should include the trained models and/or necessary parameters so that we can run it and reproduce the results. There should be a README or description that explains how to execute the code. The factsheet must be a compiled PDF file together with a zip of the corresponding .tex source files. Please provide a detailed explanation.
Note that you can still submit to the validation server in those competitions for which the validation ground-truth images were not released.
Please check the competition pages carefully (including "Participate->Get Data") and make sure that you are following the protocol, using the right templates, and providing results for the correct test images.
Please use the limited number of submissions to the CodaLab server carefully. Only the last entry will be considered for ranking.
As a participant in the challenge, you are also invited to submit a paper related to the challenge (by topic and/or using challenge data/results) to the AIM 2019 workshop. For this, please follow the instructions at http://www.vision.ee.ethz.ch/aim19/ ; the deadline is September 20, 2019.
Note that writing a paper is not mandatory for being ranked and invited to co-author the challenge report. However, ensuring the reproducibility of the results and providing a detailed factsheet for the proposed solution are a must.
Since we can't see the scores in the test phase, we are not sure whether we submitted the test results successfully. In the development phase we encountered a situation where we submitted a result but it failed to produce a score.
Is there any way we can verify the submission?
Thank you!
Posted by: SenseSloMo @ Sept. 9, 2019, 3:30 a.m.
We do calculate PSNR and SSIM internally; they are just not shown.
If your submission appears in the table, you can be assured it was evaluated successfully.