We are approaching the deadline for submitting your best results to the XMRec CUP@WSDM’22 challenge. We would like to remind you of a few important points as we move forward.
- All team members should register via the Team Registration Form (https://bit.ly/3DFc3JR).
- The last day for submitting your results is January 17, 2022.
- You are required to provide code to the organizers that can validate your entry in the leaderboard. We will add a reproducibility badge on the leaderboard for any team that sends us working code for which our tests reproduce exactly the same results as on the leaderboard. Fill out the XMRec Code Submission Form for every submission that you want to share with us at https://bit.ly/3EuHKGm. We highly encourage participants to share their code by adding our GitHub account (@xmrec) to their private GitHub repository, so that they can make the repository public after the competition ends.
- We kindly ask you to start cleaning up a working version of your system and share it with us in advance. Please don’t hesitate to reach out to us with any questions regarding this step.
- You are also required to provide a paragraph of text along with your implementation submission, serving as an abstract of your solution. Please use at most 500 words for your abstract.
- Please note that any team failing to provide a validating implementation will lose its place in the leaderboard. In addition, the top teams are required to send us a working system, as well as a technical report, to be eligible for the prize.
The final deadline for sending your code is one week after the last day for submitting your results, i.e., January 24, 2022.
Given that the implementation-sharing step can be time-consuming on both ends, with possible back-and-forth interactions, we call on every team to send us their best working system early and get feedback before the final system submission. Please note that for your final implementation submission, you may send at most two systems, selected for performance and research value.
What to send? We ask that the implementation be sent as a Python script, assuming that the data is read from the “DATA” directory in the same format as in the starting kit. Each run should be accompanied by a Docker image that contains all the required packages. We encourage teams to push their Docker images to Docker Hub. Moreover, each submission should include a README file describing how to execute the code for both training and validation, together with the name of the Docker image. Please note that we will retrain the models and validate them on the test sets; therefore, no pre-trained models are needed. Also note that the system will not have access to the Internet while training/validating the model, so all requirements must already be included in the Docker image.
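As an illustration, an offline training/validation run could be wrapped as follows. This is only a sketch: the image name, script name, and container paths are hypothetical; only the “DATA” directory and the no-Internet requirement come from the instructions above.

```shell
# Build the image locally with every dependency baked in,
# since the evaluation machine has no Internet access.
docker build -t teamname/xmrec-submission .

# Optionally push the image to Docker Hub so the organizers can pull it.
docker push teamname/xmrec-submission

# Train and validate with networking disabled (--network none)
# and the DATA directory mounted read-only into the container.
docker run --network none \
    -v "$PWD/DATA:/workspace/DATA:ro" \
    teamname/xmrec-submission \
    python train.py
```

The README accompanying the submission would then simply reference the image name and the exact commands for the training and validation runs.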
System configuration: the submitted systems must produce the output in a reasonable time window on our system with the following configuration:
- CPU: Intel(R) Xeon(R) CPU E5-2630 v3 @ 2.40GHz
- RAM: 32 GB
- GPU: GeForce GTX TITAN X
Below, please find information about the badges that we will add to the runs on the final leaderboard:
- Reproducibility: This badge indicates that the organizers managed to train and test the submission and achieved the same results as indicated on the leaderboard.
- Public: This badge indicates that the submission’s code is publicly available.
- Cross-market: This badge indicates that the submission utilizes the cross-market data provided by the organizers.
- External data: This badge indicates that the submission utilizes external data or pre-trained models, other than the data that was provided by the organizers.
- Academic: This badge indicates that the team is from an academic group (rather than industry).
- Team Registration Form: https://bit.ly/3DFc3JR
- Model Submission Form: https://bit.ly/3EuHKGm