ICDAR 2021 On-Line Signature Verification Competition (SVC 2021)

Organized by BiDA Lab (UAM)

OVERVIEW

The goal of the ICDAR 2021 On-Line Signature Verification Competition (SVC 2021) is to evaluate the limits of on-line signature verification systems using large-scale public databases and popular scenarios (office/mobile), with the stylus/finger as writing input. On-line signature verification technology has been evolving fast in recent years due to several factors, such as: i) the evolution of the acquisition technology, from the original Wacom devices specifically designed to acquire handwriting and signatures in office-like scenarios through a pen stylus, to the current touch screens of mobile scenarios in which signatures can be captured anywhere using a personal smartphone and the finger; and ii) the extended use of deep learning technology in many different areas, overcoming traditional handcrafted approaches and even human performance.

Therefore, the goal of this competition is to carry out a benchmark evaluation of the latest on-line signature verification technology using large-scale public databases such as DeepSignDB, considering both traditional office-like scenarios (pen stylus) and the challenging mobile scenarios in which signatures are performed with the finger over a touch screen. The SVC 2021 competition will provide a complete panorama of the state of the art in the on-line signature verification field under realistic scenarios.

TASKS

Task 1: Analysis of office scenarios using the stylus as input.

Task 2: Analysis of mobile scenarios using the finger as input.

Task 3: Analysis of both office and mobile scenarios simultaneously.

In addition, both random and skilled forgeries will be considered in each Task.

AWARDS

If enough interest is received from the community (more than 5 different participants beating the baseline algorithms), then the winner of the SVC 2021 competition will receive a monetary award of 300 EUR (as an Amazon gift card). In addition, a selection of the best on-line signature verification systems will have the opportunity to take part as co-authors in a joint paper describing the SVC 2021 competition results.

REGISTRATION

The platform used in the SVC 2021 competition is CodaLab. Participants need to register to take part in the competition. Please follow these instructions:

1) Fill in this form including your information.

2) Sign up in CodaLab using the same email address introduced in step 1).

3) Join the ICDAR 2021 On-Line Signature Verification Competition in CodaLab. Just click on the “Participate” tab for the registration.

Anonymous participants: participants are allowed to decide at the end of the competition whether or not to include their names and affiliations in the competition report. Nevertheless, the organizers might still include the results and some information about the systems, always completely anonymized.

EVALUATION CRITERIA

The SVC 2021 competition follows a points-based classification. Each Task will be evaluated separately, with three winners per Task receiving the corresponding points (gold medal: 3, silver medal: 2, and bronze medal: 1). The participant/team that obtains the most points in total (Tasks 1, 2, and 3) will be the winner of the competition.

The evaluation metric considered will be the popular Equal Error Rate (EER, %).
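For reference, below is a minimal sketch of how the EER could be computed from genuine and impostor similarity scores (assuming higher scores indicate genuine signatures; the scores and function shown are illustrative, not the official scoring program):

```python
import numpy as np

def compute_eer(genuine_scores, impostor_scores):
    """Return the Equal Error Rate (%) for similarity scores (higher = more genuine)."""
    # Sweep every observed score as a candidate decision threshold.
    thresholds = np.sort(np.unique(np.concatenate([genuine_scores, impostor_scores])))
    best_diff, eer = np.inf, 0.0
    for t in thresholds:
        far = np.mean(impostor_scores >= t)  # False Acceptance Rate
        frr = np.mean(genuine_scores < t)    # False Rejection Rate
        if abs(far - frr) < best_diff:
            best_diff, eer = abs(far - frr), (far + frr) / 2.0
    return 100.0 * eer

# Hypothetical scores, for illustration only.
genuine = np.array([0.91, 0.85, 0.78, 0.88])
impostor = np.array([0.30, 0.55, 0.80, 0.42])
print(f"EER: {compute_eer(genuine, impostor):.2f}%")
```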

EXPERIMENTAL PROTOCOL

Development:

  • Training: the organizers will provide the participants with the DeepSignDB database once they send us the corresponding license agreements (see details here). In addition, the participants can freely use other databases to train their systems.
  • Validation: the organizers will allow the participants to test their trained systems in CodaLab using a validation dataset similar to the final evaluation one. This validation dataset (together with the signature comparison files of each Task) will be sent to the authors (without ground-truth) after receiving the corresponding license agreements of DeepSignDB, as it is based on the evaluation dataset of DeepSignDB (442 users). In this stage of the competition, participants are allowed to submit their validation results up to 300 times (for all Tasks together). Results will be updated in CodaLab in real time, following an on-going competition format.

Final Evaluation: the organizers will ask the participants to submit their final evaluation results using CodaLab. The final evaluation dataset (together with the final signature comparison files) will be sent to the authors (without ground-truth) at the end of the competition. Participants are allowed to submit the results achieved by up to 3 different signature verification approaches/models. No feedback on the results achieved will be provided at this stage of the competition. The organizers will evaluate the participants’ files after the submission deadline, using the ground-truth. The results and ground-truth dataset will be released a few days after the end of the competition (to avoid overfitting and malicious behavior).

The validation and evaluation protocols will consider the two types of forgeries (random and skilled) together, so the systems should be robust against both.

For more information regarding the submission process with CodaLab, please read the Submission Format tab.

The organizers might verify the truthfulness of the submitted scores if necessary.

ON-GOING RESULTS

As mentioned before, this is an on-going competition. Participants can test their developed systems using the validation dataset (without ground-truth) and the corresponding signature comparison files through the CodaLab platform of the competition. Validation results are updated in CodaLab in real time!

SCHEDULE

Tentative dates:

15th November 2020: Beginning of the competition. Registration opens and development data are released.

8th March 2021: Registration deadline.

15th March 2021: Final Evaluation data release (without ground-truth).

17th March 2021: End of the competition. Results submission deadline.

24th March 2021: Notification of the results. Ground-truth release.

25th March 2021: Selection and notification of the best systems to take part in a joint paper.

SUBMISSION FORMAT

VALIDATION PHASE

A valid submission for CodaLab is a zip-compressed file including the .txt files containing the predictions made for each task you want to participate in (i.e., one .txt file per task).

The signature comparison files (one .txt file per task) provided together with the validation dataset must be used to obtain the predictions.

Note that even if you upload results from multiple submissions to the leaderboard, only your latest submission is displayed there. During the validation period, each team can make up to 300 submissions (for all 3 tasks together). However, only the final submission will be considered as the official submission to the competition (make sure to upload it to the leaderboard). This means that your final submission must contain your entries for all the tasks you want to participate in.

The .txt files included in the submitted zip-compressed file must follow this naming convention:

  • Task 1: “task1_predictions.txt”
  • Task 2: “task2_predictions.txt”
  • Task 3: “task3_predictions.txt”

 

In case you want to participate in only one task (e.g., Task 1), submit the zip-compressed file including only the .txt file associated with that task (e.g., task1_predictions.txt).

Finally, each prediction .txt file is expected to contain one prediction per row (a single column), with the same number of rows as comparisons included in the corresponding signature comparison file provided (one .txt per task); see the sketch after the list below. The following links provide:

  • Examples of the possible comparison files that we will include inside the validation dataset [link]. The first column represents the enrolment signature and the second one the test signature.
  • The zip-compressed file expected in CodaLab, with the .txt prediction files [link].
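As a rough illustration of the expected format, the sketch below reads a comparison file, writes one score per row, and zips the result. The filename task1_comparisons.txt and the score_pair() function are hypothetical placeholders; use the actual comparison files provided and your own verification model:

```python
import zipfile

def score_pair(enrolment_file, test_file):
    """Placeholder: return the similarity score from your verification model."""
    raise NotImplementedError

# Read the comparison pairs (enrolment signature, test signature).
with open("task1_comparisons.txt") as f:  # hypothetical filename
    pairs = [line.split() for line in f if line.strip()]

# Write one prediction per row, in the same order as the comparison file.
with open("task1_predictions.txt", "w") as f:
    for enrolment, test in pairs:
        f.write(f"{score_pair(enrolment, test)}\n")

# Zip only the prediction files of the tasks you participate in.
with zipfile.ZipFile("submission.zip", "w") as z:
    z.write("task1_predictions.txt")
```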

FINAL EVALUATION PHASE

The submission procedure followed in the final evaluation phase will be very similar to that of the validation stage. The final evaluation dataset (together with the final signature comparison files) will be sent to the authors (without ground-truth) at the end of the competition. Participants are allowed to submit the results achieved by up to 3 different signature verification approaches/models. To do so, the participants should submit a single zip-compressed file with up to three folders, each folder including the prediction .txt files associated with one signature verification approach/model. The following link provides an example of the final zip-compressed file expected [link].
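For illustration, the final zip-compressed file could be organized as follows (the folder names approach1/approach2/approach3 are hypothetical; see the linked example for the exact layout expected):

```
final_submission.zip
├── approach1/
│   ├── task1_predictions.txt
│   ├── task2_predictions.txt
│   └── task3_predictions.txt
├── approach2/
│   ├── task1_predictions.txt
│   └── task3_predictions.txt
└── approach3/
    └── task2_predictions.txt
```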

ORGANIZERS

Biometrics and Data Pattern Analytics - BiDA Lab (UAM)

CONTACT

For further information, please contact: svc2021.contact@gmail.com

Web Admin and Local Arrangements: Santiago Rengifo de la Cruz

 

PUBLIC FUNDING

SPONSOR