BEA 2019 Shared Task: Grammatical Error Correction
Welcome to the BEA 2019 Shared Task on Grammatical Error Correction: Restricted Track. Please check the shared task website if you are looking for a different track.
On this website, you will be able to evaluate your system output on the development set and make a submission for the test set when the test phase opens. After the shared task has finished, this website will remain open so that anyone can evaluate their system output and compare their performance with that of other teams.
The competition is composed of three phases: development, test and open.
During the development phase, participants will be able to evaluate their systems on the W&I+LOCNESS development set to help them develop their systems. This will also give participants the opportunity to become more familiar with the Codalab submission procedure in advance of the test phase. The development phase will remain open indefinitely and participants can make as many submissions as they like (up to 999). For each successful submission, participants will be able to see their overall and detailed results.
During the test phase, participants will be able to make up to 2 submissions per team on the official W&I+LOCNESS test set between 25th March 2019 00:00 UTC and 29th March 2019 23:59 UTC. This is the official evaluation phase of the shared task, so participants will not be able to see their results until after the end of the phase, when the official results are announced. They can, however, at least check that their submission status is "Finished" (i.e. successful).
During the open phase, participants will continue to be allowed to make submissions on the official W&I+LOCNESS test set; however, these submissions will not be included in the official BEA 2019 shared task results. The main aim of the open phase is to encourage future work on GEC beyond the shared task. The open phase will remain open indefinitely and participants can make as many submissions as they like (up to 999). In addition to being able to see their own overall and detailed results, participants will also be able to add their scores to the leaderboard if they want to make them publicly available.
Although we do not require participants to register for the shared task, Codalab requires registration in order to participate. To do this, go to the "Participate" tab, accept the terms and conditions and click "Register". Registration is automatic, so you do not need to wait for approval (despite what Codalab says).
Before making a submission, we strongly recommend that you set up a name for your team. The easiest way to do this is to go to your Codalab profile settings and fill in the "Team name" field under the "Competition settings" section. Note, however, that this is a global team name that will be used for all the competitions you participate in.
If you prefer setting up a specific team name for this competition, you must first register as described above and then create a new team via the "Team" tab that will appear. This is a more advanced option that will let you use a logo and add members to your team, etc. If you make use of this option, please make sure you have properly set up your team name before making a submission, as it is not possible to assign a competition-specific team name to a submission after it has been made.
Full instructions on how to set up team names can be found here.
First of all, make sure that you have used the right input data for the phase you are submitting to. Links to the development and test data are provided at the top of this page. These files contain the tokenised input sentences that your system should correct. The output file produced by your system should contain corrected versions of the same sentences, also tokenised and one per line. For example:
Input: Travel by bus is exspensive , bored and annoying .
Output: Travelling by bus is expensive , boring and annoying .
Input files were tokenised using spaCy v1.9.0 and the en_core_web_sm-1.2.0 model, so we strongly recommend that you use the same settings for producing your output files.
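If your system produces untokenised text, you can tokenise it with spaCy before writing your output file. The sketch below is illustrative only: it uses a blank English pipeline (which carries spaCy's rule-based tokenizer without needing a downloaded model) as a stand-in, since the v1.9.0 loading API differs from modern releases and its tokenization may differ slightly.

```python
import spacy

# A blank English pipeline provides spaCy's rule-based tokenizer without
# requiring a statistical model. The shared task used spaCy v1.9.0 with
# en_core_web_sm-1.2.0, so treat this as an approximation of its output.
nlp = spacy.blank("en")

def tokenise(sentence):
    """Return the sentence with tokens separated by single spaces."""
    return " ".join(token.text for token in nlp(sentence))

print(tokenise("Travelling by bus is expensive."))
# → Travelling by bus is expensive .
```

Each line of your output file should then be one such tokenised, corrected sentence.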
In order to make a submission to this competition, you must first zip your output file. The name of the file is not important, but you should make sure that the zip file contains only a single file and no directories or subdirectories.
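The packaging step can be scripted with Python's standard zipfile module. A minimal sketch, assuming hypothetical file names (output.txt, submission.zip):

```python
import zipfile

# Write one corrected, tokenised sentence per line (file name is a placeholder).
corrections = [
    "Travelling by bus is expensive , boring and annoying .",
]
with open("output.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(corrections) + "\n")

# The archive must contain exactly one file at the top level:
# no directories or subdirectories.
with zipfile.ZipFile("submission.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write("output.txt", arcname="output.txt")

# Sanity-check the archive contents before uploading.
with zipfile.ZipFile("submission.zip") as zf:
    assert zf.namelist() == ["output.txt"]
```

Passing a bare file name as `arcname` guarantees the entry sits at the top level of the archive rather than under a directory path.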
To upload your submission zip file, go to the "Participate" tab and click on the phase you want to submit to. If the phase is open, you will see a form with a "Submit" button to upload your submission. You should be able to see the status of your submission at different stages. "Submitted" means your file has been uploaded but not evaluated yet. Do not worry if your submission remains in this state for a while; Codalab will eventually score it. During evaluation, the status will change to "Running". If evaluation is successful, the "Status" column will show "Finished"; otherwise, it will show "Failed" and not count towards your total number of submissions.
If you are submitting a file during the development or open phase, you will be able to view your main F0.5 score ("Score" column), overall results ("View scoring output log"), detailed results ("View detailed results") and any error messages ("View scoring error log"). During the open phase, you can add your results to the public leaderboard by clicking on the "Submit to Leaderboard" button.
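The main metric is span-based F0.5, which emphasises precision over recall (precision is weighted twice as heavily). Using the standard F-beta formula, with made-up counts purely for illustration:

```python
def f_beta(precision, recall, beta=0.5):
    """F-beta score; beta < 1 weights precision more heavily than recall."""
    if precision == 0 and recall == 0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical example: a system proposes 100 edits, 50 of which are correct,
# against 200 gold-standard edits.
p = 50 / 100   # precision = 0.5
r = 50 / 200   # recall = 0.25
print(round(f_beta(p, r), 4))  # → 0.4167
```

Note how the score (0.4167) sits much closer to precision (0.5) than to recall (0.25), reflecting the metric's preference for systems that avoid bad corrections.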
Start: March 4, 2019, midnight
Description: The development phase allows participants to evaluate their systems on the development set and make sure that their output is in the right format before the test phase begins.
Start: March 25, 2019, midnight
Description: The test phase is the official evaluation phase of the shared task. Submissions will be evaluated on the official W&I+LOCNESS test set. Participants will only be allowed to make a maximum of 5 submissions in total during this phase. Out of these 5 submissions, only the best (in terms of span-based correction F0.5) will be shown on the leaderboard at the end of the test phase.
Start: March 30, 2019, midnight
Description: The open phase allows people to continue submitting system output on the test set after the end of the shared task. This should facilitate fairer system comparison of future work.