Evaluating grammatical error corrections

Organized by cnapoles


Evaluation

Start: Nov. 23, 2016, midnight UTC

Description: Evaluate corrections of the CoNLL-2014 shared task test set with GLEU, GLEU interpolated with LT (lambda optimized for Spearman's rho), and M2. GLEU and M2 are calculated using all available references. View the scoring output log to see the scores from other metrics and reference sets.

If your submission fails, please make sure that you have uploaded a zipped file containing answer.txt. Occasionally, CodaLab raises a permissions error for reasons the CodaLab developers have not been able to determine. If you receive this error, please resubmit (or contact the organizers to rerun the scorer over your existing submission). We apologize for this inconvenience!
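Since most submission failures come down to packaging, here is a minimal sketch of building a valid archive. The file names follow the description above (a zip containing answer.txt); the sentence content and the archive name submission.zip are hypothetical examples.

```python
import zipfile

# Hypothetical example output: one corrected sentence per line.
with open("answer.txt", "w") as f:
    f.write("This is a corrected sentence .\n")

# The scorer expects answer.txt at the top level of the uploaded zip.
with zipfile.ZipFile("submission.zip", "w") as zf:
    zf.write("answer.txt")
```

Upload the resulting submission.zip through the competition's participation page.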

Competition Ends

Never
