The shared task on DRS parsing is co-located with IWCS 2019, held in Gothenburg, Sweden, on 23-27 May.
DRS parsing is a semantic parsing task in which the meaning of a natural language text is automatically converted into a Discourse Representation Structure (DRS), a semantic representation with a long history in formal semantics.
A DRS is the meaning representation introduced in Discourse Representation Theory (DRT), a well-studied formalism developed in formal semantics (Kamp, 1984; Van der Sandt, 1992; Kamp and Reyle, 1993; Asher, 1993; Kadmon, 2001).
DRSs can model many challenging semantic phenomena, for example, quantifiers, negation, pronoun resolution, presupposition accommodation, and discourse structure. Concepts are represented by WordNet synsets, and relations by VerbNet roles.
L. Abzianidze, J. Bjerva, K. Evang, H. Haagsma, R. van Noord, P. Ludmann, D. Nguyen, J. Bos (2017): The Parallel Meaning Bank: Towards a Multilingual Corpus of Translations Annotated with Compositional Meaning Representations. In Proceedings of EACL.
Competition submissions will be evaluated by computing the micro-average F-score on matching clauses between the system output and the gold standard.
An animated illustration of the evaluation procedure (restart the animation with shift+reload):
Before the clause-matching procedure, the CLF referee checks whether the system-produced clausal forms (CLFs) are well-formed and replaces any ill-formed CLF with a dummy CLF that never matches.
Therefore, make sure you fix your system's ill-formed CLFs before submission.
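The clause-level scoring described above can be sketched as follows. This is a simplified illustration, not the official Counter tool: it treats each DRS as a plain set of clause strings and skips the variable-renaming search that the real matcher performs.

```python
def clause_f_score(system_drss, gold_drss):
    """Micro-average F-score over matching clauses of paired DRSs.

    Each DRS is a set of clause strings; totals are accumulated over
    the whole corpus before computing precision, recall, and F-score.
    """
    matched = sys_total = gold_total = 0
    for sys_drs, gold_drs in zip(system_drss, gold_drss):
        matched += len(sys_drs & gold_drs)  # clauses found in both
        sys_total += len(sys_drs)
        gold_total += len(gold_drs)
    precision = matched / sys_total if sys_total else 0.0
    recall = matched / gold_total if gold_total else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# An ill-formed CLF is replaced by a dummy that can never match:
DUMMY = {"b0 DUMMY"}
```

For example, a system that gets one two-clause DRS exactly right and produces one ill-formed (dummy-replaced) DRS against a one-clause gold DRS scores precision 2/3 and recall 2/3, hence F-score 2/3.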
By participating in this shared task you agree to the following terms and conditions. If a participant violates any of these conditions, the task organizers reserve the right to ban the participant from the task and withhold the scores of their systems.
All you need for a quick start is available in the DRS parsing repository.
In short, the repository contains:
Contact the organizers by joining the discussion group.
Participants need to submit a zip file with a single file clfs.txt inside it. The clfs.txt file must be encoded in UTF-8, contain a list of DRSs separated by a blank line and follow the order in the corresponding raw file.
Each DRS has to be formatted in clause notation, i.e., represented as a set of clauses (order and repetitions do not matter). Each line contains one clause. Comments start with %, and any text after this sign is not considered part of the clause (see the gold data for examples).
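Reading a file in this format can be sketched as below, a minimal helper (the function name is illustrative, not part of the official toolkit): DRSs are blocks of clause lines separated by blank lines, and `%` starts a comment.

```python
def read_clfs(lines):
    """Group clause lines into DRSs.

    DRSs are separated by blank lines; anything after '%' on a line
    is a comment and is stripped.
    """
    drss, current = [], []
    for line in lines:
        line = line.split("%", 1)[0].strip()  # drop the comment part
        if line:
            current.append(line)
        elif current:          # blank line ends the current DRS
            drss.append(current)
            current = []
    if current:                # flush the last DRS
        drss.append(current)
    return drss

# Usage with a submission file:
# with open("clfs.txt", encoding="utf-8") as f:
#     drss = read_clfs(f)
```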
A quick anatomy of a clausal form of a sample DRS:
A clause is a triple or a quadruple whose components are separated by whitespace.
The first component of a clause is always a variable standing for a box label (remember the box format of a DRS!) and the second component is always a functor that determines the type of the clause. The rest of the components are either variables or constants (the latter enclosed in double quotes).
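The anatomy described above can be sketched as a tiny parser (an illustrative sketch, not part of the official toolkit; the example clause is hypothetical):

```python
def parse_clause(line):
    """Split a clause line into (box_label, functor, args).

    A clause is a whitespace-separated triple or quadruple: a box
    label, a functor determining the clause type, and one or two
    further components that are variables or double-quoted constants.
    """
    parts = line.split()
    if len(parts) not in (3, 4):
        raise ValueError("a clause must be a triple or a quadruple")
    box_label, functor, *args = parts
    return box_label, functor, args
```

For instance, a role clause such as `b1 Agent e1 x1` splits into the box label `b1`, the functor `Agent`, and the arguments `e1` and `x1`.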
For more details:
To facilitate communication between the task organizers and participants, we have set up a discussion group on Slack:
Go to the link above, create a Slack account simply by entering your email address and a password, and you are ready to go.
We created several channels to keep the communication tidy:
#codalab for the issues related to this competition website, e.g., problems with registration or submission;
#general basically for everything that is related to the shared task and is not specific to the CodaLab site;
#random for conversations not really related to the shared task as such, e.g., talks about weather, food or football.
| Date | Milestone |
| --- | --- |
| 20 Dec 2018 | Final data released (before the competition phase) |
| 25-28 Feb 2019 | Competition phase |
| 04 Mar 2019 | Results posted |
| 15 Mar 2019 | System description paper due by 11:59pm UTC-12 |
| 01 April 2019 | Notification of acceptance |
| 15 April 2019 | Camera-ready due |
| 23-27 May 2019 | IWCS main conference |
You can get the data from the Get Data page under the Participate tab.
Go to your submit/results page, click the toggle next to the score you want to show, and press [Submit to Leaderboard]. A mark will then appear next to the submitted score.
Start: Oct. 1, 2018, midnight
Description: During the pre-competition phase you can make any number of submissions of your system output. After submitting a system output, you should see either a score or an error produced by the evaluation script. The same evaluation script will be used in the competition phase.
Start: Feb. 25, 2019, midnight
Start: March 1, 2019, midnight