Paper | Code | F1 | Model Name | Release Date |
---|---|---|---|---|
Character-level Representations Improve DRS-based Semantic Parsing Even in the Age of BERT | ✓ Link | 88.3 | Bi-LSTM seq2seq: BERT + characters in 1 encoder | 2020-11-09 |
Discourse Representation Structure Parsing with Recurrent Neural Networks and the Transformer Model | | 87.1 | Transformer seq2seq | 2019-05-01 |
Linguistic Information in Neural Semantic Parsing with Multiple Encoders | | 86.8 | Character-level bi-LSTM seq2seq + linguistic features | 2019-05-01 |
Exploring Neural Methods for Parsing Discourse Representation Structures | ✓ Link | 83.3 | Character-level bi-LSTM seq2seq | 2018-10-30 |
Semantic Graph Parsing with Recurrent Neural Network DAG Grammars | | 76.4 | Neural graph-based system using DAG grammars | 2019-09-30 |
Transition-based DRS Parsing Using Stack-LSTMs | | 74.4 | Transition-based Stack-LSTM | 2019-05-01 |