OpenCodePapers

Question Answering on CNN / Daily Mail
Leaderboard
| Paper | Code | CNN | Daily Mail | Model Name | Release Date |
|---|---|---|---|---|---|
| Linguistic Knowledge as Memory for Recurrent Neural Networks | | 78.6 | | GA+MAGE (32) | 2017-03-07 |
| Gated-Attention Readers for Text Comprehension | ✓ Link | 77.9 | 80.9 | GA Reader | 2016-06-05 |
| A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task | ✓ Link | 77.6 | 79.2 | Attentive + relabeling + ensemble | 2016-06-09 |
| Bidirectional Attention Flow for Machine Comprehension | ✓ Link | 76.9 | 79.6 | BiDAF | 2016-11-05 |
| Iterative Alternating Neural Attention for Machine Reading | ✓ Link | 76.1 | | AIA | 2016-06-07 |
| Text Understanding with the Attention Sum Reader Network | ✓ Link | 75.4 | 77.7 | AS Reader (ensemble model) | 2016-03-04 |
| ReasoNet: Learning to Stop Reading in Machine Comprehension | | 74.7 | 76.6 | ReasoNet | 2016-09-17 |
| Attention-over-Attention Neural Networks for Reading Comprehension | ✓ Link | 74.4 | | AoA Reader | 2016-07-15 |
| Natural Language Comprehension with the EpiReader | | 74.0 | | EpiReader | 2016-06-07 |
| Dynamic Entity Representation with Max-pooling Improves Machine Reading | | 72.9 | | Dynamic Entity Repres. + w2v | 2016-06-01 |
| A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task | ✓ Link | 72.4 | 75.8 | AttentiveReader + bilinear attention | 2016-06-09 |
| Text Understanding with the Attention Sum Reader Network | ✓ Link | 69.5 | 73.9 | AS Reader (single model) | 2016-03-04 |
| Teaching Machines to Read and Comprehend | ✓ Link | 69.4 | | MemNNs (ensemble) | 2015-06-10 |
| A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task | ✓ Link | 67.9 | 68.3 | Classifier | 2016-06-09 |
| Teaching Machines to Read and Comprehend | ✓ Link | 63.8 | 68.0 | Impatient Reader | 2015-06-10 |
| Teaching Machines to Read and Comprehend | ✓ Link | 63.0 | 69.0 | Attentive Reader | 2015-06-10 |
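The scores above are answer accuracies on the cloze-style CNN / Daily Mail task: each question is a news summary sentence with an anonymized entity marker (e.g. `@entity1`) blanked out, and the model must pick which entity from the document fills the blank. A minimal sketch of how such predictions are scored, together with the pointer-style "attention sum" pooling used by the AS Reader (per-token attention mass is summed over repeated mentions of the same entity); all data and names below are made up for illustration:

```python
# Sketch of cloze-style scoring on CNN / Daily Mail-style data, plus the
# Attention Sum pooling from the AS Reader: attention over document tokens
# is pooled per candidate entity marker. Toy data, not the real dataset.
from collections import defaultdict

def attention_sum(doc_tokens, token_probs, candidates):
    """Pool per-token attention into per-candidate scores; return argmax."""
    scores = defaultdict(float)
    for tok, p in zip(doc_tokens, token_probs):
        if tok in candidates:
            scores[tok] += p
    return max(scores, key=scores.get)

def accuracy(predictions, answers):
    """Leaderboard metric: percentage of exactly matched entity answers."""
    correct = sum(p == a for p, a in zip(predictions, answers))
    return 100.0 * correct / len(answers)

# "@entity1" appears twice, so its attention mass is summed: 0.25 + 0.30
# beats @entity0's single 0.30, even though 0.30 is the largest single peak.
doc = ["@entity0", "met", "@entity1", "in", "@entity1", "yesterday"]
probs = [0.30, 0.05, 0.25, 0.05, 0.30, 0.05]
pred = attention_sum(doc, probs, {"@entity0", "@entity1"})
print(pred)                            # @entity1
print(accuracy([pred], ["@entity1"]))  # 100.0
```

The pooling step is what distinguishes pointer-style readers (AS Reader, GA Reader) from classifiers that score each entity once; repeated mentions of the correct entity reinforce its score.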