🐑🐑 PECoRe @ ICLR 2024
Resources for the paper "Quantifying the Plausibility of Context Reliance in Neural Machine Translation" (Sarti et al., 2024), published at ICLR 2024.
arXiv: 2310.01188
Note: Published version available at https://openreview.net/forum?id=XTHfNGI3zT
🐑 PECoRe (Space, running on ZeroGPU)
Analyze context usage in LM generations with model internals
Note: Demo showcasing PECoRe usage with the `inseq attribute-context` CLI for decoder-only and encoder-decoder models.
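The sketch below is not the full PECoRe pipeline (context-sensitive target identification followed by contextual cues imputation, which the `inseq attribute-context` command runs end to end); it only shows how one of the context-aware models from this collection can be loaded with the Inseq Python library to attribute a contextual translation. The `<brk>` context separator and the `saliency` attribution method are assumptions for illustration; check the model cards and the Inseq documentation for the exact settings.

```python
# Minimal sketch (not the full PECoRe pipeline): attribute a contextual
# translation with Inseq to inspect how a context-aware model uses the
# preceding sentence. Assumptions: "<brk>" separates context from the
# current sentence in the context-mt checkpoints, and "saliency" is used
# as the attribution method.
import inseq

model = inseq.load_model(
    "context-mt/scat-marian-small-ctx4-cwd1-en-fr",  # source-context model from this collection
    "saliency",
)

out = model.attribute(
    # Context sentence followed by the current sentence to translate.
    "Did you see my dog? <brk> Yes, I saw it near the park.",
    attribute_target=True,  # also attribute target-side (decoder) tokens
)
out.show()  # visualize the attribution scores
```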
gsarti/iwslt2017_context
Note: IWSLT 2017 dataset with document-level IDs. The English-French portion was used for context-aware MT training.
inseq/scat
Note: SCAT+ dataset used for further fine-tuning and for evaluation on anaphoric pronouns.
inseq/disc_eval_mt
Note: DiscEval-MT dataset used for evaluation on anaphora resolution and lexical choice (see the loading sketch below).
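All three datasets can be pulled directly from the Hub with the `datasets` library, as sketched below. The configuration names are assumptions for illustration; check each dataset card for the available configurations and splits.

```python
# Loading sketch for the datasets in this collection.
# Configuration names are assumptions; verify them on each dataset card.
from datasets import load_dataset

# IWSLT 2017 with document-level IDs (English-French portion used for training).
iwslt = load_dataset("gsarti/iwslt2017_context", "iwslt2017-en-fr")

# SCAT+ anaphoric-pronoun data used for fine-tuning and evaluation.
scat = load_dataset("inseq/scat")

# DiscEval-MT set for anaphora resolution ("lexical-choice" is the other
# assumed configuration).
disc_eval = load_dataset("inseq/disc_eval_mt", "anaphora")

print(iwslt)
print(scat)
print(disc_eval)
```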
Helsinki-NLP/opus-mt-en-fr
Note: Opus MT Small (default)
context-mt/scat-marian-small-ctx4-cwd1-en-fr
Note: Opus MT Small, source context only
context-mt/scat-marian-small-target-ctx4-cwd0-en-fr
Note: Opus MT Small, source and target-side contexts (see the usage sketch below)
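As referenced above, a minimal usage sketch for the context-aware Marian checkpoints: context and current sentence are passed as a single input, with `<brk>` assumed here as the context-break marker introduced during fine-tuning (verify the exact input format on the model cards).

```python
# Sketch: context-aware translation with a fine-tuned Marian checkpoint.
# The "<brk>" context separator is an assumption; see the model card for
# the exact input format expected by the checkpoint.
from transformers import pipeline

translator = pipeline(
    "translation",
    model="context-mt/scat-marian-small-ctx4-cwd1-en-fr",
)

# Context sentence and current sentence in a single input string.
contextual_input = "Did you see my dog? <brk> Yes, I saw it near the park."
print(translator(contextual_input)[0]["translation_text"])
```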
Helsinki-NLP/opus-mt-tc-big-en-fr
Note: Opus MT Big (default)
context-mt/scat-marian-big-ctx4-cwd1-en-fr
Note: Opus MT Big, source context only
context-mt/scat-marian-big-target-ctx4-cwd0-en-fr
Note: Opus MT Big, source and target-side contexts
facebook/mbart-large-50-one-to-many-mmt
Note: mBART 1-to-50 (default)
context-mt/scat-mbart50-1toM-ctx4-cwd1-en-fr
Note: mBART 1-to-50, source context only
context-mt/scat-mbart50-1toM-target-ctx4-cwd0-en-fr
Note: mBART 1-to-50, source and target-side contexts (see the usage sketch below)
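As referenced above, a minimal sketch for the fine-tuned mBART-50 checkpoints, which additionally need explicit source and target language codes. The language-code handling follows the base mBART-50 model card, and the `<brk>` separator is again an assumption carried over from the Marian sketch.

```python
# Sketch: context-aware translation with a fine-tuned mBART-50 checkpoint.
# Language codes follow the base mBART-50 convention; the "<brk>" separator
# is an assumption, to be checked against the model card.
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

model_id = "context-mt/scat-mbart50-1toM-ctx4-cwd1-en-fr"
tokenizer = MBart50TokenizerFast.from_pretrained(model_id, src_lang="en_XX")
model = MBartForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer(
    "Did you see my dog? <brk> Yes, I saw it near the park.",
    return_tensors="pt",
)
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["fr_XX"],  # force French output
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```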