---
license: apache-2.0
datasets:
- asset
- wi_locness
- GEM/wiki_auto_asset_turk
- discofuse
- zaemyung/IteraTeR_plus
- jfleg
language:
- en
metrics:
- sari
- bleu
- accuracy
---

# Model Card for CoEdIT-Large

This model was obtained by fine-tuning the corresponding google/flan-t5-large model on the CoEdIT dataset. Details of the dataset can be found in our paper and repository.

**Paper:** CoEdIT: Text Editing by Task-Specific Instruction Tuning

**Authors:** Vipul Raheja, Dhruv Kumar, Ryan Koo, Dongyeop Kang

## Model Details

### Model Description

- **Language(s) (NLP)**: English
- **Finetuned from model:** google/flan-t5-large

### Model Sources

- **Repository:** https://github.com/vipulraheja/coedit
- **Paper:** [More Information Needed]

## How to use

We make available the models presented in our paper.
| Model | Number of parameters |
| --- | --- |
| CoEdIT-large | 770M |
| CoEdIT-xl | 3B |
| CoEdIT-xxl | 11B |
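The checkpoints can be loaded with the Hugging Face `transformers` library. Below is a minimal sketch for the large model; it assumes the checkpoint is published on the Hub under the `grammarly/coedit-large` ID (verify the exact model ID on the Hub) and that `transformers` and `torch` are installed. CoEdIT expects a natural-language task instruction prepended to the input sentence.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Assumed Hub model ID -- check the repository for the exact identifier.
model_id = "grammarly/coedit-large"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# The task instruction (e.g. grammar correction) is prepended to the text to edit.
input_text = (
    "Fix grammatical errors in this sentence: "
    "When I grow up, I start to understand what he said is quite right."
)

input_ids = tokenizer(input_text, return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_length=256)
edited_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(edited_text)
```

Other editing tasks from the paper (simplification, paraphrasing, formality transfer, etc.) are invoked the same way by changing the instruction prefix.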