---
datasets:
  - ccdv/govreport-summarization
  - urialon/gov_report_validation
  - urialon/gov_report_test
pipeline_tag: text2text-generation
inference: false
---

Model from the preprint [*Unlimiformer: Long-Range Transformers with Unlimited Length Input*](https://arxiv.org/abs/2305.01625).

This is a BART-base model finetuned using Unlimiformer-aware early stopping, as described in Section 3.1 of the paper. The model was finetuned on GovReport using the data processing pipeline from SLED; to load the validation or test set for use with this model, please use the datasets `urialon/gov_report_validation` and `urialon/gov_report_test`, as shown below.
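For example, assuming the Hugging Face `datasets` library is installed, a minimal sketch of loading these evaluation sets:

```python
from datasets import load_dataset

# Load the SLED-processed GovReport evaluation sets released alongside Unlimiformer.
validation = load_dataset("urialon/gov_report_validation")
test = load_dataset("urialon/gov_report_test")

print(validation)  # inspect the available splits and columns
```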

This model is generally weaker than the alternating-training model and stronger than the baseline.

The inference demo is disabled because you must add the Unlimiformer files to your repo before this model can handle unlimited-length input! See the [Unlimiformer GitHub](https://github.com/abertsch72/unlimiformer) for setup instructions.
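As a rough sketch of local use: the checkpoint itself loads like any BART model via `transformers`, and the Unlimiformer wrapper from the GitHub repo is applied on top. The model ID below is an assumption (substitute this repository's actual path), and the commented-out wrapper call is an assumption about the Unlimiformer repo's API; consult its README for the exact usage.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repo id for this checkpoint; replace with this repository's actual path.
model_id = "abertsch/unlimiformer-bart-govreport-earlyk"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# The checkpoint alone behaves as an ordinary BART-base. To handle
# unlimited-length input, wrap it with the Unlimiformer code from the
# project's GitHub repo. The lines below are assumptions about that API:
# from unlimiformer import Unlimiformer
# model = Unlimiformer.convert_model(model)
```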