---
datasets:
- abertsch/booksum-fullbooks
pipeline_tag: text2text-generation
---
Model from the preprint Unlimiformer: Long-Range Transformers with Unlimited Length Input.

This model was finetuned from a BART-base checkpoint using the retrieval-augmented training strategy described in Section 3.2 of the paper. It was finetuned on the BookSum dataset in the full-book setting.
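The checkpoint can be loaded with the standard `transformers` seq2seq classes; note that the unlimited-length retrieval mechanism itself requires wrapping the model with the authors' Unlimiformer code at inference time. The sketch below only shows loading the weights and running plain BART-style generation (truncated to the usual 1024-token context); the model id string is a placeholder, not a confirmed repository name.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder: substitute this repository's actual model id on the Hub.
model_id = "<this-model-repo>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Without the Unlimiformer wrapper, inputs are truncated to BART's 1024-token window.
book_text = "..."  # full-book text goes here
inputs = tokenizer(book_text, return_tensors="pt", truncation=True, max_length=1024)

summary_ids = model.generate(**inputs, num_beams=4, max_new_tokens=256)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```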