# xlm-roberta-longformer-base-16384
⚠️ This repository contains the PyTorch version of hyperonym/xlm-roberta-longformer-base-16384, without any modifications.
xlm-roberta-longformer is a multilingual Longformer initialized with XLM-RoBERTa's weights, without any further pretraining. It is intended to be fine-tuned on a downstream task.
The notebook for replicating the model is available on GitHub: https://github.com/hyperonym/dirge/blob/master/models/xlm-roberta-longformer/convert.ipynb
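Below is a minimal sketch of how the checkpoint might be loaded for fine-tuning with 🤗 Transformers. The model id, the sequence-classification head, and the label count are illustrative assumptions, not part of this repository; substitute your own task head and settings.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Illustrative model id -- replace with the id of this PyTorch checkpoint on the Hub.
model_id = "hyperonym/xlm-roberta-longformer-base-16384"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# The base weights ship without a task head, so a randomly initialized
# classification head is added here for downstream fine-tuning.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer(
    "A long multilingual document ...",
    truncation=True,
    max_length=16384,  # the extended context length this model supports
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2])
```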