
# xlm-roberta-large-cantemist

This model is a fine-tuned version of xlm-roberta-large on the CANTEMIST dataset, used as a benchmark in the paper TODO. The model achieves an F1 score of 0.904.

For more information, please refer to the original publication: TODO LINK
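
The card does not include a usage snippet, so here is a minimal sketch using the transformers pipeline API. It assumes the checkpoint is a token-classification (NER) model, which matches the standard CANTEMIST tumour-morphology task; the example sentence and the aggregation strategy are illustrative.

```python
from transformers import pipeline

# Minimal usage sketch. Assumes the checkpoint is published as a
# token-classification model under the IIC/xlm-roberta-large-cantemist ID.
ner = pipeline(
    "token-classification",
    model="IIC/xlm-roberta-large-cantemist",
    aggregation_strategy="simple",  # merge subword pieces into full entity spans
)

# CANTEMIST consists of Spanish oncology clinical cases, so the input is Spanish.
text = "El paciente fue diagnosticado de carcinoma epidermoide de pulmón."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```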

## Parameters used

| Parameter               | Value |
|-------------------------|-------|
| batch size              | 16    |
| learning rate           | 2e-5  |
| classifier dropout      | 0.1   |
| warmup ratio            | 0     |
| warmup steps            | 0     |
| weight decay            | 0     |
| optimizer               | AdamW |
| epochs                  | 10    |
| early stopping patience | 3     |
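
The card does not say which training stack produced these values; below is a minimal sketch, assuming the Hugging Face Trainer API, of how the table maps onto TrainingArguments. The output directory, evaluation/save strategies, and best-model metric are assumptions not stated in the card.

```python
from transformers import AutoConfig, TrainingArguments, EarlyStoppingCallback

# Classifier dropout is a model-config setting rather than a training argument
# (assumption: the XLM-RoBERTa `classifier_dropout` field is what the card means).
config = AutoConfig.from_pretrained("xlm-roberta-large", classifier_dropout=0.1)

# Hyperparameters from the table above; everything not in the table is an assumption.
training_args = TrainingArguments(
    output_dir="xlm-roberta-large-cantemist",  # hypothetical output path
    per_device_train_batch_size=16,
    learning_rate=2e-5,
    warmup_ratio=0.0,
    warmup_steps=0,
    weight_decay=0.0,
    optim="adamw_torch",                       # AdamW optimizer
    num_train_epochs=10,
    evaluation_strategy="epoch",               # assumed; needed for early stopping
    save_strategy="epoch",                     # assumed
    load_best_model_at_end=True,               # assumed; required by EarlyStoppingCallback
    metric_for_best_model="f1",                # assumed; the card reports F1
)

# Early stopping with patience 3, passed to Trainer via callbacks=[early_stopping].
early_stopping = EarlyStoppingCallback(early_stopping_patience=3)
```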

## BibTeX entry and citation info

TODO