XLM-EusBERTa-topic-classification

This model is a version of ClassCat/roberta-small-basque fine-tuned for topic classification on the basque_glue dataset. It achieves the following results on the evaluation set:

  • Loss: 4.2158
  • Accuracy: 0.6494
  • F1: 0.6433
  • Precision: 0.6447
  • Recall: 0.6494
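
The snippet below is a minimal usage sketch, assuming the checkpoint is available on the Hugging Face Hub as IParraMartin/XLM-EusBERTa-topic-classification; the example headline and the predicted label depend on the label mapping stored in the model's config.

```python
from transformers import pipeline

# Hub repo id as listed for this card; replace with a local path if needed.
model_id = "IParraMartin/XLM-EusBERTa-topic-classification"

classifier = pipeline("text-classification", model=model_id)

# Example Basque headline (illustrative input only).
print(classifier("Athleticek 2-0 irabazi du San Mamesen."))
# -> [{'label': ..., 'score': ...}]
```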

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 20
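
As an illustration only, the hyperparameters above map onto a transformers TrainingArguments configuration roughly as sketched below; the actual training script is not part of this card, and values not listed above (e.g. output_dir, the per-epoch evaluation strategy implied by the results table) are assumptions.

```python
from transformers import TrainingArguments

# Sketch reproducing the listed hyperparameters; output_dir and
# evaluation_strategy are placeholders, not taken from the original run.
training_args = TrainingArguments(
    output_dir="xlm-eusberta-topic-classification",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=20,
    evaluation_strategy="epoch",
)
```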

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.2439        | 1.0   | 1074  | 1.1310          | 0.6581   | 0.6316 | 0.6139    | 0.6581 |
| 0.9539        | 2.0   | 2148  | 1.3019          | 0.6117   | 0.6034 | 0.6465    | 0.6117 |
| 0.579         | 3.0   | 3222  | 1.5533          | 0.6645   | 0.6524 | 0.6661    | 0.6645 |
| 0.3766        | 4.0   | 4296  | 2.3287          | 0.6381   | 0.6283 | 0.6590    | 0.6381 |
| 0.2641        | 5.0   | 5370  | 2.2805          | 0.6597   | 0.6515 | 0.6707    | 0.6597 |
| 0.1707        | 6.0   | 6444  | 2.6621          | 0.6397   | 0.6399 | 0.6581    | 0.6397 |
| 0.1537        | 7.0   | 7518  | 2.9116          | 0.6408   | 0.6336 | 0.6452    | 0.6408 |
| 0.0867        | 8.0   | 8592  | 3.1775          | 0.6344   | 0.6337 | 0.6531    | 0.6344 |
| 0.0779        | 9.0   | 9666  | 3.2514          | 0.6543   | 0.6471 | 0.6593    | 0.6543 |
| 0.0587        | 10.0  | 10740 | 3.3244          | 0.6457   | 0.6424 | 0.6488    | 0.6457 |
| 0.0322        | 11.0  | 11814 | 3.8090          | 0.6214   | 0.6244 | 0.6488    | 0.6214 |
| 0.0139        | 12.0  | 12888 | 3.8642          | 0.6247   | 0.6176 | 0.6424    | 0.6247 |
| 0.0256        | 13.0  | 13962 | 3.8734          | 0.6419   | 0.6327 | 0.6398    | 0.6419 |
| 0.0046        | 14.0  | 15036 | 4.0934          | 0.6365   | 0.6330 | 0.6463    | 0.6365 |
| 0.0036        | 15.0  | 16110 | 4.0890          | 0.6484   | 0.6416 | 0.6469    | 0.6484 |
| 0.0023        | 16.0  | 17184 | 4.0978          | 0.6505   | 0.6440 | 0.6470    | 0.6505 |
| 0.0008        | 17.0  | 18258 | 4.1709          | 0.6478   | 0.6418 | 0.6449    | 0.6478 |
| 0.0014        | 18.0  | 19332 | 4.1715          | 0.6505   | 0.6446 | 0.6458    | 0.6505 |
| 0.0007        | 19.0  | 20406 | 4.2158          | 0.6489   | 0.6427 | 0.6443    | 0.6489 |
| 0.0039        | 20.0  | 21480 | 4.2158          | 0.6494   | 0.6433 | 0.6447    | 0.6494 |
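
The card does not state how the multi-class F1, precision, and recall were averaged; recall matching accuracy in every row is consistent with weighted averaging. A hedged sketch of a compute_metrics function that would produce rows like those above when passed to a transformers Trainer:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair supplied by the Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # "weighted" averaging is an assumption; the card does not specify it.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```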

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.16.0
  • Tokenizers 0.15.0