---
license: mit
base_model: nielsr/lilt-xlm-roberta-base
tags:
  - generated_from_trainer
datasets:
  - xfun
metrics:
  - precision
  - recall
  - f1
model-index:
  - name: checkpoints
    results: []
---

# checkpoints

This model is a fine-tuned version of [nielsr/lilt-xlm-roberta-base](https://huggingface.co/nielsr/lilt-xlm-roberta-base) on the xfun dataset. It achieves the following results on the evaluation set:

- Precision: 0.3126
- Recall: 0.6777
- F1: 0.4278
- Loss: 0.5651
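
For reference, the reported F1 is consistent with the harmonic mean of the precision and recall above:

$$\mathrm{F1} = \frac{2PR}{P + R} = \frac{2 \times 0.3126 \times 0.6777}{0.3126 + 0.6777} \approx 0.4278$$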

## Model description

More information needed

## Intended uses & limitations

More information needed
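
Judging from the repo name and the xfun training data, the checkpoint appears aimed at relation extraction (key-value linking) on French form-like documents. The relation-extraction head that produced the results above is a custom module rather than a class shipped with vanilla Transformers, so the sketch below only loads the LiLT encoder to obtain layout-aware hidden states; the repo id `kavg/LiLT-RE-FR`, the words, and the bounding boxes are illustrative assumptions.

```python
import torch
from transformers import AutoTokenizer, LiltModel

# Sketch: load only the LiLT encoder from this checkpoint (assumed repo id).
# The relation-extraction head used for the reported results is a custom
# module and is NOT restored by LiltModel.
tokenizer = AutoTokenizer.from_pretrained("kavg/LiLT-RE-FR")
model = LiltModel.from_pretrained("kavg/LiLT-RE-FR")

# Illustrative French form snippet: one box per word, normalized to 0-1000.
words = ["Nom", ":", "Dupont"]
word_boxes = [[57, 82, 110, 98], [112, 82, 118, 98], [124, 82, 190, 98]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# LiLT expects one bounding box per token, so expand word boxes to subwords;
# special tokens get a dummy [0, 0, 0, 0] box.
bbox = [
    [0, 0, 0, 0] if idx is None else word_boxes[idx]
    for idx in encoding.word_ids()
]
encoding["bbox"] = torch.tensor([bbox])

with torch.no_grad():
    outputs = model(**encoding)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```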

## Training and evaluation data

More information needed
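
The card only names xfun, the multilingual extension of FUNSD; given the FR suffix in the repo name, the French split is the likely target. One possible way to pull it with the `datasets` library, assuming the community `nielsr/XFUN` loading script (both the identifier and the `xfun.fr` config name are assumptions, not confirmed by this card):

```python
from datasets import load_dataset

# Assumed dataset script and config; adjust if training used a different source.
dataset = load_dataset("nielsr/XFUN", "xfun.fr", trust_remote_code=True)
print(dataset)
```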

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 10000
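
For readers who want to replicate the schedule, these map onto Transformers `TrainingArguments` roughly as follows. A sketch, not the original script: `output_dir` is taken from the model-index name, and the 500-step evaluation cadence is inferred from the results table below; Adam betas/epsilon are left at their defaults, which match the reported optimizer settings.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="checkpoints",      # model-index name; assumed output dir
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=10000,
    evaluation_strategy="steps",   # evals every 500 steps per the table below
    eval_steps=500,
)
```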

### Training results

| Training Loss | Epoch  | Step  | F1     | Validation Loss | Precision | Recall |
|:-------------:|:------:|:-----:|:------:|:---------------:|:---------:|:------:|
| 0.2058        | 19.23  | 500   | 0      | 0.2763          | 0         | 0      |
| 0.145         | 38.46  | 1000  | 0.0623 | 0.2325          | 0.2889    | 0.0349 |
| 0.1441        | 57.69  | 1500  | 0.1232 | 0.2306          | 0.2616    | 0.0806 |
| 0.0902        | 76.92  | 2000  | 0.2645 | 0.2439          | 0.2526    | 0.2775 |
| 0.0768        | 96.15  | 2500  | 0.3176 | 0.3033          | 0.2440    | 0.4548 |
| 0.0707        | 115.38 | 3000  | 0.3472 | 0.3333          | 0.2778    | 0.4628 |
| 0.0649        | 134.62 | 3500  | 0.3509 | 0.3677          | 0.2629    | 0.5273 |
| 0.0257        | 153.85 | 4000  | 0.3705 | 0.4219          | 0.2810    | 0.5434 |
| 0.054         | 173.08 | 4500  | 0.3699 | 0.4440          | 0.2729    | 0.5739 |
| 0.0368        | 192.31 | 5000  | 0.3942 | 0.4843          | 0.3005    | 0.5730 |
| 0.0326        | 211.54 | 5500  | 0.3968 | 0.4651          | 0.2952    | 0.6052 |
| 0.0412        | 230.77 | 6000  | 0.4100 | 0.5386          | 0.3018    | 0.6392 |
| 0.0603        | 250.0  | 6500  | 0.4189 | 0.4957          | 0.3068    | 0.6598 |
| 0.0215        | 269.23 | 7000  | 0.4127 | 0.4768          | 0.2999    | 0.6616 |
| 0.0233        | 288.46 | 7500  | 0.4284 | 0.5245          | 0.3183    | 0.6553 |
| 0.0212        | 307.69 | 8000  | 0.4259 | 0.5424          | 0.3091    | 0.6849 |
| 0.0152        | 326.92 | 8500  | 0.4206 | 0.5655          | 0.3073    | 0.6661 |
| 0.0147        | 346.15 | 9000  | 0.4260 | 0.5630          | 0.3123    | 0.6697 |
| 0.0205        | 365.38 | 9500  | 0.4321 | 0.5389          | 0.3174    | 0.6768 |
| 0.0115        | 384.62 | 10000 | 0.4278 | 0.5651          | 0.3126    | 0.6777 |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1