---
base_model: samzirbo/mT5.en-es.pretrained
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: baseline.europarl
    results: []
---

# baseline.europarl

This model is a fine-tuned version of [samzirbo/mT5.en-es.pretrained](https://huggingface.co/samzirbo/mT5.en-es.pretrained) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 3.5556
- BLEU: 28.4134
- METEOR: 0.548
- chrF++: 50.7588
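
A minimal inference sketch, assuming the checkpoint is published under a Hub id like `samzirbo/baseline.europarl` (adjust to the actual repo id), and assuming no task prefix is required by the fine-tuning setup:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "samzirbo/baseline.europarl"  # assumption: the actual repo id may differ
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate an English sentence to Spanish with beam search.
inputs = tokenizer("The committee approved the proposal.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```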

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0005
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- training_steps: 50000
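
These settings map roughly onto the following `Seq2SeqTrainingArguments`. This is a hedged reconstruction, not the actual training script (which is not included in this card); dataset, model, and `Seq2SeqTrainer` wiring are omitted, and `output_dir` is illustrative:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="baseline.europarl",       # illustrative
    learning_rate=5e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,                       # Adam betas/epsilon from the list above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    max_steps=50000,
    evaluation_strategy="steps",
    eval_steps=2500,                      # matches the 2500-step cadence in the results table
    predict_with_generate=True,           # assumption: needed to decode outputs for BLEU/METEOR/chrF++
)
```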

### Training results

| Training Loss | Epoch | Step  | Validation Loss | BLEU    | METEOR | chrF++  |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:-------:|
| 3.7368        | 0.06  | 2500  | 3.7330          | 16.6348 | 0.4263 | 38.9851 |
| 1.9501        | 0.11  | 5000  | 3.6988          | 20.5694 | 0.4632 | 42.6268 |
| 1.7374        | 0.17  | 7500  | 3.6554          | 21.8374 | 0.4825 | 44.4332 |
| 1.6252        | 0.23  | 10000 | 3.6674          | 23.1212 | 0.4954 | 45.8093 |
| 1.5421        | 0.28  | 12500 | 3.6407          | 24.144  | 0.5063 | 47.0076 |
| 1.4867        | 0.34  | 15000 | 3.6991          | 23.9884 | 0.5055 | 46.9973 |
| 1.4413        | 0.39  | 17500 | 3.6669          | 25.0356 | 0.5135 | 47.6982 |
| 1.4017        | 0.45  | 20000 | 3.5988          | 25.4766 | 0.5201 | 48.3754 |
| 1.3769        | 0.51  | 22500 | 3.6120          | 26.17   | 0.5295 | 49.037  |
| 1.3476        | 0.56  | 25000 | 3.6225          | 26.8343 | 0.5341 | 49.5501 |
| 1.3252        | 0.62  | 27500 | 3.5913          | 26.7117 | 0.5321 | 49.3981 |
| 1.307         | 0.68  | 30000 | 3.6205          | 27.3269 | 0.5385 | 49.9517 |
| 1.2926        | 0.73  | 32500 | 3.5624          | 27.7597 | 0.5446 | 50.3568 |
| 1.2823        | 0.79  | 35000 | 3.5449          | 27.8457 | 0.5458 | 50.5179 |
| 1.2728        | 0.85  | 37500 | 3.5383          | 27.9605 | 0.5444 | 50.3421 |
| 1.2663        | 0.9   | 40000 | 3.5556          | 28.1962 | 0.5465 | 50.5512 |
| 1.2628        | 0.96  | 42500 | 3.5498          | 28.3077 | 0.5477 | 50.7074 |
| 1.2514        | 1.01  | 45000 | 3.5519          | 28.3543 | 0.548  | 50.7669 |
| 1.239         | 1.07  | 47500 | 3.5555          | 28.2177 | 0.5473 | 50.6922 |
| 1.2357        | 1.13  | 50000 | 3.5556          | 28.4134 | 0.548  | 50.7588 |
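
Metrics of the kind shown in the columns above can be computed with the `evaluate` library; this is a sketch under that assumption, as the exact evaluation code is not part of this card. Note that passing `word_order=2` to the chrF metric is what yields chrF++:

```python
import evaluate

bleu = evaluate.load("sacrebleu")
meteor = evaluate.load("meteor")
chrf = evaluate.load("chrf")

predictions = ["El comité aprobó la propuesta."]   # hypothetical model outputs
references = [["El comité aprobó la propuesta."]]  # one reference list per source sentence

print(bleu.compute(predictions=predictions, references=references)["score"])
print(meteor.compute(predictions=predictions,
                     references=[r[0] for r in references])["meteor"])
print(chrf.compute(predictions=predictions, references=references,
                   word_order=2)["score"])  # word_order=2 -> chrF++
```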

### Framework versions

- Transformers 4.38.0
- PyTorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.15.2