---
base_model: samzirbo/mT5.en-es.pretrained
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: gendered_man
    results: []
---

# gendered_man

This model is a fine-tuned version of samzirbo/mT5.en-es.pretrained on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.1748
- BLEU: 43.9515
- METEOR: 0.6913
- chrF++: 62.7752
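
To make the headline metric concrete, the following is a simplified, self-contained sketch of corpus-level BLEU (clipped n-gram precisions up to 4-grams plus a brevity penalty). It assumes one reference per hypothesis and applies no smoothing, so its scores will not match sacrebleu or the evaluation pipeline used for this card exactly:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # All contiguous n-grams of a token list.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def corpus_bleu(references, hypotheses, max_n=4):
    """Simplified corpus BLEU on a 0-100 scale.

    `references` and `hypotheses` are parallel lists of token lists,
    with a single reference per hypothesis.
    """
    clipped = [0] * max_n  # clipped n-gram matches, per order
    totals = [0] * max_n   # hypothesis n-gram counts, per order
    ref_len = hyp_len = 0
    for ref, hyp in zip(references, hypotheses):
        ref_len += len(ref)
        hyp_len += len(hyp)
        for n in range(1, max_n + 1):
            hyp_counts = Counter(ngrams(hyp, n))
            ref_counts = Counter(ngrams(ref, n))
            totals[n - 1] += sum(hyp_counts.values())
            # Clip each hypothesis n-gram count by its count in the reference.
            clipped[n - 1] += sum(min(c, ref_counts[g])
                                  for g, c in hyp_counts.items())
    if min(totals) == 0 or min(clipped) == 0:
        return 0.0  # no smoothing: any empty precision zeroes the score
    log_prec = sum(math.log(c / t) for c, t in zip(clipped, totals)) / max_n
    # Brevity penalty discourages hypotheses shorter than the references.
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / max(hyp_len, 1))
    return 100 * bp * math.exp(log_prec)
```

A perfect match scores 100; a single substituted word lowers every n-gram precision and pulls the score below that.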

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0005
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- training_steps: 50000
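
For intuition, the scheduler settings above (cosine decay, 1000 warmup steps, 50000 total steps, peak learning rate 0.0005) can be sketched in plain Python. This is a minimal sketch of the usual linear-warmup-then-cosine-decay shape, not the Trainer's exact implementation:

```python
import math

PEAK_LR = 5e-4        # learning_rate from the card
WARMUP_STEPS = 1000   # lr_scheduler_warmup_steps
TOTAL_STEPS = 50_000  # training_steps

def lr_at(step: int) -> float:
    # Linear warmup from 0 to the peak learning rate...
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    # ...then cosine decay from the peak down to 0 over the remaining steps.
    progress = (step - WARMUP_STEPS) / (TOTAL_STEPS - WARMUP_STEPS)
    return PEAK_LR * 0.5 * (1.0 + math.cos(math.pi * progress))
```

Under this schedule the learning rate peaks at step 1000 and has decayed to half the peak at the midpoint of the decay phase (step 25500).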

### Training results

| Training Loss | Epoch | Step  | Validation Loss | BLEU    | METEOR | chrF++  |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:-------:|
| 4.3337        | 0.26  | 2500  | 2.0225          | 27.9199 | 0.5539 | 49.0346 |
| 2.4254        | 0.53  | 5000  | 1.7244          | 33.3111 | 0.6023 | 54.0561 |
| 2.1742        | 0.79  | 7500  | 1.5831          | 35.9048 | 0.6263 | 56.4407 |
| 2.027         | 1.05  | 10000 | 1.5000          | 37.665  | 0.6418 | 57.9131 |
| 1.9113        | 1.32  | 12500 | 1.4375          | 38.1525 | 0.6462 | 58.1999 |
| 1.8536        | 1.58  | 15000 | 1.3846          | 39.6031 | 0.659  | 59.4201 |
| 1.8003        | 1.84  | 17500 | 1.3468          | 40.1249 | 0.6625 | 59.8142 |
| 1.7367        | 2.11  | 20000 | 1.3078          | 41.024  | 0.6678 | 60.4354 |
| 1.6845        | 2.37  | 22500 | 1.2849          | 41.5369 | 0.6721 | 60.9458 |
| 1.6609        | 2.64  | 25000 | 1.2589          | 41.9956 | 0.676  | 61.2321 |
| 1.6362        | 2.9   | 27500 | 1.2358          | 42.5127 | 0.6817 | 61.7016 |
| 1.5945        | 3.16  | 30000 | 1.2218          | 42.7584 | 0.6816 | 61.8378 |
| 1.5638        | 3.43  | 32500 | 1.2088          | 42.9875 | 0.6844 | 62.0687 |
| 1.5557        | 3.69  | 35000 | 1.1983          | 43.3948 | 0.6862 | 62.2546 |
| 1.5469        | 3.95  | 37500 | 1.1864          | 43.5866 | 0.6877 | 62.5001 |
| 1.5151        | 4.22  | 40000 | 1.1826          | 43.5886 | 0.6888 | 62.501  |
| 1.5081        | 4.48  | 42500 | 1.1783          | 43.8291 | 0.6898 | 62.6281 |
| 1.5035        | 4.74  | 45000 | 1.1764          | 43.7796 | 0.6904 | 62.6856 |
| 1.4997        | 5.01  | 47500 | 1.1752          | 43.8886 | 0.6908 | 62.7218 |
| 1.4894        | 5.27  | 50000 | 1.1748          | 43.9515 | 0.6913 | 62.7752 |

### Framework versions

- Transformers 4.38.0
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.15.2