---
license: cc-by-4.0
base_model: Helsinki-NLP/opus-mt-tc-big-en-ar
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: opus-mt-tc-big-en-ar-finetuned-franco-to-arabic-3
    results: []
---

# opus-mt-tc-big-en-ar-finetuned-franco-to-arabic-3

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-tc-big-en-ar](https://huggingface.co/Helsinki-NLP/opus-mt-tc-big-en-ar) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 3.0027
- Bleu: 9.3993
- Gen Len: 14.8889
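The checkpoint can be loaded like any Marian-based translation model with `transformers`. A minimal inference sketch follows; the repo id and the example input are assumptions (replace `MODEL_ID` with the actual Hub path of this fine-tuned model), and running it requires downloading the model weights.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Hypothetical repo id -- substitute the real Hub path of this checkpoint.
MODEL_ID = "opus-mt-tc-big-en-ar-finetuned-franco-to-arabic-3"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

# Tokenize a source sentence and generate the Arabic translation.
inputs = tokenizer("example source sentence", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```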

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
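The hyperparameters above map onto a `Seq2SeqTrainingArguments` configuration. The following is a sketch of that mapping, not the author's exact script; `output_dir` and the per-epoch evaluation strategy are assumptions (the results table below evaluates every 21 steps, i.e. once per epoch).

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="opus-mt-tc-big-en-ar-finetuned-franco-to-arabic-3",  # assumed
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default.
    lr_scheduler_type="linear",
    num_train_epochs=25,
    evaluation_strategy="epoch",   # assumption: one eval per epoch
    predict_with_generate=True,    # needed to compute BLEU / Gen Len
)
```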

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu    | Gen Len  |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:--------:|
| No log        | 1.0   | 21   | 8.1233          | 0.0     | 217.5833 |
| No log        | 2.0   | 42   | 7.3522          | 0.1587  | 139.6111 |
| No log        | 3.0   | 63   | 6.7128          | 0.2113  | 63.9722  |
| No log        | 4.0   | 84   | 6.2283          | 0.2611  | 25.7222  |
| No log        | 5.0   | 105  | 5.7868          | 0.2545  | 27.3333  |
| No log        | 6.0   | 126  | 5.4596          | 0.4153  | 25.75    |
| No log        | 7.0   | 147  | 5.1424          | 0.8436  | 12.6944  |
| No log        | 8.0   | 168  | 4.8195          | 0.8365  | 13.3889  |
| No log        | 9.0   | 189  | 4.6181          | 1.2564  | 12.3889  |
| No log        | 10.0  | 210  | 4.3480          | 1.7124  | 12.9722  |
| No log        | 11.0  | 231  | 4.1206          | 3.2404  | 13.4167  |
| No log        | 12.0  | 252  | 3.8982          | 2.4371  | 14.4167  |
| No log        | 13.0  | 273  | 3.7679          | 3.9697  | 14.2778  |
| No log        | 14.0  | 294  | 3.6359          | 4.7946  | 13.8889  |
| No log        | 15.0  | 315  | 3.4940          | 5.9909  | 14.75    |
| No log        | 16.0  | 336  | 3.3826          | 8.193   | 14.5278  |
| No log        | 17.0  | 357  | 3.3084          | 7.5299  | 14.3056  |
| No log        | 18.0  | 378  | 3.2302          | 8.5741  | 14.5833  |
| No log        | 19.0  | 399  | 3.1751          | 5.9148  | 14.8611  |
| No log        | 20.0  | 420  | 3.1242          | 10.389  | 14.3889  |
| No log        | 21.0  | 441  | 3.0618          | 10.5285 | 14.6944  |
| No log        | 22.0  | 462  | 3.0439          | 11.3953 | 14.6667  |
| No log        | 23.0  | 483  | 3.0161          | 10.9885 | 14.8889  |
| 4.3311        | 24.0  | 504  | 3.0184          | 9.6143  | 14.7778  |
| 4.3311        | 25.0  | 525  | 3.0027          | 9.3993  | 14.8889  |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3