---
license: cc-by-4.0
base_model: Helsinki-NLP/opus-mt-tc-big-en-ar
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: opus-mt-tc-big-en-ar-finetuned-franco-to-arabic-3
    results: []
---

# opus-mt-tc-big-en-ar-finetuned-franco-to-arabic-3

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-tc-big-en-ar](https://huggingface.co/Helsinki-NLP/opus-mt-tc-big-en-ar) on an unknown dataset.
It achieves the following results on the evaluation set:

- Loss: 3.8103
- Bleu: 8.7335
- Gen Len: 23.16
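As a usage sketch (the Hub id below is an assumption built from this card's title and author; adjust it to the actual repository path), the checkpoint can be loaded with the standard `transformers` seq2seq API:

```python
def translate(text: str,
              model_name: str = "mohamedtolba/opus-mt-tc-big-en-ar-finetuned-franco-to-arabic-3") -> str:
    """Generate an Arabic translation for one input string.

    NOTE: the default model_name is an assumed Hub id; the first call
    downloads the model weights, so imports are kept lazy.
    """
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Given the model name, the intended input is presumably Franco-Arabic (Arabizi) text rather than plain English, but the training data is not documented here.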

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
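No warmup steps are listed, so with the linear scheduler the learning rate presumably decays from 2e-05 to zero over the 350 training steps (14 steps/epoch × 25 epochs, matching the results table below). A minimal sketch of that schedule, assuming zero warmup:

```python
def linear_lr(step: int, total_steps: int = 350, base_lr: float = 2e-05,
              warmup_steps: int = 0) -> float:
    """transformers-style linear schedule: linear warmup (here none),
    then linear decay from base_lr down to zero at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Learning rate at the start, midpoint, and end of training.
print(linear_lr(0), linear_lr(175), linear_lr(350))  # 2e-05, 1e-05, 0.0
```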

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu   | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| No log        | 1.0   | 14   | 8.4807          | 0.0    | 403.72  |
| No log        | 2.0   | 28   | 7.7371          | 0.0    | 189.68  |
| No log        | 3.0   | 42   | 7.2123          | 0.1961 | 80.52   |
| No log        | 4.0   | 56   | 6.8014          | 0.1764 | 100.08  |
| No log        | 5.0   | 70   | 6.4285          | 0.2171 | 54.96   |
| No log        | 6.0   | 84   | 6.1286          | 0.209  | 88.92   |
| No log        | 7.0   | 98   | 5.8323          | 0.2334 | 26.0    |
| No log        | 8.0   | 112  | 5.6044          | 0.2352 | 25.6    |
| No log        | 9.0   | 126  | 5.3873          | 0.2435 | 25.84   |
| No log        | 10.0  | 140  | 5.2068          | 0.2459 | 24.48   |
| No log        | 11.0  | 154  | 5.0464          | 0.6902 | 23.68   |
| No log        | 12.0  | 168  | 4.8825          | 0.7391 | 24.4    |
| No log        | 13.0  | 182  | 4.7366          | 1.185  | 22.68   |
| No log        | 14.0  | 196  | 4.5743          | 1.3994 | 23.6    |
| No log        | 15.0  | 210  | 4.4676          | 1.9653 | 23.84   |
| No log        | 16.0  | 224  | 4.3406          | 1.9566 | 23.88   |
| No log        | 17.0  | 238  | 4.2305          | 2.1215 | 22.88   |
| No log        | 18.0  | 252  | 4.1240          | 4.4593 | 22.28   |
| No log        | 19.0  | 266  | 4.0583          | 3.2999 | 22.52   |
| No log        | 20.0  | 280  | 3.9704          | 7.1094 | 23.12   |
| No log        | 21.0  | 294  | 3.9151          | 7.3081 | 23.12   |
| No log        | 22.0  | 308  | 3.8737          | 7.2132 | 23.72   |
| No log        | 23.0  | 322  | 3.8393          | 9.0724 | 22.92   |
| No log        | 24.0  | 336  | 3.8173          | 8.8528 | 23.0    |
| No log        | 25.0  | 350  | 3.8103          | 8.7335 | 23.16   |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3