---
base_model: Helsinki-NLP/opus-mt-en-ar
license: apache-2.0
metrics:
  - bleu
tags:
  - generated_from_trainer
model-index:
  - name: Tounsify-v0.10-shuffle
    results: []
---

Tounsify-v0.10-shuffle

This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-ar on an unspecified dataset. It achieves the following results on the evaluation set (see the usage sketch after the list):

  • Loss: 1.5449
  • Bleu: 35.7955
  • Gen Len: 9.4516
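As a minimal way to try the model, the sketch below loads it through the Transformers translation pipeline. The Hub repository ID `cherifkhalifah/Tounsify-v0.10-shuffle` is an assumption inferred from the uploader and model name; substitute the actual path if it differs.

```python
# Minimal inference sketch for this fine-tuned MarianMT model.
# NOTE: the repo ID below is an assumption; replace it with the real Hub path.
from transformers import pipeline

translator = pipeline(
    "translation",
    model="cherifkhalifah/Tounsify-v0.10-shuffle",  # assumed repository ID
)

result = translator("How are you today?", max_length=64)
print(result[0]["translation_text"])
```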

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding Seq2SeqTrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
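These values map onto `Seq2SeqTrainingArguments` roughly as follows. This is a reconstruction under stated assumptions, not the original training script: the dataset is undocumented, so `train_ds` and `eval_ds` stand in for hypothetical tokenized splits.

```python
# Reconstruction of the training setup from the hyperparameters above.
# `train_ds` / `eval_ds` are hypothetical tokenized splits; this is a
# sketch, not the original training script.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base = "Helsinki-NLP/opus-mt-en-ar"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSeq2SeqLM.from_pretrained(base)

args = Seq2SeqTrainingArguments(
    output_dir="Tounsify-v0.10-shuffle",
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                     # "Native AMP" mixed precision
    adam_beta1=0.9,                # Trainer defaults, shown for completeness
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",   # matches the per-epoch results table
    predict_with_generate=True,    # required for Bleu / Gen Len
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,  # hypothetical tokenized training split
    eval_dataset=eval_ds,    # hypothetical tokenized evaluation split
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```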

Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu    | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| No log        | 1.0   | 62   | 2.2061          | 11.8171 | 9.5161  |
| No log        | 2.0   | 124  | 1.5962          | 21.0419 | 9.3065  |
| No log        | 3.0   | 186  | 1.2980          | 24.4645 | 9.1935  |
| No log        | 4.0   | 248  | 1.2160          | 28.5544 | 9.2742  |
| No log        | 5.0   | 310  | 1.1631          | 31.2379 | 9.1452  |
| No log        | 6.0   | 372  | 1.1556          | 32.39   | 9.2097  |
| No log        | 7.0   | 434  | 1.1650          | 32.8728 | 9.2419  |
| No log        | 8.0   | 496  | 1.1666          | 28.6033 | 9.0645  |
| 0.9718        | 9.0   | 558  | 1.1976          | 25.7388 | 10.0484 |
| 0.9718        | 10.0  | 620  | 1.1648          | 32.5972 | 9.2258  |
| 0.9718        | 11.0  | 682  | 1.1939          | 31.9682 | 9.0806  |
| 0.9718        | 12.0  | 744  | 1.2021          | 33.2574 | 9.371   |
| 0.9718        | 13.0  | 806  | 1.2006          | 32.1413 | 9.2258  |
| 0.9718        | 14.0  | 868  | 1.2208          | 33.0105 | 9.3065  |
| 0.9718        | 15.0  | 930  | 1.2888          | 31.9994 | 9.4194  |
| 0.9718        | 16.0  | 992  | 1.2568          | 33.6234 | 9.3226  |
| 0.0484        | 17.0  | 1054 | 1.2758          | 32.9602 | 9.3226  |
| 0.0484        | 18.0  | 1116 | 1.2841          | 33.2857 | 9.2903  |
| 0.0484        | 19.0  | 1178 | 1.2968          | 32.4006 | 9.2419  |
| 0.0484        | 20.0  | 1240 | 1.3066          | 32.7878 | 9.1935  |
| 0.0484        | 21.0  | 1302 | 1.3192          | 32.1068 | 9.3871  |
| 0.0484        | 22.0  | 1364 | 1.3158          | 32.7501 | 9.3226  |
| 0.0484        | 23.0  | 1426 | 1.3553          | 33.1188 | 9.3065  |
| 0.0484        | 24.0  | 1488 | 1.3182          | 33.9851 | 9.6613  |
| 0.0137        | 25.0  | 1550 | 1.3493          | 32.7566 | 9.3226  |
| 0.0137        | 26.0  | 1612 | 1.3419          | 33.8387 | 9.5     |
| 0.0137        | 27.0  | 1674 | 1.3501          | 32.2899 | 9.371   |
| 0.0137        | 28.0  | 1736 | 1.3520          | 32.1795 | 9.3226  |
| 0.0137        | 29.0  | 1798 | 1.3676          | 33.7723 | 9.5645  |
| 0.0137        | 30.0  | 1860 | 1.3832          | 32.8767 | 9.3548  |
| 0.0137        | 31.0  | 1922 | 1.3814          | 33.3269 | 9.5     |
| 0.0137        | 32.0  | 1984 | 1.3833          | 32.8231 | 9.371   |
| 0.0108        | 33.0  | 2046 | 1.3828          | 32.1068 | 9.4194  |
| 0.0108        | 34.0  | 2108 | 1.3976          | 33.9396 | 9.4677  |
| 0.0108        | 35.0  | 2170 | 1.4015          | 32.1225 | 9.1613  |
| 0.0108        | 36.0  | 2232 | 1.4058          | 32.7627 | 9.371   |
| 0.0108        | 37.0  | 2294 | 1.4213          | 32.1195 | 9.2581  |
| 0.0108        | 38.0  | 2356 | 1.4300          | 33.0876 | 9.4355  |
| 0.0108        | 39.0  | 2418 | 1.4263          | 32.7883 | 9.3387  |
| 0.0108        | 40.0  | 2480 | 1.4390          | 31.3041 | 9.3387  |
| 0.0107        | 41.0  | 2542 | 1.4405          | 33.2307 | 9.3871  |
| 0.0107        | 42.0  | 2604 | 1.4421          | 32.5338 | 9.3871  |
| 0.0107        | 43.0  | 2666 | 1.4617          | 31.5815 | 9.3548  |
| 0.0107        | 44.0  | 2728 | 1.4517          | 32.2336 | 9.3226  |
| 0.0107        | 45.0  | 2790 | 1.4708          | 32.5791 | 9.4194  |
| 0.0107        | 46.0  | 2852 | 1.4665          | 33.5456 | 9.4516  |
| 0.0107        | 47.0  | 2914 | 1.4574          | 32.4045 | 9.3871  |
| 0.0107        | 48.0  | 2976 | 1.4585          | 34.6859 | 9.4677  |
| 0.0099        | 49.0  | 3038 | 1.4733          | 34.7    | 9.4355  |
| 0.0099        | 50.0  | 3100 | 1.4713          | 34.7405 | 9.4032  |
| 0.0099        | 51.0  | 3162 | 1.4740          | 34.6316 | 9.4355  |
| 0.0099        | 52.0  | 3224 | 1.4867          | 35.6172 | 9.4516  |
| 0.0099        | 53.0  | 3286 | 1.4845          | 34.6718 | 9.5     |
| 0.0099        | 54.0  | 3348 | 1.4891          | 35.3407 | 9.4194  |
| 0.0099        | 55.0  | 3410 | 1.4868          | 35.243  | 9.3871  |
| 0.0099        | 56.0  | 3472 | 1.4695          | 35.326  | 9.4839  |
| 0.0069        | 57.0  | 3534 | 1.4851          | 35.3597 | 9.4355  |
| 0.0069        | 58.0  | 3596 | 1.4960          | 34.1363 | 9.4032  |
| 0.0069        | 59.0  | 3658 | 1.4808          | 34.8965 | 9.4677  |
| 0.0069        | 60.0  | 3720 | 1.4891          | 34.7792 | 9.4839  |
| 0.0069        | 61.0  | 3782 | 1.4882          | 34.0964 | 9.4839  |
| 0.0069        | 62.0  | 3844 | 1.4952          | 34.0563 | 9.5     |
| 0.0069        | 63.0  | 3906 | 1.4995          | 35.5974 | 9.4839  |
| 0.0069        | 64.0  | 3968 | 1.5149          | 35.873  | 9.4194  |
| 0.0087        | 65.0  | 4030 | 1.5180          | 35.3407 | 9.4516  |
| 0.0087        | 66.0  | 4092 | 1.5169          | 35.9762 | 9.4677  |
| 0.0087        | 67.0  | 4154 | 1.5101          | 35.8579 | 9.4677  |
| 0.0087        | 68.0  | 4216 | 1.5098          | 35.2643 | 9.4677  |
| 0.0087        | 69.0  | 4278 | 1.5119          | 35.2643 | 9.4677  |
| 0.0087        | 70.0  | 4340 | 1.5109          | 35.0762 | 9.4194  |
| 0.0087        | 71.0  | 4402 | 1.5118          | 35.3965 | 9.4839  |
| 0.0087        | 72.0  | 4464 | 1.5099          | 35.3965 | 9.4839  |
| 0.0056        | 73.0  | 4526 | 1.5249          | 35.2791 | 9.5     |
| 0.0056        | 74.0  | 4588 | 1.5197          | 35.2791 | 9.4677  |
| 0.0056        | 75.0  | 4650 | 1.5288          | 35.2021 | 9.4516  |
| 0.0056        | 76.0  | 4712 | 1.5323          | 35.2021 | 9.4516  |
| 0.0056        | 77.0  | 4774 | 1.5264          | 35.2021 | 9.4516  |
| 0.0056        | 78.0  | 4836 | 1.5266          | 35.2021 | 9.4516  |
| 0.0056        | 79.0  | 4898 | 1.5285          | 35.2021 | 9.4516  |
| 0.0056        | 80.0  | 4960 | 1.5326          | 35.7955 | 9.4516  |
| 0.0058        | 81.0  | 5022 | 1.5339          | 35.7955 | 9.4516  |
| 0.0058        | 82.0  | 5084 | 1.5435          | 35.2021 | 9.4516  |
| 0.0058        | 83.0  | 5146 | 1.5421          | 35.2021 | 9.4516  |
| 0.0058        | 84.0  | 5208 | 1.5441          | 35.7955 | 9.4516  |
| 0.0058        | 85.0  | 5270 | 1.5484          | 35.7955 | 9.4516  |
| 0.0058        | 86.0  | 5332 | 1.5527          | 35.7955 | 9.4516  |
| 0.0058        | 87.0  | 5394 | 1.5497          | 35.7955 | 9.4516  |
| 0.0058        | 88.0  | 5456 | 1.5504          | 35.7955 | 9.4516  |
| 0.0055        | 89.0  | 5518 | 1.5485          | 35.7955 | 9.4516  |
| 0.0055        | 90.0  | 5580 | 1.5484          | 35.7955 | 9.4516  |
| 0.0055        | 91.0  | 5642 | 1.5496          | 35.7955 | 9.4516  |
| 0.0055        | 92.0  | 5704 | 1.5475          | 35.739  | 9.4516  |
| 0.0055        | 93.0  | 5766 | 1.5438          | 35.739  | 9.4516  |
| 0.0055        | 94.0  | 5828 | 1.5464          | 35.739  | 9.4516  |
| 0.0055        | 95.0  | 5890 | 1.5461          | 35.739  | 9.4516  |
| 0.0055        | 96.0  | 5952 | 1.5467          | 35.7955 | 9.4516  |
| 0.0045        | 97.0  | 6014 | 1.5452          | 35.7955 | 9.4516  |
| 0.0045        | 98.0  | 6076 | 1.5449          | 35.7955 | 9.4516  |
| 0.0045        | 99.0  | 6138 | 1.5449          | 35.7955 | 9.4516  |
| 0.0045        | 100.0 | 6200 | 1.5449          | 35.7955 | 9.4516  |
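The Bleu and Gen Len columns are the kind of metrics the standard Transformers translation examples compute with sacreBLEU through the `evaluate` library. Since the actual training script is not part of this card, the `compute_metrics` function below is an assumed sketch of that convention.

```python
# Sketch of the compute_metrics function that typically produces the
# "Bleu" and "Gen Len" columns in Trainer logs (assumed; the original
# training script is not included in this card).
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-ar")
bleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    # Labels use -100 for padding; swap in the pad token before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    score = bleu.compute(
        predictions=decoded_preds,
        references=[[ref] for ref in decoded_labels],
    )["score"]
    # "Gen Len": average number of non-pad tokens in the generated outputs.
    gen_len = np.mean(
        [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in preds]
    )
    return {"bleu": score, "gen_len": gen_len}
```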

Framework versions

  • Transformers 4.41.2
  • PyTorch 2.3.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1
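To reproduce the environment, the Python packages can be pinned directly, e.g. `pip install transformers==4.41.2 datasets==2.20.0 tokenizers==0.19.1`, together with a PyTorch 2.3.0 / CUDA 12.1 build (the `+cu121` wheels are distributed through the PyTorch package index rather than PyPI).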