---
license: apache-2.0
base_model: Helsinki-NLP/opus-mt-en-es
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: opus-mt-en-es-finetuned-es-to-pbb-v0.1
    results: []
---

# opus-mt-en-es-finetuned-es-to-pbb-v0.1

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-es](https://huggingface.co/Helsinki-NLP/opus-mt-en-es) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.4966
- Bleu: 4.7931
- Gen Len: 75.4033
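
Usage is not documented yet, so here is a minimal inference sketch. The Hub id `mekjr1/opus-mt-en-es-finetuned-es-to-pbb-v0.1` is an assumption based on the card name; substitute a local checkpoint path if the model is not published under that id.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed Hub id; replace with a local checkpoint path if needed.
model_id = "mekjr1/opus-mt-en-es-finetuned-es-to-pbb-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate a Spanish source sentence into the target language.
inputs = tokenizer("Hola, ¿cómo estás?", return_tensors="pt")
outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```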

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a matching configuration sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
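
These settings map onto `Seq2SeqTrainingArguments` roughly as sketched below. This is not the exact training script: `output_dir` and the per-epoch evaluation strategy are assumptions (the latter inferred from the per-epoch validation rows in the results table); the remaining values mirror the list above.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="opus-mt-en-es-finetuned-es-to-pbb-v0.1",  # assumed
    learning_rate=2e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,               # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumed: validation is reported per epoch
    predict_with_generate=True,   # required to report Bleu / Gen Len
)
```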

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:----:|:-------:|
| No log | 1.0 | 194 | 2.5190 | 0.4261 | 118.0446 |
| No log | 2.0 | 388 | 2.1735 | 0.6863 | 94.2307 |
| 2.6561 | 3.0 | 582 | 2.0123 | 0.8611 | 95.4628 |
| 2.6561 | 4.0 | 776 | 1.9069 | 1.164 | 94.7917 |
| 2.6561 | 5.0 | 970 | 1.8303 | 1.3784 | 91.8423 |
| 1.8552 | 6.0 | 1164 | 1.7642 | 1.9842 | 85.5714 |
| 1.8552 | 7.0 | 1358 | 1.7195 | 2.0005 | 89.3467 |
| 1.6609 | 8.0 | 1552 | 1.6790 | 2.0224 | 85.5595 |
| 1.6609 | 9.0 | 1746 | 1.6490 | 2.3548 | 86.1726 |
| 1.6609 | 10.0 | 1940 | 1.6269 | 2.3567 | 86.1652 |
| 1.5416 | 11.0 | 2134 | 1.5998 | 2.8122 | 84.0179 |
| 1.5416 | 12.0 | 2328 | 1.5780 | 2.5282 | 83.4792 |
| 1.449 | 13.0 | 2522 | 1.5585 | 2.6159 | 82.869 |
| 1.449 | 14.0 | 2716 | 1.5372 | 2.8756 | 81.6518 |
| 1.449 | 15.0 | 2910 | 1.5227 | 3.0051 | 80.8259 |
| 1.3724 | 16.0 | 3104 | 1.5121 | 3.1957 | 79.6518 |
| 1.3724 | 17.0 | 3298 | 1.5006 | 2.847 | 79.2798 |
| 1.3724 | 18.0 | 3492 | 1.4927 | 3.2975 | 77.375 |
| 1.3134 | 19.0 | 3686 | 1.4786 | 3.3924 | 76.744 |
| 1.3134 | 20.0 | 3880 | 1.4698 | 3.5146 | 78.2173 |
| 1.2583 | 21.0 | 4074 | 1.4638 | 3.2835 | 79.1548 |
| 1.2583 | 22.0 | 4268 | 1.4532 | 3.2862 | 78.3363 |
| 1.2583 | 23.0 | 4462 | 1.4521 | 3.5943 | 79.0923 |
| 1.2111 | 24.0 | 4656 | 1.4458 | 3.856 | 77.4092 |
| 1.2111 | 25.0 | 4850 | 1.4426 | 3.6296 | 77.9256 |
| 1.1664 | 26.0 | 5044 | 1.4384 | 3.6092 | 76.442 |
| 1.1664 | 27.0 | 5238 | 1.4342 | 3.7057 | 79.0357 |
| 1.1664 | 28.0 | 5432 | 1.4290 | 3.5534 | 77.8452 |
| 1.1259 | 29.0 | 5626 | 1.4323 | 3.8192 | 77.8304 |
| 1.1259 | 30.0 | 5820 | 1.4258 | 4.0245 | 76.2872 |
| 1.091 | 31.0 | 6014 | 1.4258 | 3.9815 | 75.0164 |
| 1.091 | 32.0 | 6208 | 1.4252 | 3.8806 | 78.3289 |
| 1.091 | 33.0 | 6402 | 1.4252 | 4.0585 | 76.9896 |
| 1.0555 | 34.0 | 6596 | 1.4213 | 4.1074 | 75.9777 |
| 1.0555 | 35.0 | 6790 | 1.4274 | 3.9179 | 79.1533 |
| 1.0555 | 36.0 | 6984 | 1.4220 | 3.8599 | 76.4717 |
| 1.0273 | 37.0 | 7178 | 1.4253 | 4.1578 | 77.6101 |
| 1.0273 | 38.0 | 7372 | 1.4204 | 4.0983 | 78.497 |
| 0.9949 | 39.0 | 7566 | 1.4280 | 4.2085 | 76.6057 |
| 0.9949 | 40.0 | 7760 | 1.4207 | 4.0804 | 75.0729 |
| 0.9949 | 41.0 | 7954 | 1.4234 | 4.1249 | 76.7411 |
| 0.9687 | 42.0 | 8148 | 1.4243 | 4.2974 | 76.2188 |
| 0.9687 | 43.0 | 8342 | 1.4298 | 4.417 | 76.4955 |
| 0.9432 | 44.0 | 8536 | 1.4241 | 4.2923 | 77.0759 |
| 0.9432 | 45.0 | 8730 | 1.4292 | 4.2664 | 77.5982 |
| 0.9432 | 46.0 | 8924 | 1.4316 | 4.2662 | 75.2708 |
| 0.9203 | 47.0 | 9118 | 1.4273 | 4.3311 | 74.3408 |
| 0.9203 | 48.0 | 9312 | 1.4265 | 4.4701 | 76.0432 |
| 0.8967 | 49.0 | 9506 | 1.4335 | 4.5713 | 76.7872 |
| 0.8967 | 50.0 | 9700 | 1.4336 | 4.5226 | 76.9926 |
| 0.8967 | 51.0 | 9894 | 1.4335 | 4.4275 | 77.7232 |
| 0.8732 | 52.0 | 10088 | 1.4416 | 4.5138 | 77.2589 |
| 0.8732 | 53.0 | 10282 | 1.4412 | 4.5469 | 76.0491 |
| 0.8732 | 54.0 | 10476 | 1.4347 | 4.4204 | 74.4568 |
| 0.8563 | 55.0 | 10670 | 1.4396 | 4.2991 | 77.0491 |
| 0.8563 | 56.0 | 10864 | 1.4448 | 4.5678 | 76.7768 |
| 0.8368 | 57.0 | 11058 | 1.4468 | 4.5362 | 76.1518 |
| 0.8368 | 58.0 | 11252 | 1.4487 | 4.5456 | 76.0923 |
| 0.8368 | 59.0 | 11446 | 1.4517 | 4.6951 | 76.692 |
| 0.8187 | 60.0 | 11640 | 1.4501 | 4.6062 | 75.753 |
| 0.8187 | 61.0 | 11834 | 1.4552 | 4.466 | 75.5193 |
| 0.8031 | 62.0 | 12028 | 1.4547 | 4.6685 | 75.8155 |
| 0.8031 | 63.0 | 12222 | 1.4593 | 4.6206 | 75.0625 |
| 0.8031 | 64.0 | 12416 | 1.4570 | 4.7326 | 75.7783 |
| 0.7885 | 65.0 | 12610 | 1.4586 | 4.6804 | 75.5774 |
| 0.7885 | 66.0 | 12804 | 1.4661 | 4.483 | 76.503 |
| 0.7885 | 67.0 | 12998 | 1.4630 | 4.8575 | 76.1146 |
| 0.7749 | 68.0 | 13192 | 1.4654 | 4.8867 | 75.9524 |
| 0.7749 | 69.0 | 13386 | 1.4713 | 4.8378 | 76.4152 |
| 0.7607 | 70.0 | 13580 | 1.4659 | 4.7737 | 77.058 |
| 0.7607 | 71.0 | 13774 | 1.4740 | 4.8789 | 76.3438 |
| 0.7607 | 72.0 | 13968 | 1.4738 | 4.7456 | 75.9554 |
| 0.7494 | 73.0 | 14162 | 1.4733 | 4.8289 | 75.811 |
| 0.7494 | 74.0 | 14356 | 1.4729 | 4.7033 | 75.2247 |
| 0.7369 | 75.0 | 14550 | 1.4749 | 4.7982 | 75.6815 |
| 0.7369 | 76.0 | 14744 | 1.4767 | 4.8117 | 75.8839 |
| 0.7369 | 77.0 | 14938 | 1.4781 | 4.5612 | 75.7188 |
| 0.7283 | 78.0 | 15132 | 1.4779 | 4.7852 | 75.933 |
| 0.7283 | 79.0 | 15326 | 1.4801 | 4.7405 | 76.0967 |
| 0.7196 | 80.0 | 15520 | 1.4833 | 4.6466 | 76.7961 |
| 0.7196 | 81.0 | 15714 | 1.4836 | 4.839 | 75.2604 |
| 0.7196 | 82.0 | 15908 | 1.4853 | 4.7503 | 75.9881 |
| 0.7102 | 83.0 | 16102 | 1.4907 | 4.9235 | 76.244 |
| 0.7102 | 84.0 | 16296 | 1.4889 | 4.8346 | 75.3259 |
| 0.7102 | 85.0 | 16490 | 1.4904 | 4.7364 | 75.9539 |
| 0.7043 | 86.0 | 16684 | 1.4896 | 5.0884 | 75.0208 |
| 0.7043 | 87.0 | 16878 | 1.4920 | 4.6834 | 75.2173 |
| 0.6971 | 88.0 | 17072 | 1.4907 | 4.7318 | 75.7128 |
| 0.6971 | 89.0 | 17266 | 1.4920 | 4.7857 | 75.8586 |
| 0.6971 | 90.0 | 17460 | 1.4923 | 4.661 | 75.0193 |
| 0.6933 | 91.0 | 17654 | 1.4935 | 4.8224 | 74.2054 |
| 0.6933 | 92.0 | 17848 | 1.4942 | 4.8344 | 75.5461 |
| 0.6873 | 93.0 | 18042 | 1.4953 | 4.9447 | 75.25 |
| 0.6873 | 94.0 | 18236 | 1.4949 | 4.734 | 74.7113 |
| 0.6873 | 95.0 | 18430 | 1.4946 | 4.7811 | 75.3973 |
| 0.6853 | 96.0 | 18624 | 1.4965 | 4.7307 | 75.6414 |
| 0.6853 | 97.0 | 18818 | 1.4957 | 4.8139 | 75.5878 |
| 0.682 | 98.0 | 19012 | 1.4963 | 4.799 | 75.2857 |
| 0.682 | 99.0 | 19206 | 1.4965 | 4.8997 | 75.2515 |
| 0.682 | 100.0 | 19400 | 1.4966 | 4.7931 | 75.4033 |
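
The Bleu and Gen Len columns come from generation-based evaluation. Below is a minimal sketch of the kind of `compute_metrics` function that produces them, assuming the `evaluate` library's sacrebleu metric and a `tokenizer` already in scope; the exact function used for this run is not documented on the card.

```python
import numpy as np
import evaluate

bleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    # Restore padding where the data collator wrote -100 into the labels.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = bleu.compute(
        predictions=decoded_preds,
        references=[[label] for label in decoded_labels],
    )
    # Gen Len = mean number of non-pad tokens in the generated sequences.
    gen_len = np.mean(
        [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in preds]
    )
    return {"bleu": result["score"], "gen_len": gen_len}
```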

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3