cherifkhalifah committed on
Commit
a51f0f0
1 Parent(s): bb8969a

End of training

README.md ADDED
@@ -0,0 +1,161 @@
+ ---
+ license: apache-2.0
+ base_model: Helsinki-NLP/opus-mt-en-ar
+ tags:
+ - generated_from_trainer
+ metrics:
+ - bleu
+ model-index:
+ - name: Tounsify-v0.9-shuffle
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # Tounsify-v0.9-shuffle
+
+ This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-ar](https://huggingface.co/Helsinki-NLP/opus-mt-en-ar) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.2171
+ - Bleu: 47.287
+ - Gen Len: 9.1774
+
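The Bleu score above comes from the Trainer's evaluation loop. For readers unfamiliar with the metric, here is a minimal self-contained sketch of sentence-level BLEU with naive floor smoothing — an illustration only, not the corpus-level sacrebleu computation that evaluation scripts typically use:

```python
import math
from collections import Counter

def ngram_counts(tokens, n):
    """Count the n-grams of length n in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def sentence_bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU on a 0-100 scale, with crude floor smoothing."""
    cand, ref = candidate.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        cand_ng = ngram_counts(cand, n)
        ref_ng = ngram_counts(ref, n)
        # Clipped n-gram overlap: each candidate n-gram counts at most
        # as often as it appears in the reference.
        overlap = sum(min(c, ref_ng[g]) for g, c in cand_ng.items())
        total = max(sum(cand_ng.values()), 1)
        log_prec += math.log(max(overlap, 1e-9) / total)
    # Brevity penalty: punish candidates shorter than the reference.
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return 100.0 * bp * math.exp(log_prec / max_n)
```

A perfect match scores 100; any missing n-gram drags the geometric mean of the precisions down sharply, which is why the smoothing floor is needed for short sentences.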
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 4
+ - eval_batch_size: 4
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+ - mixed_precision_training: Native AMP
+
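These settings are consistent with the step counts in the results table: 62 optimizer steps per epoch at a batch size of 4 implies roughly 248 training examples (assuming no gradient accumulation, which the card does not report), and 100 epochs give 6200 steps in total. A quick sanity check:

```python
# Back out the implied dataset size and total steps from the reported
# hyperparameters and the per-epoch step count in the results table.
steps_per_epoch = 62      # from the table: epoch 1.0 ends at step 62
num_epochs = 100
train_batch_size = 4

total_steps = steps_per_epoch * num_epochs
# Assumes gradient_accumulation_steps == 1 (the Trainer default).
approx_train_examples = steps_per_epoch * train_batch_size

print(total_steps)            # 6200, the final step in the table
print(approx_train_examples)  # ~248 training examples
```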
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
+ |:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
+ | No log | 1.0 | 62 | 2.2319 | 11.5922 | 8.3226 |
+ | No log | 2.0 | 124 | 1.4979 | 22.8539 | 8.3871 |
+ | No log | 3.0 | 186 | 1.1749 | 31.2278 | 8.5323 |
+ | No log | 4.0 | 248 | 1.0500 | 39.4966 | 8.7097 |
+ | No log | 5.0 | 310 | 0.9562 | 42.3858 | 8.7742 |
+ | No log | 6.0 | 372 | 0.9306 | 43.1436 | 8.6935 |
+ | No log | 7.0 | 434 | 0.8928 | 42.3849 | 8.8387 |
+ | No log | 8.0 | 496 | 0.9243 | 42.8107 | 8.8548 |
+ | 0.9876 | 9.0 | 558 | 0.9293 | 44.3329 | 8.8548 |
+ | 0.9876 | 10.0 | 620 | 0.9398 | 42.859 | 8.871 |
+ | 0.9876 | 11.0 | 682 | 0.9637 | 44.6861 | 8.8548 |
+ | 0.9876 | 12.0 | 744 | 0.9514 | 45.1661 | 8.8387 |
+ | 0.9876 | 13.0 | 806 | 0.9780 | 45.5317 | 8.8226 |
+ | 0.9876 | 14.0 | 868 | 0.9832 | 48.3237 | 8.8548 |
+ | 0.9876 | 15.0 | 930 | 0.9618 | 49.9886 | 9.0484 |
+ | 0.9876 | 16.0 | 992 | 0.9980 | 47.1846 | 8.9516 |
+ | 0.0522 | 17.0 | 1054 | 0.9758 | 45.6558 | 8.9839 |
+ | 0.0522 | 18.0 | 1116 | 0.9907 | 45.325 | 9.0 |
+ | 0.0522 | 19.0 | 1178 | 1.0234 | 48.1955 | 8.9194 |
+ | 0.0522 | 20.0 | 1240 | 1.0339 | 47.0583 | 8.9839 |
+ | 0.0522 | 21.0 | 1302 | 1.0129 | 49.2604 | 8.8871 |
+ | 0.0522 | 22.0 | 1364 | 1.0407 | 49.847 | 8.8871 |
+ | 0.0522 | 23.0 | 1426 | 1.0656 | 48.4962 | 8.9839 |
+ | 0.0522 | 24.0 | 1488 | 1.0504 | 48.3458 | 8.9839 |
+ | 0.0153 | 25.0 | 1550 | 1.0556 | 49.455 | 9.0161 |
+ | 0.0153 | 26.0 | 1612 | 1.0522 | 48.9644 | 9.0323 |
+ | 0.0153 | 27.0 | 1674 | 1.0793 | 48.7056 | 8.9839 |
+ | 0.0153 | 28.0 | 1736 | 1.0859 | 48.8805 | 8.9839 |
+ | 0.0153 | 29.0 | 1798 | 1.1362 | 48.306 | 9.0806 |
+ | 0.0153 | 30.0 | 1860 | 1.0573 | 51.8905 | 9.2097 |
+ | 0.0153 | 31.0 | 1922 | 1.1220 | 48.3591 | 9.0806 |
+ | 0.0153 | 32.0 | 1984 | 1.0879 | 49.0288 | 9.129 |
+ | 0.0097 | 33.0 | 2046 | 1.1219 | 50.593 | 9.1129 |
+ | 0.0097 | 34.0 | 2108 | 1.1439 | 49.1391 | 9.0 |
+ | 0.0097 | 35.0 | 2170 | 1.1265 | 50.5195 | 9.0323 |
+ | 0.0097 | 36.0 | 2232 | 1.1031 | 50.2673 | 9.0806 |
+ | 0.0097 | 37.0 | 2294 | 1.1418 | 51.3256 | 8.9839 |
+ | 0.0097 | 38.0 | 2356 | 1.1419 | 50.8617 | 9.0968 |
+ | 0.0097 | 39.0 | 2418 | 1.1166 | 51.2853 | 9.1452 |
+ | 0.0097 | 40.0 | 2480 | 1.1309 | 50.6103 | 9.0806 |
+ | 0.0082 | 41.0 | 2542 | 1.1501 | 50.7017 | 9.0 |
+ | 0.0082 | 42.0 | 2604 | 1.1108 | 51.6167 | 9.0806 |
+ | 0.0082 | 43.0 | 2666 | 1.1176 | 51.1365 | 9.0968 |
+ | 0.0082 | 44.0 | 2728 | 1.1544 | 49.703 | 9.0645 |
+ | 0.0082 | 45.0 | 2790 | 1.1655 | 51.432 | 9.1935 |
+ | 0.0082 | 46.0 | 2852 | 1.1460 | 50.1011 | 9.1774 |
+ | 0.0082 | 47.0 | 2914 | 1.1377 | 50.0643 | 9.129 |
+ | 0.0082 | 48.0 | 2976 | 1.1406 | 50.1912 | 9.1129 |
+ | 0.0081 | 49.0 | 3038 | 1.1452 | 47.2465 | 9.1774 |
+ | 0.0081 | 50.0 | 3100 | 1.1532 | 49.9986 | 9.0806 |
+ | 0.0081 | 51.0 | 3162 | 1.1596 | 47.8461 | 9.0806 |
+ | 0.0081 | 52.0 | 3224 | 1.1643 | 48.3596 | 9.0968 |
+ | 0.0081 | 53.0 | 3286 | 1.1577 | 47.1237 | 9.0806 |
+ | 0.0081 | 54.0 | 3348 | 1.1599 | 48.6692 | 9.0968 |
+ | 0.0081 | 55.0 | 3410 | 1.1613 | 48.1806 | 9.0806 |
+ | 0.0081 | 56.0 | 3472 | 1.1668 | 47.5471 | 9.1613 |
+ | 0.0069 | 57.0 | 3534 | 1.1749 | 50.0805 | 9.0806 |
+ | 0.0069 | 58.0 | 3596 | 1.1784 | 49.3841 | 9.1774 |
+ | 0.0069 | 59.0 | 3658 | 1.1666 | 49.4183 | 9.0645 |
+ | 0.0069 | 60.0 | 3720 | 1.1768 | 47.8488 | 9.1774 |
+ | 0.0069 | 61.0 | 3782 | 1.1908 | 48.7428 | 9.0968 |
+ | 0.0069 | 62.0 | 3844 | 1.1882 | 49.2957 | 8.9677 |
+ | 0.0069 | 63.0 | 3906 | 1.1869 | 49.5255 | 9.0323 |
+ | 0.0069 | 64.0 | 3968 | 1.1866 | 48.8917 | 9.0161 |
+ | 0.0068 | 65.0 | 4030 | 1.1858 | 48.5308 | 9.0968 |
+ | 0.0068 | 66.0 | 4092 | 1.1951 | 49.2041 | 9.0806 |
+ | 0.0068 | 67.0 | 4154 | 1.1828 | 49.1255 | 9.0806 |
+ | 0.0068 | 68.0 | 4216 | 1.1923 | 48.0252 | 9.0484 |
+ | 0.0068 | 69.0 | 4278 | 1.1947 | 48.0764 | 9.1129 |
+ | 0.0068 | 70.0 | 4340 | 1.1927 | 48.2729 | 9.0484 |
+ | 0.0068 | 71.0 | 4402 | 1.1907 | 47.9908 | 9.129 |
+ | 0.0068 | 72.0 | 4464 | 1.1920 | 48.8939 | 9.0968 |
+ | 0.0062 | 73.0 | 4526 | 1.1939 | 49.0374 | 9.0968 |
+ | 0.0062 | 74.0 | 4588 | 1.1952 | 49.0374 | 9.0968 |
+ | 0.0062 | 75.0 | 4650 | 1.1954 | 49.2333 | 9.0323 |
+ | 0.0062 | 76.0 | 4712 | 1.1951 | 48.3221 | 9.1129 |
+ | 0.0062 | 77.0 | 4774 | 1.1971 | 48.3221 | 9.1129 |
+ | 0.0062 | 78.0 | 4836 | 1.1978 | 49.5615 | 9.1129 |
+ | 0.0062 | 79.0 | 4898 | 1.1994 | 48.947 | 9.0484 |
+ | 0.0062 | 80.0 | 4960 | 1.2009 | 48.0436 | 9.0806 |
+ | 0.0045 | 81.0 | 5022 | 1.2021 | 47.9908 | 9.129 |
+ | 0.0045 | 82.0 | 5084 | 1.2048 | 47.9908 | 9.129 |
+ | 0.0045 | 83.0 | 5146 | 1.2045 | 49.5615 | 9.0968 |
+ | 0.0045 | 84.0 | 5208 | 1.2065 | 49.4183 | 9.0968 |
+ | 0.0045 | 85.0 | 5270 | 1.2081 | 48.9864 | 9.0968 |
+ | 0.0045 | 86.0 | 5332 | 1.2131 | 46.327 | 9.0968 |
+ | 0.0045 | 87.0 | 5394 | 1.2144 | 47.2291 | 9.1452 |
+ | 0.0045 | 88.0 | 5456 | 1.2135 | 47.2291 | 9.1452 |
+ | 0.0047 | 89.0 | 5518 | 1.2163 | 46.8533 | 9.1452 |
+ | 0.0047 | 90.0 | 5580 | 1.2207 | 47.3713 | 9.1452 |
+ | 0.0047 | 91.0 | 5642 | 1.2188 | 47.3713 | 9.1452 |
+ | 0.0047 | 92.0 | 5704 | 1.2193 | 47.3713 | 9.1452 |
+ | 0.0047 | 93.0 | 5766 | 1.2188 | 48.9917 | 9.1452 |
+ | 0.0047 | 94.0 | 5828 | 1.2175 | 47.2291 | 9.1452 |
+ | 0.0047 | 95.0 | 5890 | 1.2177 | 48.9917 | 9.1452 |
+ | 0.0047 | 96.0 | 5952 | 1.2177 | 47.3713 | 9.1452 |
+ | 0.0043 | 97.0 | 6014 | 1.2165 | 47.3713 | 9.1452 |
+ | 0.0043 | 98.0 | 6076 | 1.2167 | 47.287 | 9.1774 |
+ | 0.0043 | 99.0 | 6138 | 1.2169 | 47.287 | 9.1774 |
+ | 0.0043 | 100.0 | 6200 | 1.2171 | 47.287 | 9.1774 |
+
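With `lr_scheduler_type: linear` and no warmup reported, the learning rate presumably decays from 2e-05 at step 0 down to 0 at the final step 6200. A sketch of that schedule (zero warmup is an assumption here — it is the Trainer default when `warmup_steps` is not set):

```python
def linear_lr(step, total_steps=6200, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Halfway through training the rate is half the base value.
print(linear_lr(3100))  # 1e-05
```

Note that the best BLEU (51.8905 at epoch 30) precedes the final checkpoint by a wide margin; the validation loss also rises steadily after roughly epoch 7, so an earlier checkpoint may generalize better than the one saved at epoch 100.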
+
+ ### Framework versions
+
+ - Transformers 4.41.2
+ - Pytorch 2.3.0+cu121
+ - Datasets 2.20.0
+ - Tokenizers 0.19.1
generation_config.json ADDED
@@ -0,0 +1,16 @@
+ {
+   "bad_words_ids": [
+     [
+       62801
+     ]
+   ],
+   "bos_token_id": 0,
+   "decoder_start_token_id": 62801,
+   "eos_token_id": 0,
+   "forced_eos_token_id": 0,
+   "max_length": 512,
+   "num_beams": 4,
+   "pad_token_id": 62801,
+   "renormalize_logits": true,
+   "transformers_version": "4.41.2"
+ }
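This config follows the usual Marian convention: token 62801 serves as both the pad token and the decoder start token, and is banned from the generated output via `bad_words_ids`; decoding uses beam search with 4 beams up to 512 tokens. A small stdlib-only check of those relationships (the inline JSON below is an abridged copy of the file above):

```python
import json

# Abridged copy of the generation_config.json added in this commit.
config = json.loads("""{
  "bad_words_ids": [[62801]],
  "decoder_start_token_id": 62801,
  "eos_token_id": 0,
  "max_length": 512,
  "num_beams": 4,
  "pad_token_id": 62801
}""")

# The pad token doubles as the decoder start token...
assert config["decoder_start_token_id"] == config["pad_token_id"]
# ...and is excluded from generation via bad_words_ids.
assert [config["pad_token_id"]] in config["bad_words_ids"]
```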
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:e5662d976441b541e432e1ec7494ba4bf0215825470b7bf0afba71fee96bc025
+ oid sha256:448c8f41da6f2f4d20ac59972cee75210a15eeee80155f4a0854c8bbde08be74
  size 305452744
runs/Jul06_06-44-37_dcf11d410645/events.out.tfevents.1720248278.dcf11d410645.1178.1 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:04922f37031dc81dc538ac878caa1ecf63c6bfbd2d420572c6dc7b115816ca5a
- size 43750
+ oid sha256:1fa147c465348b62dfad85169d5463e12f280d0254939fb1698fce4ee34d13f2
+ size 45584