salbatarni committed on
Commit 179af21
1 Parent(s): b01ec9b

Training in progress, step 340

README.md ADDED
@@ -0,0 +1,225 @@
---
license: apache-2.0
base_model: google-bert/bert-base-cased
tags:
- generated_from_trainer
model-index:
- name: bert_baseline_prompt_adherence_task4_fold4
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bert_baseline_prompt_adherence_task4_fold4

This model is a fine-tuned version of [google-bert/bert-base-cased](https://huggingface.co/google-bert/bert-base-cased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3476
- Qwk: 0.7157
- Mse: 0.3476
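
Here, Qwk is presumably the quadratic weighted (Cohen's) kappa between predicted and reference scores, and Mse the mean squared error of the raw regression outputs. A minimal sketch of how these two numbers are commonly computed, assuming integer reference scores and that the continuous predictions are rounded before the kappa calculation (the rounding rule is an assumption, not stated in this card):

```python
# Sketch: quadratic weighted kappa (Qwk) and MSE for regression-style scores.
# Assumes integer gold scores; rounding of predictions is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def qwk_and_mse(predictions, references):
    predictions = np.asarray(predictions, dtype=float)
    references = np.asarray(references, dtype=float)
    mse = mean_squared_error(references, predictions)
    # Weighted kappa is defined on discrete labels, so round the outputs first.
    qwk = cohen_kappa_score(
        references.astype(int),
        np.rint(predictions).astype(int),
        weights="quadratic",
    )
    return qwk, mse

qwk, mse = qwk_and_mse([2.1, 0.9, 3.4], [2, 1, 3])  # toy example
print(f"Qwk: {qwk:.4f}, Mse: {mse:.4f}")
```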

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
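
A minimal sketch of how these settings map onto `TrainingArguments`/`Trainer`. Only the hyperparameters above come from this card; the toy dataset, tokenization, and the 2-step evaluation cadence (inferred from the results table below) are stand-ins for illustration:

```python
# Sketch only: reproduces the listed hyperparameters with the HF Trainer.
# The toy dataset and tokenization below are stand-ins, not from this card.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "google-bert/bert-base-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=1, problem_type="regression"  # single regression score
)

def tokenize(batch):
    enc = tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)
    enc["labels"] = [float(s) for s in batch["score"]]
    return enc

# Tiny stand-in data; the real prompt-adherence folds are not described in this card.
toy = Dataset.from_dict({"text": ["response one", "response two"], "score": [2.0, 3.0]})
toy = toy.map(tokenize, batched=True, remove_columns=["text", "score"])

args = TrainingArguments(
    output_dir="bert_baseline_prompt_adherence_task4_fold4",
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,                      # Adam betas/epsilon above are the optimizer defaults
    num_train_epochs=5,
    lr_scheduler_type="linear",
    eval_strategy="steps",        # the results table records an eval every 2 steps
    eval_steps=2,
    report_to="none",
)

trainer = Trainer(model=model, args=args, train_dataset=toy, eval_dataset=toy,
                  tokenizer=tokenizer)
trainer.train()
```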

### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| No log | 0.0299 | 2 | 1.1588 | 0.0 | 1.1588 |
| No log | 0.0597 | 4 | 0.8767 | 0.0 | 0.8767 |
| No log | 0.0896 | 6 | 0.7414 | 0.2538 | 0.7414 |
| No log | 0.1194 | 8 | 0.7026 | 0.3935 | 0.7026 |
| No log | 0.1493 | 10 | 0.6746 | 0.3562 | 0.6746 |
| No log | 0.1791 | 12 | 0.6514 | 0.3180 | 0.6514 |
| No log | 0.2090 | 14 | 0.6256 | 0.3383 | 0.6256 |
| No log | 0.2388 | 16 | 0.5935 | 0.3605 | 0.5935 |
| No log | 0.2687 | 18 | 0.5678 | 0.3687 | 0.5678 |
| No log | 0.2985 | 20 | 0.5458 | 0.4041 | 0.5458 |
| No log | 0.3284 | 22 | 0.5102 | 0.3909 | 0.5102 |
| No log | 0.3582 | 24 | 0.5212 | 0.3760 | 0.5212 |
| No log | 0.3881 | 26 | 0.4986 | 0.3984 | 0.4986 |
| No log | 0.4179 | 28 | 0.4564 | 0.3964 | 0.4564 |
| No log | 0.4478 | 30 | 0.4805 | 0.3620 | 0.4805 |
| No log | 0.4776 | 32 | 0.5001 | 0.3296 | 0.5001 |
| No log | 0.5075 | 34 | 0.4408 | 0.4153 | 0.4408 |
| No log | 0.5373 | 36 | 0.4298 | 0.4730 | 0.4298 |
| No log | 0.5672 | 38 | 0.4277 | 0.5450 | 0.4277 |
| No log | 0.5970 | 40 | 0.4270 | 0.5072 | 0.4270 |
| No log | 0.6269 | 42 | 0.4253 | 0.4982 | 0.4253 |
| No log | 0.6567 | 44 | 0.4159 | 0.5405 | 0.4159 |
| No log | 0.6866 | 46 | 0.4086 | 0.5737 | 0.4086 |
| No log | 0.7164 | 48 | 0.4036 | 0.5422 | 0.4036 |
| No log | 0.7463 | 50 | 0.3994 | 0.5436 | 0.3994 |
| No log | 0.7761 | 52 | 0.3960 | 0.5891 | 0.3960 |
| No log | 0.8060 | 54 | 0.4724 | 0.6990 | 0.4724 |
| No log | 0.8358 | 56 | 0.6041 | 0.7165 | 0.6041 |
| No log | 0.8657 | 58 | 0.7484 | 0.6732 | 0.7484 |
| No log | 0.8955 | 60 | 0.6230 | 0.7096 | 0.6230 |
| No log | 0.9254 | 62 | 0.4314 | 0.6097 | 0.4314 |
| No log | 0.9552 | 64 | 0.3817 | 0.4898 | 0.3817 |
| No log | 0.9851 | 66 | 0.4007 | 0.4380 | 0.4007 |
| No log | 1.0149 | 68 | 0.3692 | 0.4730 | 0.3692 |
| No log | 1.0448 | 70 | 0.3617 | 0.6006 | 0.3617 |
| No log | 1.0746 | 72 | 0.3797 | 0.6409 | 0.3797 |
| No log | 1.1045 | 74 | 0.3657 | 0.6124 | 0.3657 |
| No log | 1.1343 | 76 | 0.3668 | 0.5849 | 0.3668 |
| No log | 1.1642 | 78 | 0.3721 | 0.6166 | 0.3721 |
| No log | 1.1940 | 80 | 0.3940 | 0.6391 | 0.3940 |
| No log | 1.2239 | 82 | 0.4132 | 0.6485 | 0.4132 |
| No log | 1.2537 | 84 | 0.3830 | 0.6259 | 0.3830 |
| No log | 1.2836 | 86 | 0.3685 | 0.5396 | 0.3685 |
| No log | 1.3134 | 88 | 0.3701 | 0.5209 | 0.3701 |
| No log | 1.3433 | 90 | 0.3870 | 0.4861 | 0.3870 |
| No log | 1.3731 | 92 | 0.3636 | 0.4971 | 0.3636 |
| No log | 1.4030 | 94 | 0.3397 | 0.5908 | 0.3397 |
| No log | 1.4328 | 96 | 0.3460 | 0.6268 | 0.3460 |
| No log | 1.4627 | 98 | 0.3477 | 0.6372 | 0.3477 |
| No log | 1.4925 | 100 | 0.3311 | 0.6057 | 0.3311 |
| No log | 1.5224 | 102 | 0.3364 | 0.6391 | 0.3364 |
| No log | 1.5522 | 104 | 0.3679 | 0.6766 | 0.3679 |
| No log | 1.5821 | 106 | 0.3366 | 0.6360 | 0.3366 |
| No log | 1.6119 | 108 | 0.3259 | 0.6022 | 0.3259 |
| No log | 1.6418 | 110 | 0.3264 | 0.5540 | 0.3264 |
| No log | 1.6716 | 112 | 0.3225 | 0.6017 | 0.3225 |
| No log | 1.7015 | 114 | 0.3607 | 0.6784 | 0.3607 |
| No log | 1.7313 | 116 | 0.3714 | 0.6997 | 0.3714 |
| No log | 1.7612 | 118 | 0.3879 | 0.7174 | 0.3879 |
| No log | 1.7910 | 120 | 0.3918 | 0.7240 | 0.3918 |
| No log | 1.8209 | 122 | 0.3504 | 0.6756 | 0.3504 |
| No log | 1.8507 | 124 | 0.3279 | 0.5898 | 0.3279 |
| No log | 1.8806 | 126 | 0.3630 | 0.4773 | 0.3630 |
| No log | 1.9104 | 128 | 0.3506 | 0.4846 | 0.3506 |
| No log | 1.9403 | 130 | 0.3207 | 0.5862 | 0.3207 |
| No log | 1.9701 | 132 | 0.3460 | 0.6590 | 0.3460 |
| No log | 2.0 | 134 | 0.4251 | 0.7425 | 0.4251 |
| No log | 2.0299 | 136 | 0.4846 | 0.7500 | 0.4846 |
| No log | 2.0597 | 138 | 0.4460 | 0.7465 | 0.4460 |
| No log | 2.0896 | 140 | 0.3581 | 0.7021 | 0.3581 |
| No log | 2.1194 | 142 | 0.3155 | 0.6266 | 0.3155 |
| No log | 2.1493 | 144 | 0.3210 | 0.5979 | 0.3210 |
| No log | 2.1791 | 146 | 0.3223 | 0.6157 | 0.3223 |
| No log | 2.2090 | 148 | 0.3245 | 0.6373 | 0.3245 |
| No log | 2.2388 | 150 | 0.3294 | 0.6616 | 0.3294 |
| No log | 2.2687 | 152 | 0.3221 | 0.6562 | 0.3221 |
| No log | 2.2985 | 154 | 0.3220 | 0.6599 | 0.3220 |
| No log | 2.3284 | 156 | 0.3226 | 0.6608 | 0.3226 |
| No log | 2.3582 | 158 | 0.3204 | 0.6211 | 0.3204 |
| No log | 2.3881 | 160 | 0.3173 | 0.6165 | 0.3173 |
| No log | 2.4179 | 162 | 0.3106 | 0.6175 | 0.3106 |
| No log | 2.4478 | 164 | 0.3034 | 0.6379 | 0.3034 |
| No log | 2.4776 | 166 | 0.3096 | 0.6708 | 0.3096 |
| No log | 2.5075 | 168 | 0.3477 | 0.7297 | 0.3477 |
| No log | 2.5373 | 170 | 0.3540 | 0.7388 | 0.3540 |
| No log | 2.5672 | 172 | 0.3500 | 0.7329 | 0.3500 |
| No log | 2.5970 | 174 | 0.3318 | 0.7146 | 0.3318 |
| No log | 2.6269 | 176 | 0.3029 | 0.6593 | 0.3029 |
| No log | 2.6567 | 178 | 0.3003 | 0.6152 | 0.3003 |
| No log | 2.6866 | 180 | 0.2979 | 0.6388 | 0.2979 |
| No log | 2.7164 | 182 | 0.3064 | 0.6896 | 0.3064 |
| No log | 2.7463 | 184 | 0.3009 | 0.6853 | 0.3009 |
| No log | 2.7761 | 186 | 0.3026 | 0.6912 | 0.3026 |
| No log | 2.8060 | 188 | 0.3075 | 0.6944 | 0.3075 |
| No log | 2.8358 | 190 | 0.2952 | 0.6718 | 0.2952 |
| No log | 2.8657 | 192 | 0.2949 | 0.6647 | 0.2949 |
| No log | 2.8955 | 194 | 0.3048 | 0.6968 | 0.3048 |
| No log | 2.9254 | 196 | 0.3162 | 0.7032 | 0.3162 |
| No log | 2.9552 | 198 | 0.3124 | 0.7022 | 0.3124 |
| No log | 2.9851 | 200 | 0.2969 | 0.6534 | 0.2969 |
| No log | 3.0149 | 202 | 0.3007 | 0.6076 | 0.3007 |
| No log | 3.0448 | 204 | 0.3018 | 0.6423 | 0.3018 |
| No log | 3.0746 | 206 | 0.3224 | 0.6994 | 0.3224 |
| No log | 3.1045 | 208 | 0.3478 | 0.7151 | 0.3478 |
| No log | 3.1343 | 210 | 0.3360 | 0.7104 | 0.3360 |
| No log | 3.1642 | 212 | 0.3160 | 0.6702 | 0.3160 |
| No log | 3.1940 | 214 | 0.3166 | 0.6403 | 0.3166 |
| No log | 3.2239 | 216 | 0.3182 | 0.6657 | 0.3182 |
| No log | 3.2537 | 218 | 0.3290 | 0.6788 | 0.3290 |
| No log | 3.2836 | 220 | 0.3312 | 0.6900 | 0.3312 |
| No log | 3.3134 | 222 | 0.3369 | 0.7046 | 0.3369 |
| No log | 3.3433 | 224 | 0.3400 | 0.7114 | 0.3400 |
| No log | 3.3731 | 226 | 0.3190 | 0.7033 | 0.3190 |
| No log | 3.4030 | 228 | 0.3029 | 0.6609 | 0.3029 |
| No log | 3.4328 | 230 | 0.3026 | 0.6586 | 0.3026 |
| No log | 3.4627 | 232 | 0.3054 | 0.6616 | 0.3054 |
| No log | 3.4925 | 234 | 0.3131 | 0.6950 | 0.3131 |
| No log | 3.5224 | 236 | 0.3265 | 0.7028 | 0.3265 |
| No log | 3.5522 | 238 | 0.3265 | 0.6953 | 0.3265 |
| No log | 3.5821 | 240 | 0.3077 | 0.6907 | 0.3077 |
| No log | 3.6119 | 242 | 0.2987 | 0.6450 | 0.2987 |
| No log | 3.6418 | 244 | 0.2990 | 0.6465 | 0.2990 |
| No log | 3.6716 | 246 | 0.3033 | 0.6876 | 0.3033 |
| No log | 3.7015 | 248 | 0.3093 | 0.6906 | 0.3093 |
| No log | 3.7313 | 250 | 0.3171 | 0.6960 | 0.3171 |
| No log | 3.7612 | 252 | 0.3283 | 0.7112 | 0.3283 |
| No log | 3.7910 | 254 | 0.3575 | 0.7331 | 0.3575 |
| No log | 3.8209 | 256 | 0.3670 | 0.7413 | 0.3670 |
| No log | 3.8507 | 258 | 0.3570 | 0.7323 | 0.3570 |
| No log | 3.8806 | 260 | 0.3232 | 0.6992 | 0.3232 |
| No log | 3.9104 | 262 | 0.3004 | 0.6685 | 0.3004 |
| No log | 3.9403 | 264 | 0.3038 | 0.6015 | 0.3038 |
| No log | 3.9701 | 266 | 0.3094 | 0.5814 | 0.3094 |
| No log | 4.0 | 268 | 0.3024 | 0.5967 | 0.3024 |
| No log | 4.0299 | 270 | 0.2941 | 0.6332 | 0.2941 |
| No log | 4.0597 | 272 | 0.2933 | 0.6493 | 0.2933 |
| No log | 4.0896 | 274 | 0.3032 | 0.6913 | 0.3032 |
| No log | 4.1194 | 276 | 0.3164 | 0.7051 | 0.3164 |
| No log | 4.1493 | 278 | 0.3232 | 0.7005 | 0.3232 |
| No log | 4.1791 | 280 | 0.3245 | 0.7079 | 0.3245 |
| No log | 4.2090 | 282 | 0.3186 | 0.6997 | 0.3186 |
| No log | 4.2388 | 284 | 0.3215 | 0.7003 | 0.3215 |
| No log | 4.2687 | 286 | 0.3306 | 0.7106 | 0.3306 |
| No log | 4.2985 | 288 | 0.3441 | 0.7217 | 0.3441 |
| No log | 4.3284 | 290 | 0.3486 | 0.7234 | 0.3486 |
| No log | 4.3582 | 292 | 0.3376 | 0.7139 | 0.3376 |
| No log | 4.3881 | 294 | 0.3203 | 0.6960 | 0.3203 |
| No log | 4.4179 | 296 | 0.3093 | 0.6796 | 0.3093 |
| No log | 4.4478 | 298 | 0.3051 | 0.6659 | 0.3051 |
| No log | 4.4776 | 300 | 0.3045 | 0.6514 | 0.3045 |
| No log | 4.5075 | 302 | 0.3064 | 0.6497 | 0.3064 |
| No log | 4.5373 | 304 | 0.3071 | 0.6474 | 0.3071 |
| No log | 4.5672 | 306 | 0.3066 | 0.6539 | 0.3066 |
| No log | 4.5970 | 308 | 0.3081 | 0.6546 | 0.3081 |
| No log | 4.6269 | 310 | 0.3140 | 0.6855 | 0.3140 |
| No log | 4.6567 | 312 | 0.3241 | 0.7028 | 0.3241 |
| No log | 4.6866 | 314 | 0.3381 | 0.7070 | 0.3381 |
| No log | 4.7164 | 316 | 0.3531 | 0.7234 | 0.3531 |
| No log | 4.7463 | 318 | 0.3691 | 0.7275 | 0.3691 |
| No log | 4.7761 | 320 | 0.3769 | 0.7308 | 0.3769 |
| No log | 4.8060 | 322 | 0.3765 | 0.7286 | 0.3765 |
| No log | 4.8358 | 324 | 0.3708 | 0.7299 | 0.3708 |
| No log | 4.8657 | 326 | 0.3639 | 0.7252 | 0.3639 |
| No log | 4.8955 | 328 | 0.3573 | 0.7268 | 0.3573 |
| No log | 4.9254 | 330 | 0.3528 | 0.7201 | 0.3528 |
| No log | 4.9552 | 332 | 0.3490 | 0.7169 | 0.3490 |
| No log | 4.9851 | 334 | 0.3476 | 0.7157 | 0.3476 |
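
The per-step log above can be mined directly from this card. As a small sketch (assuming the card is saved locally as `README.md`), the following finds the evaluation step with the best Qwk:

```python
# Sketch: pull the "Training results" table out of this README and report the
# evaluation step with the highest Qwk. The README path is an assumption.
import re

rows = []
with open("README.md", encoding="utf-8") as f:
    for line in f:
        cells = [c.strip() for c in line.strip().strip("|").split("|")]
        # Data rows look like: No log | <epoch> | <step> | <loss> | <qwk> | <mse>
        if len(cells) == 6 and re.fullmatch(r"\d+", cells[2]):
            rows.append({"step": int(cells[2]), "loss": float(cells[3]),
                         "qwk": float(cells[4]), "mse": float(cells[5])})

best = max(rows, key=lambda r: r["qwk"])
print(f"Best Qwk {best['qwk']:.4f} at step {best['step']} (loss {best['loss']:.4f})")
```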


### Framework versions

- Transformers 4.42.3
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1
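
A quick sanity check that a local environment matches the versions listed above (sketch):

```python
# Sketch: print the locally installed versions to compare with the list above.
import datasets, tokenizers, torch, transformers

print("Transformers", transformers.__version__)
print("Pytorch     ", torch.__version__)
print("Datasets    ", datasets.__version__)
print("Tokenizers  ", tokenizers.__version__)
```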
config.json ADDED
@@ -0,0 +1,33 @@
{
  "_name_or_path": "google-bert/bert-base-cased",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.42.3",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 28996
}
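
The config declares a single-label `BertForSequenceClassification` head with `problem_type: "regression"`, so the model emits one continuous score per input. A minimal inference sketch, assuming the repository id is `salbatarni/bert_baseline_prompt_adherence_task4_fold4` (inferred from the model name above, not stated explicitly):

```python
# Sketch: scoring a text with the regression head described in config.json.
# The repo id below is inferred from the model-index name; adjust if needed.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "salbatarni/bert_baseline_prompt_adherence_task4_fold4"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("An example student response to score.",
                   truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze(-1).item()  # single regression output
print(f"Predicted prompt-adherence score: {score:.3f}")
```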
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:576d09538f5eb91e422ee73108a16e98edf1fbedea63443657d12e44340cc87b
size 433267692
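
This is a Git LFS pointer rather than the weights themselves; the actual ~433 MB `model.safetensors` lives in LFS storage. A small sketch for checking a downloaded copy against the pointer's `oid` and `size` (the local file path is an assumption):

```python
# Sketch: verify a downloaded model.safetensors against the LFS pointer above.
import hashlib
import os

path = "model.safetensors"  # assumed local path of the downloaded weights
expected_oid = "576d09538f5eb91e422ee73108a16e98edf1fbedea63443657d12e44340cc87b"
expected_size = 433267692

digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        digest.update(chunk)

assert os.path.getsize(path) == expected_size, "size mismatch"
assert digest.hexdigest() == expected_oid, "sha256 mismatch"
print("model.safetensors matches the LFS pointer")
```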
runs/Aug22_10-22-58_0095ffe889f2/events.out.tfevents.1724322179.0095ffe889f2.25.0 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:673f097644d8d0f75897c97e97585c7f0ee76bcdc115d91761bd4ed178309d9c
size 63971
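
Each `runs/.../events.out.tfevents.*` file below is a TensorBoard event log (also stored via Git LFS) from one training launch on host `0095ffe889f2`. A sketch for reading the logged scalars from a downloaded copy; the local path and the `eval/loss` tag name are assumptions based on the Trainer's usual TensorBoard naming:

```python
# Sketch: read scalars from a downloaded TensorBoard event log directory.
# The path and the "eval/loss" tag are assumptions (Trainer's usual naming).
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

ea = EventAccumulator("runs/Aug22_10-22-58_0095ffe889f2")  # assumed local dir
ea.Reload()
print(ea.Tags()["scalars"])           # list available tags, e.g. eval/loss
for event in ea.Scalars("eval/loss"):
    print(event.step, event.value)
```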
runs/Aug22_10-45-22_0095ffe889f2/events.out.tfevents.1724323523.0095ffe889f2.25.1 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1ebda0e34cb83be08cdea5cefc0c57618aa0899ef4d8a26edf843106fe8a0468
size 63971
runs/Aug22_11-07-47_0095ffe889f2/events.out.tfevents.1724324868.0095ffe889f2.25.2 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:874c807cee086c1d72ca7c792e67ef154ce5374ffce5f9abaaa485d6298a0cce
size 63971
runs/Aug22_11-30-11_0095ffe889f2/events.out.tfevents.1724326212.0095ffe889f2.25.3 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:09b46bf89787d3a48dbe1b63242683ee3094a427ed82ee047cebe8e91a9b583b
size 63971
runs/Aug22_11-52-37_0095ffe889f2/events.out.tfevents.1724327557.0095ffe889f2.25.4 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:387e071371e64d86ea139ad0bc8687fdbdfe33bb792e775b4893ac72cb6a7a2e
size 63971
runs/Aug22_12-15-04_0095ffe889f2/events.out.tfevents.1724328905.0095ffe889f2.25.5 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4c0192001d8dfd988cbe3ce2a034448b244ddc7f293be3ff923cff9ec9e3c5a6
size 65796
runs/Aug22_12-38-55_0095ffe889f2/events.out.tfevents.1724330336.0095ffe889f2.25.6 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2c25e26d9946cdcb8f123c1592d9c43e30b1ad92a28160c8f964dc44f7362271
size 65796
runs/Aug22_13-02-35_0095ffe889f2/events.out.tfevents.1724331756.0095ffe889f2.25.7 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1e13c79402a9078061934f07aa961a32f5f9d146a8beb48fa38807e84cd9e704
size 65796
runs/Aug22_13-26-05_0095ffe889f2/events.out.tfevents.1724333166.0095ffe889f2.25.8 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3c0eaabdaab341305c413bc3974bba65caf735c84e33698b2aeb5b2059177a4e
size 65796
runs/Aug22_13-49-43_0095ffe889f2/events.out.tfevents.1724334584.0095ffe889f2.25.9 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f474bf5d09242cb01e28bf66b4550d85ac50b054a94cb39d3d72ac35548d2c46
size 65796
runs/Aug22_14-13-13_0095ffe889f2/events.out.tfevents.1724335995.0095ffe889f2.25.10 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:070c58429c7e4eb4326dbb51bd7484b0946e43800512a2a3d4d0ef5af8b3e48d
size 66891
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:baaf43b206caf2d55f8f6be96edbf246ecc692baaf2afc321a1565e2a9b5a99a
size 5176