salbatarni committed
Commit: c898158
1 parent: 60186a7
Training in progress, step 335

Files changed:
- README.md +173 -39
- model.safetensors +1 -1
- runs/Aug22_10-22-58_0095ffe889f2/events.out.tfevents.1724322179.0095ffe889f2.25.0 +3 -0
- runs/Aug22_10-45-22_0095ffe889f2/events.out.tfevents.1724323523.0095ffe889f2.25.1 +3 -0
- runs/Aug22_11-07-47_0095ffe889f2/events.out.tfevents.1724324868.0095ffe889f2.25.2 +3 -0
- runs/Aug22_11-30-11_0095ffe889f2/events.out.tfevents.1724326212.0095ffe889f2.25.3 +3 -0
- runs/Aug22_11-52-37_0095ffe889f2/events.out.tfevents.1724327557.0095ffe889f2.25.4 +3 -0
- runs/Aug22_12-15-04_0095ffe889f2/events.out.tfevents.1724328905.0095ffe889f2.25.5 +3 -0
- runs/Aug22_12-38-55_0095ffe889f2/events.out.tfevents.1724330336.0095ffe889f2.25.6 +3 -0
- training_args.bin +1 -1
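The commit pushes refreshed weights (model.safetensors) at training step 335. Below is a minimal usage sketch for loading the checkpoint from the Hub; the repository id and the single-output regression-style head are assumptions inferred from the commit author and the card's Loss/Qwk/Mse metrics, not facts stated in the commit.

```python
# Hypothetical usage sketch; repo id and regression-style head are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "salbatarni/bert_baseline_prompt_adherence_task4_fold0"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("Example essay text to score for prompt adherence.",
                   return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # raw model output
print(score)
```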
README.md
CHANGED
@@ -4,20 +4,20 @@ base_model: google-bert/bert-base-cased
 tags:
 - generated_from_trainer
 model-index:
-- name:
+- name: bert_baseline_prompt_adherence_task4_fold0
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# 
+# bert_baseline_prompt_adherence_task4_fold0
 
 This model is a fine-tuned version of [google-bert/bert-base-cased](https://huggingface.co/google-bert/bert-base-cased) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Qwk: 0.
-- Mse: 0.
+- Loss: 0.3287
+- Qwk: 0.7248
+- Mse: 0.3247
 
 ## Model description
 
@@ -42,45 +42,179 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs:
+- num_epochs: 5
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
 |:-------------:|:------:|:----:|:---------------:|:------:|:------:|
-| No log | 0.0299 | 2 |
-| No log | 0.0597 | 4 |
-| No log | 0.0896 | 6 |
-| No log | 0.1194 | 8 | 0.
-| No log | 0.1493 | 10 | 0.
-| No log | 0.1791 | 12 | 0.
-| No log | 0.2090 | 14 | 0.
-| No log | 0.2388 | 16 | 0.
-| No log | 0.2687 | 18 | 0.
-| No log | 0.2985 | 20 | 0.
-| No log | 0.3284 | 22 | 0.
-| No log | 0.3582 | 24 | 0.
-| No log | 0.3881 | 26 | 0.
-| No log | 0.4179 | 28 | 0.
-| No log | 0.4478 | 30 | 0.
-| No log | 0.4776 | 32 | 0.
-| No log | 0.5075 | 34 | 0.
-| No log | 0.5373 | 36 | 0.
-| No log | 0.5672 | 38 | 0.
-| No log | 0.5970 | 40 | 0.
-| No log | 0.6269 | 42 | 0.
-| No log | 0.6567 | 44 | 0.
-| No log | 0.6866 | 46 | 0.
-| No log | 0.7164 | 48 | 0.
-| No log | 0.7463 | 50 | 0.
-| No log | 0.7761 | 52 | 0.
-| No log | 0.8060 | 54 | 0.
-| No log | 0.8358 | 56 | 0.
-| No log | 0.8657 | 58 | 0.
-| No log | 0.8955 | 60 | 0.
-| No log | 0.9254 | 62 | 0.
-| No log | 0.9552 | 64 | 0.
-| No log | 0.9851 | 66 | 0.
+| No log | 0.0299 | 2 | 0.9346 | 0.0 | 0.9328 |
+| No log | 0.0597 | 4 | 0.8762 | 0.3376 | 0.8746 |
+| No log | 0.0896 | 6 | 0.8267 | 0.3789 | 0.8251 |
+| No log | 0.1194 | 8 | 0.7675 | 0.3809 | 0.7660 |
+| No log | 0.1493 | 10 | 0.6965 | 0.3771 | 0.6951 |
+| No log | 0.1791 | 12 | 0.6230 | 0.3658 | 0.6217 |
+| No log | 0.2090 | 14 | 0.5320 | 0.3843 | 0.5292 |
+| No log | 0.2388 | 16 | 0.4894 | 0.4044 | 0.4858 |
+| No log | 0.2687 | 18 | 0.4757 | 0.4580 | 0.4718 |
+| No log | 0.2985 | 20 | 0.4704 | 0.5660 | 0.4665 |
+| No log | 0.3284 | 22 | 0.4632 | 0.5603 | 0.4594 |
+| No log | 0.3582 | 24 | 0.4791 | 0.6068 | 0.4754 |
+| No log | 0.3881 | 26 | 0.4743 | 0.5706 | 0.4708 |
+| No log | 0.4179 | 28 | 0.5106 | 0.4297 | 0.5071 |
+| No log | 0.4478 | 30 | 0.6764 | 0.2664 | 0.6729 |
+| No log | 0.4776 | 32 | 0.5556 | 0.3762 | 0.5522 |
+| No log | 0.5075 | 34 | 0.4133 | 0.5868 | 0.4101 |
+| No log | 0.5373 | 36 | 0.4757 | 0.6707 | 0.4729 |
+| No log | 0.5672 | 38 | 0.5453 | 0.6801 | 0.5429 |
+| No log | 0.5970 | 40 | 0.5164 | 0.7037 | 0.5139 |
+| No log | 0.6269 | 42 | 0.4243 | 0.6483 | 0.4214 |
+| No log | 0.6567 | 44 | 0.4446 | 0.4431 | 0.4413 |
+| No log | 0.6866 | 46 | 0.4980 | 0.3762 | 0.4944 |
+| No log | 0.7164 | 48 | 0.4330 | 0.4366 | 0.4294 |
+| No log | 0.7463 | 50 | 0.3883 | 0.6049 | 0.3849 |
+| No log | 0.7761 | 52 | 0.4350 | 0.6756 | 0.4320 |
+| No log | 0.8060 | 54 | 0.5041 | 0.6724 | 0.5014 |
+| No log | 0.8358 | 56 | 0.4888 | 0.6499 | 0.4858 |
+| No log | 0.8657 | 58 | 0.4641 | 0.5020 | 0.4606 |
+| No log | 0.8955 | 60 | 0.4165 | 0.5062 | 0.4124 |
+| No log | 0.9254 | 62 | 0.4079 | 0.4934 | 0.4035 |
+| No log | 0.9552 | 64 | 0.4116 | 0.4844 | 0.4072 |
+| No log | 0.9851 | 66 | 0.3856 | 0.5151 | 0.3814 |
+| No log | 1.0149 | 68 | 0.3717 | 0.6360 | 0.3680 |
+| No log | 1.0448 | 70 | 0.3834 | 0.6631 | 0.3802 |
+| No log | 1.0746 | 72 | 0.3956 | 0.6770 | 0.3930 |
+| No log | 1.1045 | 74 | 0.4079 | 0.6865 | 0.4056 |
+| No log | 1.1343 | 76 | 0.3904 | 0.6947 | 0.3878 |
+| No log | 1.1642 | 78 | 0.3679 | 0.6706 | 0.3648 |
+| No log | 1.1940 | 80 | 0.3543 | 0.6581 | 0.3506 |
+| No log | 1.2239 | 82 | 0.3671 | 0.5810 | 0.3629 |
+| No log | 1.2537 | 84 | 0.3737 | 0.5696 | 0.3694 |
+| No log | 1.2836 | 86 | 0.3454 | 0.6543 | 0.3412 |
+| No log | 1.3134 | 88 | 0.3739 | 0.7156 | 0.3701 |
+| No log | 1.3433 | 90 | 0.4277 | 0.7367 | 0.4246 |
+| No log | 1.3731 | 92 | 0.3952 | 0.7268 | 0.3919 |
+| No log | 1.4030 | 94 | 0.3791 | 0.7280 | 0.3759 |
+| No log | 1.4328 | 96 | 0.3518 | 0.7048 | 0.3485 |
+| No log | 1.4627 | 98 | 0.3241 | 0.6618 | 0.3202 |
+| No log | 1.4925 | 100 | 0.3180 | 0.6617 | 0.3138 |
+| No log | 1.5224 | 102 | 0.3201 | 0.6608 | 0.3158 |
+| No log | 1.5522 | 104 | 0.3189 | 0.6797 | 0.3147 |
+| No log | 1.5821 | 106 | 0.3211 | 0.7000 | 0.3171 |
+| No log | 1.6119 | 108 | 0.3240 | 0.7032 | 0.3202 |
+| No log | 1.6418 | 110 | 0.3182 | 0.6968 | 0.3144 |
+| No log | 1.6716 | 112 | 0.3193 | 0.6978 | 0.3155 |
+| No log | 1.7015 | 114 | 0.3208 | 0.6902 | 0.3170 |
+| No log | 1.7313 | 116 | 0.3205 | 0.6987 | 0.3168 |
+| No log | 1.7612 | 118 | 0.3217 | 0.6992 | 0.3180 |
+| No log | 1.7910 | 120 | 0.3304 | 0.7156 | 0.3270 |
+| No log | 1.8209 | 122 | 0.3171 | 0.7016 | 0.3136 |
+| No log | 1.8507 | 124 | 0.3119 | 0.6527 | 0.3084 |
+| No log | 1.8806 | 126 | 0.3146 | 0.6219 | 0.3111 |
+| No log | 1.9104 | 128 | 0.3155 | 0.6275 | 0.3121 |
+| No log | 1.9403 | 130 | 0.3193 | 0.6807 | 0.3163 |
+| No log | 1.9701 | 132 | 0.3256 | 0.6961 | 0.3225 |
+| No log | 2.0 | 134 | 0.3204 | 0.6759 | 0.3169 |
+| No log | 2.0299 | 136 | 0.3228 | 0.6927 | 0.3192 |
+| No log | 2.0597 | 138 | 0.3269 | 0.6914 | 0.3231 |
+| No log | 2.0896 | 140 | 0.3358 | 0.6980 | 0.3319 |
+| No log | 2.1194 | 142 | 0.3465 | 0.7191 | 0.3427 |
+| No log | 2.1493 | 144 | 0.3720 | 0.7386 | 0.3684 |
+| No log | 2.1791 | 146 | 0.3638 | 0.7394 | 0.3602 |
+| No log | 2.2090 | 148 | 0.3208 | 0.7083 | 0.3168 |
+| No log | 2.2388 | 150 | 0.3163 | 0.6588 | 0.3120 |
+| No log | 2.2687 | 152 | 0.3150 | 0.6519 | 0.3109 |
+| No log | 2.2985 | 154 | 0.3131 | 0.6978 | 0.3094 |
+| No log | 2.3284 | 156 | 0.3521 | 0.7069 | 0.3491 |
+| No log | 2.3582 | 158 | 0.3785 | 0.7286 | 0.3758 |
+| No log | 2.3881 | 160 | 0.3664 | 0.7315 | 0.3636 |
+| No log | 2.4179 | 162 | 0.3289 | 0.7003 | 0.3256 |
+| No log | 2.4478 | 164 | 0.3151 | 0.6734 | 0.3113 |
+| No log | 2.4776 | 166 | 0.3182 | 0.6390 | 0.3144 |
+| No log | 2.5075 | 168 | 0.3112 | 0.6862 | 0.3077 |
+| No log | 2.5373 | 170 | 0.3238 | 0.7096 | 0.3208 |
+| No log | 2.5672 | 172 | 0.3357 | 0.7111 | 0.3329 |
+| No log | 2.5970 | 174 | 0.3280 | 0.7128 | 0.3250 |
+| No log | 2.6269 | 176 | 0.3196 | 0.7017 | 0.3163 |
+| No log | 2.6567 | 178 | 0.3192 | 0.6981 | 0.3157 |
+| No log | 2.6866 | 180 | 0.3262 | 0.7028 | 0.3228 |
+| No log | 2.7164 | 182 | 0.3285 | 0.7151 | 0.3250 |
+| No log | 2.7463 | 184 | 0.3391 | 0.7314 | 0.3356 |
+| No log | 2.7761 | 186 | 0.3273 | 0.7103 | 0.3235 |
+| No log | 2.8060 | 188 | 0.3270 | 0.7039 | 0.3232 |
+| No log | 2.8358 | 190 | 0.3225 | 0.6956 | 0.3187 |
+| No log | 2.8657 | 192 | 0.3170 | 0.6903 | 0.3133 |
+| No log | 2.8955 | 194 | 0.3177 | 0.7129 | 0.3142 |
+| No log | 2.9254 | 196 | 0.3411 | 0.7205 | 0.3379 |
+| No log | 2.9552 | 198 | 0.3547 | 0.7257 | 0.3518 |
+| No log | 2.9851 | 200 | 0.3820 | 0.7394 | 0.3794 |
+| No log | 3.0149 | 202 | 0.3734 | 0.7259 | 0.3707 |
+| No log | 3.0448 | 204 | 0.3504 | 0.7125 | 0.3477 |
+| No log | 3.0746 | 206 | 0.3331 | 0.7062 | 0.3303 |
+| No log | 3.1045 | 208 | 0.3297 | 0.7034 | 0.3268 |
+| No log | 3.1343 | 210 | 0.3148 | 0.7056 | 0.3116 |
+| No log | 3.1642 | 212 | 0.3094 | 0.6964 | 0.3059 |
+| No log | 3.1940 | 214 | 0.3136 | 0.7063 | 0.3100 |
+| No log | 3.2239 | 216 | 0.3093 | 0.6969 | 0.3055 |
+| No log | 3.2537 | 218 | 0.3175 | 0.7023 | 0.3136 |
+| No log | 3.2836 | 220 | 0.3284 | 0.7090 | 0.3245 |
+| No log | 3.3134 | 222 | 0.3512 | 0.7502 | 0.3474 |
+| No log | 3.3433 | 224 | 0.3781 | 0.7593 | 0.3745 |
+| No log | 3.3731 | 226 | 0.3766 | 0.7631 | 0.3730 |
+| No log | 3.4030 | 228 | 0.3435 | 0.7400 | 0.3397 |
+| No log | 3.4328 | 230 | 0.3188 | 0.6970 | 0.3148 |
+| No log | 3.4627 | 232 | 0.3182 | 0.6951 | 0.3142 |
+| No log | 3.4925 | 234 | 0.3306 | 0.7232 | 0.3268 |
+| No log | 3.5224 | 236 | 0.3722 | 0.7445 | 0.3687 |
+| No log | 3.5522 | 238 | 0.4429 | 0.7601 | 0.4397 |
+| No log | 3.5821 | 240 | 0.4691 | 0.7670 | 0.4660 |
+| No log | 3.6119 | 242 | 0.4352 | 0.7723 | 0.4317 |
+| No log | 3.6418 | 244 | 0.3737 | 0.7367 | 0.3698 |
+| No log | 3.6716 | 246 | 0.3465 | 0.7244 | 0.3422 |
+| No log | 3.7015 | 248 | 0.3303 | 0.7140 | 0.3257 |
+| No log | 3.7313 | 250 | 0.3240 | 0.7111 | 0.3192 |
+| No log | 3.7612 | 252 | 0.3242 | 0.7154 | 0.3193 |
+| No log | 3.7910 | 254 | 0.3226 | 0.7155 | 0.3176 |
+| No log | 3.8209 | 256 | 0.3210 | 0.7185 | 0.3162 |
+| No log | 3.8507 | 258 | 0.3214 | 0.7133 | 0.3167 |
+| No log | 3.8806 | 260 | 0.3333 | 0.7208 | 0.3289 |
+| No log | 3.9104 | 262 | 0.3389 | 0.7181 | 0.3347 |
+| No log | 3.9403 | 264 | 0.3283 | 0.7208 | 0.3240 |
+| No log | 3.9701 | 266 | 0.3182 | 0.7157 | 0.3137 |
+| No log | 4.0 | 268 | 0.3059 | 0.7117 | 0.3012 |
+| No log | 4.0299 | 270 | 0.3039 | 0.6871 | 0.2992 |
+| No log | 4.0597 | 272 | 0.3037 | 0.6848 | 0.2990 |
+| No log | 4.0896 | 274 | 0.3022 | 0.7070 | 0.2977 |
+| No log | 4.1194 | 276 | 0.3065 | 0.7124 | 0.3022 |
+| No log | 4.1493 | 278 | 0.3156 | 0.7123 | 0.3116 |
+| No log | 4.1791 | 280 | 0.3377 | 0.7243 | 0.3341 |
+| No log | 4.2090 | 282 | 0.3639 | 0.7213 | 0.3605 |
+| No log | 4.2388 | 284 | 0.3704 | 0.7328 | 0.3671 |
+| No log | 4.2687 | 286 | 0.3574 | 0.7297 | 0.3539 |
+| No log | 4.2985 | 288 | 0.3356 | 0.7157 | 0.3318 |
+| No log | 4.3284 | 290 | 0.3173 | 0.7171 | 0.3133 |
+| No log | 4.3582 | 292 | 0.3088 | 0.7137 | 0.3046 |
+| No log | 4.3881 | 294 | 0.3061 | 0.7066 | 0.3018 |
+| No log | 4.4179 | 296 | 0.3072 | 0.7126 | 0.3029 |
+| No log | 4.4478 | 298 | 0.3102 | 0.7178 | 0.3060 |
+| No log | 4.4776 | 300 | 0.3153 | 0.7152 | 0.3111 |
+| No log | 4.5075 | 302 | 0.3223 | 0.7218 | 0.3182 |
+| No log | 4.5373 | 304 | 0.3316 | 0.7264 | 0.3276 |
+| No log | 4.5672 | 306 | 0.3417 | 0.7355 | 0.3378 |
+| No log | 4.5970 | 308 | 0.3499 | 0.7352 | 0.3462 |
+| No log | 4.6269 | 310 | 0.3548 | 0.7438 | 0.3511 |
+| No log | 4.6567 | 312 | 0.3536 | 0.7427 | 0.3499 |
+| No log | 4.6866 | 314 | 0.3519 | 0.7407 | 0.3482 |
+| No log | 4.7164 | 316 | 0.3472 | 0.7299 | 0.3434 |
+| No log | 4.7463 | 318 | 0.3398 | 0.7326 | 0.3359 |
+| No log | 4.7761 | 320 | 0.3326 | 0.7288 | 0.3286 |
+| No log | 4.8060 | 322 | 0.3284 | 0.7268 | 0.3245 |
+| No log | 4.8358 | 324 | 0.3274 | 0.7248 | 0.3234 |
+| No log | 4.8657 | 326 | 0.3274 | 0.7248 | 0.3235 |
+| No log | 4.8955 | 328 | 0.3284 | 0.7248 | 0.3244 |
+| No log | 4.9254 | 330 | 0.3289 | 0.7268 | 0.3250 |
+| No log | 4.9552 | 332 | 0.3288 | 0.7268 | 0.3248 |
+| No log | 4.9851 | 334 | 0.3287 | 0.7248 | 0.3247 |
 
 
 ### Framework versions
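The card above reports Qwk (quadratic weighted kappa) alongside MSE on the validation set. A minimal sketch of computing those two metrics with scikit-learn follows; rounding the regression outputs to integer scores before computing kappa is an assumption, since the card does not document the evaluation code.

```python
# Hypothetical evaluation sketch for the Qwk and Mse columns in the table above.
# Toy data; rounding raw outputs to integer scores is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 2, 3])            # gold prompt-adherence scores (toy)
y_pred = np.array([2.2, 2.8, 1.1, 1.9, 3.4])  # raw model outputs (toy)

mse = mean_squared_error(y_true, y_pred)
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"Mse: {mse:.4f}  Qwk: {qwk:.4f}")
```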
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:7e137f852787bb963d4a8ba8b8925658c1045bb79952d1b5bee782471c99fb57
 size 433267692
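The entry above is a Git LFS pointer: the repository tracks only the object's sha256 and byte size, while the weights themselves live in LFS storage. A small sketch of checking a downloaded model.safetensors against this pointer (the local path is an assumption):

```python
# Sketch: verify a downloaded file against the Git LFS pointer shown above.
# The local path is assumed; the oid and size come from the pointer itself.
import hashlib
import os

path = "model.safetensors"  # assumed local copy of the LFS object
expected_oid = "7e137f852787bb963d4a8ba8b8925658c1045bb79952d1b5bee782471c99fb57"
expected_size = 433267692

sha = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha.update(chunk)

assert os.path.getsize(path) == expected_size, "size mismatch"
assert sha.hexdigest() == expected_oid, "sha256 mismatch"
print("pointer matches local file")
```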
runs/Aug22_10-22-58_0095ffe889f2/events.out.tfevents.1724322179.0095ffe889f2.25.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:673f097644d8d0f75897c97e97585c7f0ee76bcdc115d91761bd4ed178309d9c
+size 63971
runs/Aug22_10-45-22_0095ffe889f2/events.out.tfevents.1724323523.0095ffe889f2.25.1
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1ebda0e34cb83be08cdea5cefc0c57618aa0899ef4d8a26edf843106fe8a0468
+size 63971
runs/Aug22_11-07-47_0095ffe889f2/events.out.tfevents.1724324868.0095ffe889f2.25.2
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:874c807cee086c1d72ca7c792e67ef154ce5374ffce5f9abaaa485d6298a0cce
+size 63971
runs/Aug22_11-30-11_0095ffe889f2/events.out.tfevents.1724326212.0095ffe889f2.25.3
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:09b46bf89787d3a48dbe1b63242683ee3094a427ed82ee047cebe8e91a9b583b
+size 63971
runs/Aug22_11-52-37_0095ffe889f2/events.out.tfevents.1724327557.0095ffe889f2.25.4
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:387e071371e64d86ea139ad0bc8687fdbdfe33bb792e775b4893ac72cb6a7a2e
+size 63971
runs/Aug22_12-15-04_0095ffe889f2/events.out.tfevents.1724328905.0095ffe889f2.25.5
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4c0192001d8dfd988cbe3ce2a034448b244ddc7f293be3ff923cff9ec9e3c5a6
+size 65796
runs/Aug22_12-38-55_0095ffe889f2/events.out.tfevents.1724330336.0095ffe889f2.25.6
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2c25e26d9946cdcb8f123c1592d9c43e30b1ad92a28160c8f964dc44f7362271
+size 65796
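Each runs/*/events.out.tfevents.* file added above is a TensorBoard event log written during training. A sketch of listing the logged scalars, assuming the tensorboard package is installed and one run directory has been downloaded locally (the scalar tag names are assumptions):

```python
# Sketch: inspect one of the TensorBoard event files added in this commit.
# Assumes a local copy of the run directory; tag names are not guaranteed.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

run_dir = "runs/Aug22_12-38-55_0095ffe889f2"  # one of the run directories above
acc = EventAccumulator(run_dir)
acc.Reload()  # parse the events.out.tfevents.* file(s)

scalar_tags = acc.Tags()["scalars"]
print(scalar_tags)  # e.g. eval/loss, eval/qwk, eval/mse (names assumed)
for event in acc.Scalars(scalar_tags[0]):
    print(event.step, event.value)
```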
training_args.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:236bae5127e8a072c495d513cb43c00eae51bfc7cd81dd2a5bbd42d700f8a76a
 size 5176
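training_args.bin is the pickled TrainingArguments object the Trainer saves alongside the weights; the hyperparameters listed in the README (seed 42, Adam betas=(0.9,0.999), linear scheduler, 5 epochs) come from it. A sketch of inspecting it, assuming a local copy and a transformers version compatible with the pickle:

```python
# Sketch: inspect the pickled TrainingArguments stored in training_args.bin.
# Assumes a downloaded local copy; attribute values are read, not asserted.
import torch

args = torch.load("training_args.bin", weights_only=False)  # TrainingArguments
print(type(args).__name__)
print(args.seed, args.num_train_epochs, args.lr_scheduler_type)
print(args.learning_rate, args.per_device_train_batch_size)
```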