DouglasBraga committed on
Commit
18b86e4
1 Parent(s): 5cb7f79

End of training

Files changed (5)
  1. README.md +3 -3
  2. all_results.json +13 -0
  3. eval_results.json +8 -0
  4. train_results.json +8 -0
  5. trainer_state.json +2316 -0
README.md CHANGED
@@ -23,7 +23,7 @@ model-index:
   metrics:
   - name: Accuracy
     type: accuracy
-    value: 0.8610328638497653
+    value: 0.8892018779342723
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -33,8 +33,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.7581
-- Accuracy: 0.8610
+- Loss: 0.3631
+- Accuracy: 0.8892
 
 ## Model description
 
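The updated card reports eval loss 0.3631 and accuracy 0.8892 for this fine-tuned Swin-Tiny image classifier. A minimal inference sketch, assuming the checkpoint is published under the repo id below (a guess built from the committer name and the checkpoint folder in trainer_state.json); substitute the actual model id:

```python
# Minimal inference sketch (not part of this commit). Loads the fine-tuned
# checkpoint and classifies a single image. The repo id is an assumption;
# replace it with the real one if it differs.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "DouglasBraga/swin-tiny-patch4-window7-224-finetuned-leukemia-08-2024.v1.1"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.png").convert("RGB")  # any input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```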
all_results.json ADDED
@@ -0,0 +1,13 @@
+ {
+   "epoch": 9.984,
+   "eval_accuracy": 0.8892018779342723,
+   "eval_loss": 0.3630879521369934,
+   "eval_runtime": 16.4219,
+   "eval_samples_per_second": 64.852,
+   "eval_steps_per_second": 2.07,
+   "total_flos": 9.926487761391452e+18,
+   "train_loss": 0.15716634974934351,
+   "train_runtime": 142469.0271,
+   "train_samples_per_second": 2.808,
+   "train_steps_per_second": 0.022
+ }
eval_results.json ADDED
@@ -0,0 +1,8 @@
+ {
+   "epoch": 9.984,
+   "eval_accuracy": 0.8892018779342723,
+   "eval_loss": 0.3630879521369934,
+   "eval_runtime": 16.4219,
+   "eval_samples_per_second": 64.852,
+   "eval_steps_per_second": 2.07
+ }
train_results.json ADDED
@@ -0,0 +1,8 @@
+ {
+   "epoch": 9.984,
+   "total_flos": 9.926487761391452e+18,
+   "train_loss": 0.15716634974934351,
+   "train_runtime": 142469.0271,
+   "train_samples_per_second": 2.808,
+   "train_steps_per_second": 0.022
+ }
trainer_state.json ADDED
@@ -0,0 +1,2316 @@
+ {
+   "best_metric": 0.8892018779342723,
+   "best_model_checkpoint": "swin-tiny-patch4-window7-224-finetuned-leukemia-08-2024.v1.1\\checkpoint-2187",
+   "epoch": 9.984,
+   "eval_steps": 500,
+   "global_step": 3120,
+   "is_hyper_param_search": false,
+   "is_local_process_zero": true,
+   "is_world_process_zero": true,
+   "log_history": [
+     {
+       "epoch": 0.032,
+       "grad_norm": 2.98797607421875,
+       "learning_rate": 1.6025641025641025e-06,
+       "loss": 0.712,
+       "step": 10
+     },
...
+     {
+       "epoch": 0.9984,
+       "eval_accuracy": 0.6910798122065728,
+       "eval_loss": 0.9902069568634033,
+       "eval_runtime": 15.0205,
+       "eval_samples_per_second": 70.903,
+       "eval_steps_per_second": 2.264,
+       "step": 312
+     },
...
+     {
+       "epoch": 2.0,
+       "eval_accuracy": 0.7690140845070422,
+       "eval_loss": 0.5525702238082886,
+       "eval_runtime": 14.9955,
+       "eval_samples_per_second": 71.021,
+       "eval_steps_per_second": 2.267,
+       "step": 625
+     },
...
+     {
+       "epoch": 2.9984,
+       "eval_accuracy": 0.8018779342723005,
+       "eval_loss": 0.560497522354126,
+       "eval_runtime": 14.76,
+       "eval_samples_per_second": 72.155,
+       "eval_steps_per_second": 2.304,
+       "step": 937
+     },
...
+     {
+       "epoch": 4.0,
+       "eval_accuracy": 0.8497652582159625,
+       "eval_loss": 0.429149866104126,
+       "eval_runtime": 14.7304,
+       "eval_samples_per_second": 72.299,
+       "eval_steps_per_second": 2.308,
+       "step": 1250
+     },
...
+     {
+       "epoch": 4.9984,
+       "eval_accuracy": 0.8666666666666667,
+       "eval_loss": 0.39110323786735535,
+       "eval_runtime": 14.8044,
+       "eval_samples_per_second": 71.938,
+       "eval_steps_per_second": 2.297,
+       "step": 1562
+     },
...
+     {
+       "epoch": 6.0,
+       "eval_accuracy": 0.8028169014084507,
+       "eval_loss": 0.8593423366546631,
+       "eval_runtime": 26.4195,
+       "eval_samples_per_second": 40.311,
+       "eval_steps_per_second": 1.287,
+       "step": 1875
+     },
...
+     {
+       "epoch": 6.9984,
+       "eval_accuracy": 0.8892018779342723,
+       "eval_loss": 0.3630879521369934,
+       "eval_runtime": 14.8354,
+       "eval_samples_per_second": 71.788,
+       "eval_steps_per_second": 2.292,
+       "step": 2187
+     },
...
+     {
+       "epoch": 8.0,
+       "eval_accuracy": 0.8338028169014085,
+       "eval_loss": 0.7270519137382507,
+       "eval_runtime": 16.0951,
+       "eval_samples_per_second": 66.169,
+       "eval_steps_per_second": 2.112,
+       "step": 2500
+     },
...
+     {
+       "epoch": 8.9984,
+       "eval_accuracy": 0.8826291079812206,
+       "eval_loss": 0.46553394198417664,
+       "eval_runtime": 16.3312,
+       "eval_samples_per_second": 65.213,
+       "eval_steps_per_second": 2.082,
+       "step": 2812
+     },
...
+     {
+       "epoch": 9.984,
+       "grad_norm": 3.3344645500183105,
+       "learning_rate": 0.0,
+       "loss": 0.0293,
+       "step": 3120
+     },
+     {
+       "epoch": 9.984,
+       "eval_accuracy": 0.8610328638497653,
+       "eval_loss": 0.7580540180206299,
+       "eval_runtime": 16.5093,
+       "eval_samples_per_second": 64.509,
+       "eval_steps_per_second": 2.059,
+       "step": 3120
+     },
+     {
+       "epoch": 9.984,
+       "step": 3120,
+       "total_flos": 9.926487761391452e+18,
+       "train_loss": 0.15716634974934351,
+       "train_runtime": 142469.0271,
+       "train_samples_per_second": 2.808,
+       "train_steps_per_second": 0.022
+     }
+   ],
+   "logging_steps": 10,
+   "max_steps": 3120,
+   "num_input_tokens_seen": 0,
+   "num_train_epochs": 10,
+   "save_steps": 500,
+   "stateful_callbacks": {
+     "TrainerControl": {
+       "args": {
+         "should_epoch_stop": false,
+         "should_evaluate": false,
+         "should_log": false,
+         "should_save": true,
+         "should_training_stop": true
+       },
+       "attributes": {}
+     }
+   },
+   "total_flos": 9.926487761391452e+18,
+   "train_batch_size": 32,
+   "trial_name": null,
+   "trial_params": null
+ }
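The trailer fields above pin down the run configuration: 10 epochs, batch size 32, logging every 10 steps, per-epoch evaluation, and the best checkpoint at step 2187. A hedged reconstruction of `TrainingArguments` consistent with those values follows; anything not read directly from trainer_state.json is an assumption, not the author's actual training script.

```python
# Hedged reconstruction of the training configuration implied by trainer_state.json.
# Values marked "read" come from the file; the rest are plausible assumptions.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-leukemia-08-2024.v1.1",  # read: best_model_checkpoint path
    per_device_train_batch_size=32,   # read: train_batch_size
    num_train_epochs=10,              # read: num_train_epochs
    logging_steps=10,                 # read: logging_steps
    learning_rate=5e-5,               # inferred: logged LR peaks near 5e-5 around step 312
    warmup_ratio=0.1,                 # inferred: warmup over ~312 of 3120 steps
    lr_scheduler_type="linear",       # inferred: LR decays linearly to 0.0 at the last step
    eval_strategy="epoch",            # inferred: evals logged at each epoch boundary
    save_strategy="epoch",            # inferred: best checkpoint saved at an epoch boundary (step 2187)
    load_best_model_at_end=True,      # inferred: best_metric / best_model_checkpoint are tracked
    metric_for_best_model="accuracy", # assumed: the card and best_metric track eval accuracy
)
```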