Sarah Tariq committed on
Commit c62a506
1 Parent(s): 2ad0c9d

End of training
README.md ADDED
@@ -0,0 +1,646 @@
---
license: apache-2.0
library_name: peft
tags:
- generated_from_trainer
base_model: google/flan-t5-large
model-index:
- name: t5-large-dialogue-classification
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# t5-large-dialogue-classification

This model is a fine-tuned version of [google/flan-t5-large](https://huggingface.co/google/flan-t5-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2878
- Classification Report:

                 precision    recall  f1-score   support

        ALLERGY     1.0000    0.2500    0.4000         4
     ASSESSMENT     0.0000    0.0000    0.0000         4
             CC     0.3750    0.7500    0.5000         4
      DIAGNOSIS     0.0000    0.0000    0.0000         1
    DISPOSITION     0.0000    0.0000    0.0000         2
       EDCOURSE     0.0000    0.0000    0.0000         3
     EVALUATION     0.0000    0.0000    0.0000         0
           EXAM     0.3333    1.0000    0.5000         1
      FAM/SOCHX     0.8462    1.0000    0.9167        22
          GENHX     0.8500    0.8500    0.8500        20
          GYNHX     0.0000    0.0000    0.0000         1
        IMAGING     0.0000    0.0000    0.0000         1
  IMMUNIZATIONS     1.0000    1.0000    1.0000         1
           LABS     0.0000    0.0000    0.0000         1
    MEDICATIONS     0.7778    1.0000    0.8750         7
  OTHER_HISTORY     0.0000    0.0000    0.0000         1
  PASTMEDICALHX     0.4000    1.0000    0.5714         4
   PASTSURGICAL     1.0000    1.0000    1.0000         8
           PLAN     0.6667    0.6667    0.6667         3
     PROCEDURES     0.0000    0.0000    0.0000         1
            ROS     0.8750    0.6364    0.7368        11

       accuracy                         0.7300       100
      macro avg     0.3869    0.4359    0.3817       100
   weighted avg     0.6912    0.7300    0.6878       100

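The gap between the 0.7300 accuracy and the 0.3817 macro-average F1 comes from how the averages are defined: the macro average is an unweighted mean over all 21 labels, so the many rare labels scored 0.0000 drag it down, while the weighted average scales each label's F1 by its support. A minimal sketch recomputing both from the per-class rows above (plain Python, no dependencies):

```python
# (f1-score, support) pairs copied from the evaluation report above.
per_class = {
    "ALLERGY": (0.4000, 4), "ASSESSMENT": (0.0000, 4), "CC": (0.5000, 4),
    "DIAGNOSIS": (0.0000, 1), "DISPOSITION": (0.0000, 2), "EDCOURSE": (0.0000, 3),
    "EVALUATION": (0.0000, 0), "EXAM": (0.5000, 1), "FAM/SOCHX": (0.9167, 22),
    "GENHX": (0.8500, 20), "GYNHX": (0.0000, 1), "IMAGING": (0.0000, 1),
    "IMMUNIZATIONS": (1.0000, 1), "LABS": (0.0000, 1), "MEDICATIONS": (0.8750, 7),
    "OTHER_HISTORY": (0.0000, 1), "PASTMEDICALHX": (0.5714, 4),
    "PASTSURGICAL": (1.0000, 8), "PLAN": (0.6667, 3), "PROCEDURES": (0.0000, 1),
    "ROS": (0.7368, 11),
}

# Macro average: unweighted mean over all 21 labels, so the rare labels with
# F1 = 0 pull it far below the overall accuracy.
macro_f1 = sum(f1 for f1, _ in per_class.values()) / len(per_class)

# Weighted average: each label's F1 weighted by its support (100 eval examples).
total_support = sum(s for _, s in per_class.values())
weighted_f1 = sum(f1 * s for f1, s in per_class.values()) / total_support

print(round(macro_f1, 4), round(weighted_f1, 4))  # 0.3817 0.6878, matching the report
```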

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 7
- eval_batch_size: 7
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

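With a linear scheduler, 20 epochs, and 172 optimizer steps per epoch (3,440 total, matching the step counts in the training log), the learning rate decays from 1e-4 toward 0 over the run. A minimal sketch of the schedule these settings imply, assuming no warmup (no warmup setting is listed above):

```python
# Linear LR decay implied by the hyperparameters: 1e-4 down to 0 over 3440 steps.
BASE_LR = 1e-4
STEPS_PER_EPOCH = 172
TOTAL_STEPS = STEPS_PER_EPOCH * 20  # 3440, the final step in the training log

def linear_lr(step: int) -> float:
    """Learning rate after `step` optimizer updates (linear decay, no warmup)."""
    remaining = max(0, TOTAL_STEPS - step)
    return BASE_LR * (remaining / TOTAL_STEPS)

print(linear_lr(0))     # 0.0001 at the start of training
print(linear_lr(1720))  # 5e-05 halfway through (end of epoch 10)
print(linear_lr(3440))  # 0.0 at the end of epoch 20
```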
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro Avg F1 | Weighted Avg F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------------:|:---------------:|
| No log        | 1.0   | 172  | 0.3878          | 0.6100   | 0.3005       | 0.5598          |
| No log        | 2.0   | 344  | 0.3626          | 0.6100   | 0.2841       | 0.5814          |
| 0.2727        | 3.0   | 516  | 0.3517          | 0.6300   | 0.3078       | 0.5750          |
| 0.2727        | 4.0   | 688  | 0.3393          | 0.6600   | 0.3436       | 0.6162          |
| 0.2727        | 5.0   | 860  | 0.3178          | 0.6600   | 0.3402       | 0.6183          |
| 0.2182        | 6.0   | 1032 | 0.3011          | 0.6800   | 0.3503       | 0.6340          |
| 0.2182        | 7.0   | 1204 | 0.3220          | 0.6700   | 0.3366       | 0.6335          |
| 0.2182        | 8.0   | 1376 | 0.3121          | 0.7000   | 0.3628       | 0.6660          |
| 0.1965        | 9.0   | 1548 | 0.3020          | 0.7100   | 0.3698       | 0.6730          |
| 0.1965        | 10.0  | 1720 | 0.3122          | 0.7200   | 0.3860       | 0.6836          |
| 0.1965        | 11.0  | 1892 | 0.3050          | 0.7100   | 0.3815       | 0.6751          |
| 0.1834        | 12.0  | 2064 | 0.2901          | 0.7200   | 0.3874       | 0.6815          |
| 0.1834        | 13.0  | 2236 | 0.2994          | 0.7400   | 0.3981       | 0.6914          |
| 0.1834        | 14.0  | 2408 | 0.2951          | 0.7300   | 0.3809       | 0.6833          |
| 0.1731        | 15.0  | 2580 | 0.2977          | 0.7300   | 0.3730       | 0.6890          |
| 0.1731        | 16.0  | 2752 | 0.2936          | 0.7200   | 0.3664       | 0.6811          |
| 0.1731        | 17.0  | 2924 | 0.2849          | 0.7300   | 0.3805       | 0.6939          |
| 0.1546        | 18.0  | 3096 | 0.2875          | 0.7200   | 0.3684       | 0.6760          |
| 0.1546        | 19.0  | 3268 | 0.2885          | 0.7300   | 0.3817       | 0.6878          |
| 0.1546        | 20.0  | 3440 | 0.2878          | 0.7300   | 0.3817       | 0.6878          |

The trainer also logged a full 21-class precision/recall/F1 report at every evaluation step; the Accuracy, Macro Avg F1, and Weighted Avg F1 columns above summarize those reports, and the final-epoch (20.0) report is identical to the evaluation report at the top of this card.

### Framework versions

- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2
adapter_config.json ADDED
@@ -0,0 +1,27 @@
{
  "alpha_pattern": {},
  "auto_mapping": null,
  "base_model_name_or_path": "google/flan-t5-large",
  "bias": "none",
  "fan_in_fan_out": false,
  "inference_mode": true,
  "init_lora_weights": true,
  "layers_pattern": null,
  "layers_to_transform": null,
  "loftq_config": {},
  "lora_alpha": 16,
  "lora_dropout": 0.05,
  "megatron_config": null,
  "megatron_core": "megatron.core",
  "modules_to_save": null,
  "peft_type": "LORA",
  "r": 4,
  "rank_pattern": {},
  "revision": null,
  "target_modules": [
    "v",
    "q"
  ],
  "task_type": "SEQ_2_SEQ_LM",
  "use_rslora": false
}
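A back-of-the-envelope check that this config matches the ~4.76 MB adapter file: with `r: 4` and `target_modules: ["q", "v"]`, each adapted projection gets two low-rank factors, A (r × d) and B (d × r). The model dimensions below are assumptions about google/flan-t5-large's published config (d_model = 1024, 24 encoder blocks, 24 decoder blocks, with decoder blocks carrying both self- and cross-attention):

```python
# Estimate the LoRA trainable-parameter count for this adapter config.
d_model = 1024  # assumed flan-t5-large hidden size
r = 4           # from adapter_config.json

attention_modules = 24 + 24 * 2   # encoder self-attn + decoder self- and cross-attn
matrices_per_module = 2           # the "q" and "v" projections
params_per_matrix = r * (d_model + d_model)  # A is (r x d), B is (d x r)

total_lora_params = attention_modules * matrices_per_module * params_per_matrix
print(total_lora_params)      # 1179648 trainable parameters
print(total_lora_params * 4)  # 4718592 bytes at float32
```

At 4 bytes per float32 parameter this gives roughly 4.72 MB of weights, consistent with the 4,758,888-byte `adapter_model.safetensors` below (the difference is plausibly file metadata).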
adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7506047785b3fb7513c7300963870c1c7e25570224022e5bdbe295b6d6728ee0
size 4758888
runs/Feb24_23-42-43_7d7d10a092a9/events.out.tfevents.1708818168.7d7d10a092a9.1833.0 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9e5cb750268b07965597cb7a788fba5c77f8b7eb949ace09309bd9f96f4430c9
size 5938
runs/Feb24_23-53-59_7d7d10a092a9/events.out.tfevents.1708818843.7d7d10a092a9.1833.1 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b2d579a4415bb4b15463fbc5fd41ab2930023af0cad411f46b321f52ef1e7d67
size 11413
special_tokens_map.json ADDED
@@ -0,0 +1,125 @@
{
  "additional_special_tokens": [
    "<extra_id_0>", "<extra_id_1>", "<extra_id_2>", "<extra_id_3>", "<extra_id_4>",
    "<extra_id_5>", "<extra_id_6>", "<extra_id_7>", "<extra_id_8>", "<extra_id_9>",
    "<extra_id_10>", "<extra_id_11>", "<extra_id_12>", "<extra_id_13>", "<extra_id_14>",
    "<extra_id_15>", "<extra_id_16>", "<extra_id_17>", "<extra_id_18>", "<extra_id_19>",
    "<extra_id_20>", "<extra_id_21>", "<extra_id_22>", "<extra_id_23>", "<extra_id_24>",
    "<extra_id_25>", "<extra_id_26>", "<extra_id_27>", "<extra_id_28>", "<extra_id_29>",
    "<extra_id_30>", "<extra_id_31>", "<extra_id_32>", "<extra_id_33>", "<extra_id_34>",
    "<extra_id_35>", "<extra_id_36>", "<extra_id_37>", "<extra_id_38>", "<extra_id_39>",
    "<extra_id_40>", "<extra_id_41>", "<extra_id_42>", "<extra_id_43>", "<extra_id_44>",
    "<extra_id_45>", "<extra_id_46>", "<extra_id_47>", "<extra_id_48>", "<extra_id_49>",
    "<extra_id_50>", "<extra_id_51>", "<extra_id_52>", "<extra_id_53>", "<extra_id_54>",
    "<extra_id_55>", "<extra_id_56>", "<extra_id_57>", "<extra_id_58>", "<extra_id_59>",
    "<extra_id_60>", "<extra_id_61>", "<extra_id_62>", "<extra_id_63>", "<extra_id_64>",
    "<extra_id_65>", "<extra_id_66>", "<extra_id_67>", "<extra_id_68>", "<extra_id_69>",
    "<extra_id_70>", "<extra_id_71>", "<extra_id_72>", "<extra_id_73>", "<extra_id_74>",
    "<extra_id_75>", "<extra_id_76>", "<extra_id_77>", "<extra_id_78>", "<extra_id_79>",
    "<extra_id_80>", "<extra_id_81>", "<extra_id_82>", "<extra_id_83>", "<extra_id_84>",
    "<extra_id_85>", "<extra_id_86>", "<extra_id_87>", "<extra_id_88>", "<extra_id_89>",
    "<extra_id_90>", "<extra_id_91>", "<extra_id_92>", "<extra_id_93>", "<extra_id_94>",
    "<extra_id_95>", "<extra_id_96>", "<extra_id_97>", "<extra_id_98>", "<extra_id_99>"
  ],
  "eos_token": {
    "content": "</s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<pad>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "<unk>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
spiece.model ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d60acb128cf7b7f2536e8f38a5b18a05535c9e14c7a355904270e15b0945ea86
size 791656
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,938 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<pad>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32000": {
+       "content": "<extra_id_99>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32001": {
+       "content": "<extra_id_98>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32002": {
+       "content": "<extra_id_97>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32003": {
+       "content": "<extra_id_96>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32004": {
+       "content": "<extra_id_95>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32005": {
+       "content": "<extra_id_94>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32006": {
+       "content": "<extra_id_93>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32007": {
+       "content": "<extra_id_92>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32008": {
+       "content": "<extra_id_91>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32009": {
+       "content": "<extra_id_90>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32010": {
+       "content": "<extra_id_89>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32011": {
+       "content": "<extra_id_88>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32012": {
+       "content": "<extra_id_87>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32013": {
+       "content": "<extra_id_86>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32014": {
+       "content": "<extra_id_85>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32015": {
+       "content": "<extra_id_84>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32016": {
+       "content": "<extra_id_83>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32017": {
+       "content": "<extra_id_82>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32018": {
+       "content": "<extra_id_81>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32019": {
+       "content": "<extra_id_80>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32020": {
+       "content": "<extra_id_79>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32021": {
+       "content": "<extra_id_78>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32022": {
+       "content": "<extra_id_77>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32023": {
+       "content": "<extra_id_76>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32024": {
+       "content": "<extra_id_75>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32025": {
+       "content": "<extra_id_74>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32026": {
+       "content": "<extra_id_73>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32027": {
+       "content": "<extra_id_72>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32028": {
+       "content": "<extra_id_71>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32029": {
+       "content": "<extra_id_70>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32030": {
+       "content": "<extra_id_69>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32031": {
+       "content": "<extra_id_68>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32032": {
+       "content": "<extra_id_67>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32033": {
+       "content": "<extra_id_66>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32034": {
+       "content": "<extra_id_65>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32035": {
+       "content": "<extra_id_64>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32036": {
+       "content": "<extra_id_63>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32037": {
+       "content": "<extra_id_62>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32038": {
+       "content": "<extra_id_61>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32039": {
+       "content": "<extra_id_60>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32040": {
+       "content": "<extra_id_59>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32041": {
+       "content": "<extra_id_58>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32042": {
+       "content": "<extra_id_57>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32043": {
+       "content": "<extra_id_56>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32044": {
+       "content": "<extra_id_55>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32045": {
+       "content": "<extra_id_54>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32046": {
+       "content": "<extra_id_53>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32047": {
+       "content": "<extra_id_52>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32048": {
+       "content": "<extra_id_51>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32049": {
+       "content": "<extra_id_50>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32050": {
+       "content": "<extra_id_49>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32051": {
+       "content": "<extra_id_48>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32052": {
+       "content": "<extra_id_47>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32053": {
+       "content": "<extra_id_46>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32054": {
+       "content": "<extra_id_45>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32055": {
+       "content": "<extra_id_44>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32056": {
+       "content": "<extra_id_43>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32057": {
+       "content": "<extra_id_42>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32058": {
+       "content": "<extra_id_41>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32059": {
+       "content": "<extra_id_40>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32060": {
+       "content": "<extra_id_39>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32061": {
+       "content": "<extra_id_38>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32062": {
+       "content": "<extra_id_37>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32063": {
+       "content": "<extra_id_36>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32064": {
+       "content": "<extra_id_35>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32065": {
+       "content": "<extra_id_34>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32066": {
+       "content": "<extra_id_33>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32067": {
+       "content": "<extra_id_32>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32068": {
+       "content": "<extra_id_31>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32069": {
+       "content": "<extra_id_30>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32070": {
+       "content": "<extra_id_29>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32071": {
+       "content": "<extra_id_28>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32072": {
+       "content": "<extra_id_27>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32073": {
+       "content": "<extra_id_26>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32074": {
+       "content": "<extra_id_25>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32075": {
+       "content": "<extra_id_24>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32076": {
+       "content": "<extra_id_23>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32077": {
+       "content": "<extra_id_22>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32078": {
+       "content": "<extra_id_21>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32079": {
+       "content": "<extra_id_20>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32080": {
+       "content": "<extra_id_19>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32081": {
+       "content": "<extra_id_18>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32082": {
+       "content": "<extra_id_17>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32083": {
+       "content": "<extra_id_16>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32084": {
+       "content": "<extra_id_15>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32085": {
+       "content": "<extra_id_14>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32086": {
+       "content": "<extra_id_13>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32087": {
+       "content": "<extra_id_12>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32088": {
+       "content": "<extra_id_11>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32089": {
+       "content": "<extra_id_10>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32090": {
+       "content": "<extra_id_9>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32091": {
+       "content": "<extra_id_8>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32092": {
+       "content": "<extra_id_7>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32093": {
+       "content": "<extra_id_6>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32094": {
+       "content": "<extra_id_5>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32095": {
+       "content": "<extra_id_4>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32096": {
+       "content": "<extra_id_3>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32097": {
+       "content": "<extra_id_2>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32098": {
+       "content": "<extra_id_1>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32099": {
+       "content": "<extra_id_0>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "additional_special_tokens": [
+     "<extra_id_0>",
+     "<extra_id_1>",
+     "<extra_id_2>",
+     "<extra_id_3>",
+     "<extra_id_4>",
+     "<extra_id_5>",
+     "<extra_id_6>",
+     "<extra_id_7>",
+     "<extra_id_8>",
+     "<extra_id_9>",
+     "<extra_id_10>",
+     "<extra_id_11>",
+     "<extra_id_12>",
+     "<extra_id_13>",
+     "<extra_id_14>",
+     "<extra_id_15>",
+     "<extra_id_16>",
+     "<extra_id_17>",
+     "<extra_id_18>",
+     "<extra_id_19>",
+     "<extra_id_20>",
+     "<extra_id_21>",
+     "<extra_id_22>",
+     "<extra_id_23>",
+     "<extra_id_24>",
+     "<extra_id_25>",
+     "<extra_id_26>",
+     "<extra_id_27>",
+     "<extra_id_28>",
+     "<extra_id_29>",
+     "<extra_id_30>",
+     "<extra_id_31>",
+     "<extra_id_32>",
+     "<extra_id_33>",
+     "<extra_id_34>",
+     "<extra_id_35>",
+     "<extra_id_36>",
+     "<extra_id_37>",
+     "<extra_id_38>",
+     "<extra_id_39>",
+     "<extra_id_40>",
+     "<extra_id_41>",
+     "<extra_id_42>",
+     "<extra_id_43>",
+     "<extra_id_44>",
+     "<extra_id_45>",
+     "<extra_id_46>",
+     "<extra_id_47>",
+     "<extra_id_48>",
+     "<extra_id_49>",
+     "<extra_id_50>",
+     "<extra_id_51>",
+     "<extra_id_52>",
+     "<extra_id_53>",
+     "<extra_id_54>",
+     "<extra_id_55>",
+     "<extra_id_56>",
+     "<extra_id_57>",
+     "<extra_id_58>",
+     "<extra_id_59>",
+     "<extra_id_60>",
+     "<extra_id_61>",
+     "<extra_id_62>",
+     "<extra_id_63>",
+     "<extra_id_64>",
+     "<extra_id_65>",
+     "<extra_id_66>",
+     "<extra_id_67>",
+     "<extra_id_68>",
+     "<extra_id_69>",
+     "<extra_id_70>",
+     "<extra_id_71>",
+     "<extra_id_72>",
+     "<extra_id_73>",
+     "<extra_id_74>",
+     "<extra_id_75>",
+     "<extra_id_76>",
+     "<extra_id_77>",
+     "<extra_id_78>",
+     "<extra_id_79>",
+     "<extra_id_80>",
+     "<extra_id_81>",
+     "<extra_id_82>",
+     "<extra_id_83>",
+     "<extra_id_84>",
+     "<extra_id_85>",
+     "<extra_id_86>",
+     "<extra_id_87>",
+     "<extra_id_88>",
+     "<extra_id_89>",
+     "<extra_id_90>",
+     "<extra_id_91>",
+     "<extra_id_92>",
+     "<extra_id_93>",
+     "<extra_id_94>",
+     "<extra_id_95>",
+     "<extra_id_96>",
+     "<extra_id_97>",
+     "<extra_id_98>",
+     "<extra_id_99>"
+   ],
+   "clean_up_tokenization_spaces": true,
+   "eos_token": "</s>",
+   "extra_ids": 100,
+   "model_max_length": 512,
+   "pad_token": "<pad>",
+   "sp_model_kwargs": {},
+   "tokenizer_class": "T5Tokenizer",
+   "unk_token": "<unk>"
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d06e40c42fc487566db8942fbbbc84a0e571fd5e9fe323e2472026507e5ab1d6
+ size 4856
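The `added_tokens_decoder` entries in the tokenizer_config.json above follow T5's sentinel convention: the 100 `extra_ids` tokens are appended after the 32000-token base vocabulary in reverse order, so id `32000 + k` decodes to `<extra_id_{99 - k}>`. A minimal sketch of that id-to-token rule (plain Python, no library dependency; the constants are read off the config above):

```python
# T5 sentinel-token convention, as reflected in tokenizer_config.json:
# ids 32000..32099 map to <extra_id_99> down to <extra_id_0>.
BASE_VOCAB_SIZE = 32000  # size of the SentencePiece base vocabulary
EXTRA_IDS = 100          # "extra_ids": 100 in the config

def sentinel_token(token_id: int) -> str:
    """Return the sentinel string for an added-token id, per the reversed numbering."""
    offset = token_id - BASE_VOCAB_SIZE
    if not 0 <= offset < EXTRA_IDS:
        raise ValueError(f"{token_id} is not a sentinel id")
    return f"<extra_id_{EXTRA_IDS - 1 - offset}>"

# Matches the config entries above, e.g. 32000 -> <extra_id_99>, 32099 -> <extra_id_0>
print(sentinel_token(32000), sentinel_token(32099))
```

This is why `<extra_id_0>` (the first sentinel a model emits during span corruption) has the highest id, 32099.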