chiliu committed · Commit 21a8212 · Parent(s): b83fa45

fix
README.md
CHANGED
@@ -186,14 +186,14 @@ The original LLaMA model was trained for 1 trillion tokens and GPT-J was trained
  | arc_easy/acc_norm | 0.65 | 0.65 |
  | boolq/acc | **0.72** | 0.66 |
  | hellaswag/acc | **0.49** | 0.43 |
- | hellaswag/acc_norm | 0.66 | 0.67 |
- | openbookqa/acc | 0.26 | 0.27 |
+ | hellaswag/acc_norm | 0.66 | **0.67** |
+ | openbookqa/acc | 0.26 | **0.27** |
  | openbookqa/acc_norm | 0.40 | 0.40 |
  | piqa/acc | **0.76** | 0.75 |
  | piqa/acc_norm | 0.76 | 0.76 |
  | record/em | 0.88 | 0.88 |
- | record/f1 | 0.88 | 0.89 |
- | rte/acc | 0.55 | 0.58 |
+ | record/f1 | 0.88 | **0.89** |
+ | rte/acc | 0.55 | **0.58** |
  | truthfulqa_mc/mc1 | **0.27** | 0.22 |
  | truthfulqa_mc/mc2 | **0.37** | 0.35 |
  | wic/acc | **0.49** | 0.48 |