Here are comparisons of the Ziya-LLaMA-13B-Pretrain-v1 model and the LLaMA model before continual pre-training, evaluated on the English benchmark (HELM) and on our Chinese multiple-choice evaluation datasets.

<img src="https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1/resolve/main/ziya_en_eval.png" width=2542 height=1045>

| Model | Mean win rate | MMLU | BoolQ | NarrativeQA | NaturalQuestions (closed-book) | NaturalQuestions (open-book) | QuAC | TruthfulQA | IMDB |
|----------------------------|-------|-------|-------|-------|-------|-------|-------|-------|-------|
| LLaMA-13B                  | 0.500 | 0.424 | 0.718 | 0.440 | 0.349 | 0.591 | 0.318 | 0.326 | 0.487 |
| Ziya-LLaMA-13B-Pretrain-v1 | 0.650 | 0.433 | 0.753 | 0.445 | 0.348 | 0.528 | 0.335 | 0.249 | 0.497 |
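To make the "Mean win rate" column concrete: in HELM-style aggregation, a model's win rate on each scenario is the fraction of compared models it outscores, averaged across scenarios. The sketch below is only an illustration under simplifying assumptions, not HELM's exact procedure: it compares just the two rows of the table above pairwise (the function name `pairwise_win_rate` and the tie-handling rule are our own choices).

```python
# Per-task scores copied from the English-benchmark table above.
# Order: MMLU, BoolQ, NarrativeQA, NQ (closed-book), NQ (open-book),
#        QuAC, TruthfulQA, IMDB
llama_13b = [0.424, 0.718, 0.440, 0.349, 0.591, 0.318, 0.326, 0.487]
ziya_13b  = [0.433, 0.753, 0.445, 0.348, 0.528, 0.335, 0.249, 0.497]

def pairwise_win_rate(a, b):
    """Fraction of tasks on which model `a` outscores model `b`.

    Ties (if any) are counted as half a win; this tie rule is an
    assumption of this sketch, not something specified by the table.
    """
    wins = sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x, y in zip(a, b))
    return wins / len(a)

print(f"Ziya vs. LLaMA pairwise win rate: {pairwise_win_rate(ziya_13b, llama_13b):.3f}")
```

Run against the eight task columns, this prints 0.625 (Ziya wins 5 of 8 tasks), which does not reproduce the table's 0.650: HELM's reported mean win rate compares each model against its full leaderboard of models, not just this single pair.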
<img src="https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1/resolve/main/ziya_zh_eval.png" width=2340 height=1523>

| Model | In-context | C3 | Common sense | Chinese | Math | English | Physics | Chemistry | Biology | History | Politics | Geography |
|-------|------------|----|--------------|---------|------|---------|---------|-----------|---------|---------|----------|-----------|