Update README.md
README.md CHANGED
@@ -16,7 +16,7 @@ library_name: transformers
 
 A language model distilled and finetuned from an adapted version of LLaMA2-7B following "Towards the Law of Capacity Gap in Distilling Language Models".
 
-Outperforming a wide range of 3B competitors in GPT4 evaluation and
+Outperforming a wide range of 3B competitors in GPT4 evaluation and even competing with several 7B chat models.
 
 <img src="./teaser_b.jpg" alt="teaser_b" width="687" />
 
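For context, the hunk header shows the model card's `library_name: transformers` metadata, meaning the checkpoint described by this README is intended to be loaded through the Transformers library. Below is a minimal sketch assuming a standard causal-LM checkpoint; the repository id is a placeholder, not taken from this diff.

```python
# Minimal sketch: load and query a causal LM via the Transformers API.
# "org/model-3b" is a placeholder repository id; substitute the actual model id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "org/model-3b"  # placeholder, assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize a prompt, generate a short continuation, and print the decoded text.
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```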