docs: add demo
README.md
CHANGED
@@ -14,7 +14,8 @@ widget:
 ---
 # Chinese Llama 2 7B
 
-
+
+A fully open-source, commercially usable **Chinese Llama2 model with Chinese and English SFT datasets**; the input format strictly follows the *llama-2-chat* format, so it stays compatible with every optimization that targets the original *llama-2-chat* model.
 
 ![Chinese LLaMA2 7B](.github/preview.jpg)
 
@@ -23,11 +24,18 @@ widget:
 > Talk is cheap, Show you the Demo.
 
 - [Demo / HuggingFace Spaces](https://huggingface.co/spaces/LinkSoul/Chinese-Llama-2-7b)
-
-
-
+- [One-click Colab launch](#) // in preparation
+
+## Downloads
+
+- Model download: [Chinese Llama2 Chat Model](https://huggingface.co/LinkSoul/Chinese-Llama-2-7b)
+
+> We used Chinese and English SFT datasets totaling 10 million samples.
+
+- Dataset: [https://huggingface.co/datasets/LinkSoul/instruction_merge_set](https://huggingface.co/datasets/LinkSoul/instruction_merge_set)
 
 ## Quick Test
+
 ```python
 from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer
 
@@ -41,18 +49,10 @@ instruction = """[INST] <<SYS>>\nYou are a helpful, respectful and honest assist
 
 If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.\n<</SYS>>\n\n{} [/INST]"""
 
-prompt = instruction.format("
+prompt = instruction.format("用英文回答,什么是夫妻肺片?")
 generate_ids = model.generate(tokenizer(prompt, return_tensors='pt').input_ids.cuda(), max_new_tokens=4096, streamer=streamer)
 ```
 
-## Downloads
-
-- Model download: [Chinese Llama2 Chat Model](https://huggingface.co/LinkSoul/Chinese-Llama-2-7b)
-
-> We used Chinese and English SFT datasets totaling 10 million samples.
-
-- Dataset: [https://huggingface.co/datasets/LinkSoul/instruction_merge_set](https://huggingface.co/datasets/LinkSoul/instruction_merge_set)
-
 ## How to Train
 
 ```bash
@@ -61,9 +61,8 @@ python train.py --args ...
 
 ## Related Projects
 
-- [Llama2](
+- [Llama2](https://ai.meta.com/llama/)
 
 ## License
 
-
 [Apache-2.0 license](https://github.com/LinkSoul-AI/Chinese-Llama-2-7b/blob/main/LICENSE)
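For reference, the quick-test snippet excerpted in the hunks above is not runnable on its own: the diff context skips the model, tokenizer, and streamer setup. A minimal self-contained sketch follows; the checkpoint name, dtype and device choices, and the shortened system prompt are assumptions for illustration, not lines taken from the README.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer

# Assumed checkpoint: the model this README belongs to.
model_name = "LinkSoul/Chinese-Llama-2-7b"

# Loading options (dtype, device) are illustrative; the diff does not show them.
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16).cuda()

# Stream decoded tokens to stdout as they are generated, without echoing the prompt.
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

# llama-2-chat template; the system message is shortened here, while the README
# uses the full default llama-2-chat system prompt shown in the hunk context.
instruction = """[INST] <<SYS>>
You are a helpful, respectful and honest assistant.
<</SYS>>

{} [/INST]"""

# The question asks, in Chinese, for an English answer to "What is fuqi feipian
# (a Sichuan cold dish of sliced beef and offal in chili oil)?"
prompt = instruction.format("用英文回答,什么是夫妻肺片?")

generate_ids = model.generate(
    tokenizer(prompt, return_tensors="pt").input_ids.cuda(),
    max_new_tokens=4096,
    streamer=streamer,
)
```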
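The Downloads section only links the SFT dataset. For a first look before training, a minimal sketch using the Hugging Face datasets library is below; it assumes the repository loads as a standard Hub dataset with a train split and does not rely on any particular column names, since the diff does not document the schema.

```python
from datasets import load_dataset

# Assumption: LinkSoul/instruction_merge_set loads as a standard Hub dataset
# with a "train" split; adjust the split name if the repo differs.
ds = load_dataset("LinkSoul/instruction_merge_set", split="train")

print(ds)      # number of rows and feature names
print(ds[0])   # one SFT record, to see the actual schema
```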