frankminors123 committed on
Commit • 49ad1ea
1 Parent(s): 350ca19
Update README.md
We added [7k+ python code instructions](https://huggingface.co/datasets/frankminors123/Python-Code-Instructions-7k) and implemented SFT based on our [Chinese-CodeLlama-7B-SFT-V1](https://huggingface.co/frankminors123/Chinese-CodeLlama-7B-SFT-V1). Drawing on the work of [Code Llama](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/), we increased the base period of rotary positional embeddings (RoPE) from 10000 to 1000000.
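For intuition, each RoPE dimension pair rotates with frequency θ_i = base^(−2i/d); a larger base period slows the rotation of the higher dimensions, so positions well beyond the training length still fall in a familiar angular range. A minimal sketch of this effect (the head dimension and function name here are illustrative, not taken from the model code):

```python
# Sketch: effect of the RoPE base period on per-dimension rotation frequencies.
# head_dim is illustrative; theta_i = base ** (-2 * i / dim).
head_dim = 128

def rope_inv_freq(base, dim):
    """Inverse frequencies (rotation speeds) for rotary positional embeddings."""
    return [base ** (-2 * i / dim) for i in range(dim // 2)]

small_base = rope_inv_freq(10_000, head_dim)
large_base = rope_inv_freq(1_000_000, head_dim)

# With the larger base, every non-constant dimension rotates more slowly,
# which is what makes longer-context extrapolation possible.
for i in range(1, head_dim // 2):
    assert large_base[i] < small_base[i]
```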
The Chinese prompt template used is as follows (the Chinese text translates to: "Below is an instruction that describes a task, paired with an input that provides more context. Please give a response that fulfills the request as well as possible."):

```python
PROMPT_TEMPLATE = (
    "下面是描述一项任务的指令,并且与一则输入配对用来提供更多的上下文。请给出尽可能满足请求的回答.\n"
    "### 指令:\n{instruction}\n### 输入:\n{input}\n### 回答:\n"
)
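As a quick illustration, the template can be filled in with Python's `str.format`; the instruction and input values below are made-up examples, not from the training data:

```python
PROMPT_TEMPLATE = (
    "下面是描述一项任务的指令,并且与一则输入配对用来提供更多的上下文。请给出尽可能满足请求的回答.\n"
    "### 指令:\n{instruction}\n### 输入:\n{input}\n### 回答:\n"
)

# Example instruction: "Write a function that computes the sum of two numbers."
prompt = PROMPT_TEMPLATE.format(
    instruction="写一个函数计算两个数的和。",
    input="",
)
print(prompt)
```

The model's generation is expected to follow the final `### 回答:` ("Response:") marker.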

We use a sequence length of 1k for pre-training, and continue training at this length during the fine-tuning stage. Thanks to the larger RoPE base period, the model supports context-length extrapolation up to 15k at inference time.