Ningyu committed on
Commit 4e87a9b • 1 Parent(s): 44f6cc5

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -378,7 +378,7 @@ pip install -r requirements.txt
 
  <h3 id="2-2">2.2 Pretraining model weight acquisition and restoration</h3>
 
- ❗❗❗ Note that in terms of hardware, performing step `2.2`, which involves merging LLaMA-13B with ZhiXi-13B-Diff, requires approximately **100GB** of RAM, with no demand for VRAM (this is due to the memory overhead caused by our merging strategy. To facilitate usage, we will improve our merging approach in future updates, and we are currently developing a 7B model as well, so stay tuned). For step `2.4`, which involves inference using `ZhiXi`, a minimum of **26GB** of VRAM is required.
+ ❗❗❗ Note that in terms of hardware, performing step `2.2`, which involves merging LLaMA-13B with ZhiXi-13B-Diff, requires approximately **100GB** of RAM, with no demand for VRAM (this is due to the memory overhead caused by our merging strategy. For your convenience, we have provided fp16 weights at https://huggingface.co/zjunlp/zhixi-13b-diff-fp16; **fp16 weights require less memory but may slightly impact performance**. We will improve our merging approach in future updates, and we are currently developing a 7B model as well, so stay tuned). For step `2.4`, which involves inference using `ZhiXi`, a minimum of **26GB** of VRAM is required.
 
  **1. Download LLaMA 13B and ZhiXi-13B-Diff**
 
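
For context on the note changed above, the sketch below illustrates the kind of weight-recovery step the README describes: loading LLaMA-13B and the released diff in fp16 and adding them element-wise. It is only an assumption-laden illustration, not the repository's official merge script; the local paths are placeholders, and it assumes the diff is a simple delta (full = base + diff) with matching parameter shapes.

```python
import torch
from transformers import LlamaForCausalLM

# Hypothetical paths -- substitute your converted LLaMA-13B checkpoint and an
# output directory. The diff repo id is the fp16 one linked in the note above.
BASE_PATH = "path/to/llama-13b-hf"
DIFF_ID = "zjunlp/zhixi-13b-diff-fp16"
OUT_PATH = "path/to/zhixi-13b"

# Load both models on CPU in fp16 (roughly 2 x 26GB of RAM, well under the
# ~100GB needed by the fp32 merging path).
base = LlamaForCausalLM.from_pretrained(BASE_PATH, torch_dtype=torch.float16)
diff = LlamaForCausalLM.from_pretrained(DIFF_ID, torch_dtype=torch.float16)

# Recover the full weights in place, assuming diff = full - base.
diff_sd = diff.state_dict()
with torch.no_grad():
    for name, param in base.named_parameters():
        param.add_(diff_sd[name])

base.save_pretrained(OUT_PATH)  # merged checkpoint used for inference in step 2.4
```

If the repository ships its own recovery script, prefer that; this sketch only shows why the merge is RAM-bound rather than VRAM-bound, since everything stays on the CPU.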