---
license: apache-2.0
language:
- zh
- en
library_name: transformers
pipeline_tag: text-generation
---

Original model: https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-GPTQ

LoRA: https://huggingface.co/ziqingyang/chinese-alpaca-lora-13b

Projects used:

https://github.com/ymcui/Chinese-LLaMA-Alpaca

https://github.com/qwopqwop200/GPTQ-for-LLaMa

**Compatible with AutoGPTQ and GPTQ-for-LLaMa**

**If loading with GPTQ-for-LLaMa, set Wbits=4, groupsize=128, and model_type=llama**
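
For reference, a minimal AutoGPTQ loading sketch is shown below. The model path, prompt, and `use_safetensors` flag are placeholders/assumptions rather than details confirmed by this card; the quantization settings simply mirror the Wbits=4 / groupsize=128 values stated above.

```python
# Minimal AutoGPTQ loading sketch (assumptions: placeholder model path,
# safetensors checkpoint, CUDA available). Quantization settings mirror
# the stated Wbits=4 / groupsize=128.
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig

model_path = "path/to/this-model"  # placeholder: local dir or repo id of this model

tokenizer = AutoTokenizer.from_pretrained(model_path)

# If the checkpoint ships without a quantize_config.json, the stated
# settings can be passed explicitly.
quantize_config = BaseQuantizeConfig(bits=4, group_size=128)

model = AutoGPTQForCausalLM.from_quantized(
    model_path,
    device="cuda:0",
    quantize_config=quantize_config,
    use_safetensors=True,  # assumption; adjust to the actual checkpoint format
)

prompt = "你好,请简单介绍一下你自己。"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
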
Text-generation-webui one-click package (Chinese guide):

https://www.bilibili.com/read/cv23495183