---
license: apache-2.0
language:
- zh
- en
---
# Chinese-Mixtral-Instruct-LoRA
This repository contains Chinese-Mixtral-Instruct-LoRA, which is further tuned with instruction data on top of Chinese-Mixtral; Chinese-Mixtral itself is built on top of Mixtral-8x7B-v0.1.
Note: You must combine this LoRA adapter with the original Mixtral-8x7B-v0.1 to obtain the full model weights.
- For the full model, see: https://huggingface.co/hfl/chinese-mixtral-instruct
- For the GGUF model (llama.cpp compatible), see: https://huggingface.co/hfl/chinese-mixtral-instruct-gguf
Please refer to https://github.com/ymcui/Chinese-Mixtral/ for more details.
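The note above says the LoRA adapter must be combined with the base Mixtral-8x7B-v0.1 to obtain the full weights. In practice this is typically done with `peft` (load the base model, apply the adapter with `PeftModel.from_pretrained`, then call `merge_and_unload()`). The toy sketch below illustrates the arithmetic behind that merge: a LoRA adapter stores a low-rank delta `B @ A` per target weight, and merging adds the scaled delta onto the frozen base weight. The dimensions here are illustrative, not Mixtral's real shapes.

```python
import torch

# Toy illustration of a LoRA merge (assumed shapes, not Mixtral's):
# the full weight is the base weight plus a scaled low-rank update.
d_out, d_in, r, alpha = 8, 8, 2, 4

W = torch.randn(d_out, d_in)        # frozen base weight (from the base model)
A = torch.randn(r, d_in) * 0.01     # LoRA down-projection (trained)
B = torch.zeros(d_out, r)           # LoRA up-projection (zero-initialized)

scale = alpha / r                   # standard LoRA scaling factor
W_merged = W + scale * (B @ A)      # "combining LoRA with the original weight"

# With B still zero-initialized, the merge is a no-op;
# after instruction tuning, B is nonzero and W_merged differs from W.
assert W_merged.shape == W.shape
assert torch.allclose(W_merged, W)
```

For the actual repositories listed above, downloading the pre-merged full model (or the GGUF build for llama.cpp) avoids doing this merge yourself.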