---
license: apache-2.0
language:
- zh
- en
tags:
- moe
---
# Chinese-Mixtral-Instruct
Chinese Mixtral GitHub repository: https://github.com/ymcui/Chinese-Mixtral
This repository contains Chinese-Mixtral-Instruct, which is further tuned with instruction data on Chinese-Mixtral, where Chinese-Mixtral is built on top of Mixtral-8x7B-v0.1.
Note: this is an instruction (chat) model, which can be used for conversation, QA, etc.
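As a minimal sketch of how one might query this chat model: the snippet below wraps a user turn in the upstream Mixtral instruction template (`[INST] … [/INST]`). Both the `build_prompt` helper and the repo id `hfl/chinese-mixtral-instruct` in the comment are illustrative assumptions, not confirmed by this card; check the GitHub repository for the exact prompt format.

```python
def build_prompt(user_message: str) -> str:
    """Wrap a single user turn in the Mixtral-style instruction template.

    Assumption: Chinese-Mixtral-Instruct follows the upstream
    Mixtral-8x7B-Instruct format "<s>[INST] ... [/INST]".
    """
    return f"<s>[INST] {user_message.strip()} [/INST]"


# Illustrative use with transformers (repo id is an assumption):
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("hfl/chinese-mixtral-instruct")
#   model = AutoModelForCausalLM.from_pretrained("hfl/chinese-mixtral-instruct")
#   inputs = tok(build_prompt("你好,介绍一下你自己"), return_tensors="pt")
#   print(tok.decode(model.generate(**inputs, max_new_tokens=128)[0]))
print(build_prompt("你好"))  # -> <s>[INST] 你好 [/INST]
```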
## Others
For the LoRA-only model, please see: https://huggingface.co/hfl/chinese-mixtral-instruct-lora
For the GGUF model (llama.cpp compatible), please see: https://huggingface.co/hfl/chinese-mixtral-instruct-gguf
If you have questions or issues regarding this model, please open an issue at https://github.com/ymcui/Chinese-Mixtral/.