---
license: apache-2.0
tags:
- moe
language:
- en
library_name: transformers
---

# Model Card

This is a mixture of experts created with [mergekit](https://github.com/cg123/mergekit) and based on [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1).
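
As a quick start, here is a minimal generation sketch using `transformers`. The repository id is a placeholder; substitute this model's actual Hub id:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id; replace with this repository's actual Hub id.
model_id = "your-username/your-moe-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce memory use
    device_map="auto",          # place layers on available devices
)

prompt = "The first step to make a mixture of experts is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```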

## Model Details

### Model Description

A mixture of experts (MoE) assembled with mergekit, combining Mistral-7B-based models into a single sparse MoE model.

- **Developed by:** [The Kaitchup](https://kaitchup.substack.com/)
- **Model type:** Causal language model (mixture of experts)
- **Language(s) (NLP):** English
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)

The method and code used to create this model are explained here:
[Maixtchup: Make Your Own Mixture of Experts with Mergekit](https://kaitchup.substack.com/p/maixtchup-make-your-own-mixture-of)
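
For context, mergekit MoE merges are typically driven by a YAML configuration passed to the `mergekit-moe` command. The sketch below is purely illustrative: the expert model ids and prompts are placeholders, not the configuration used for this model (see the linked article for the actual method).

```python
import subprocess
import textwrap

# Hypothetical expert model ids and prompts; the real configuration for
# this model is described in the article linked above.
config = textwrap.dedent("""\
    base_model: mistralai/Mistral-7B-v0.1
    gate_mode: hidden
    dtype: bfloat16
    experts:
      - source_model: placeholder/expert-1
        positive_prompts:
          - "Write a story about"
      - source_model: placeholder/expert-2
        positive_prompts:
          - "Solve the following problem"
    """)

with open("moe_config.yaml", "w") as f:
    f.write(config)

# mergekit-moe is the MoE merging entry point installed with mergekit.
subprocess.run(["mergekit-moe", "moe_config.yaml", "./merged-moe"], check=True)
```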

## Uses

This model is pre-trained and not fine-tuned. You may fine-tune it with PEFT using adapters, as sketched below.
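
The following is a minimal LoRA setup with PEFT. The hyperparameters and target modules are illustrative assumptions, not values tuned for this model:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Placeholder id; replace with this repository's actual Hub id.
model_id = "your-username/your-moe-model"
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Illustrative hyperparameters, not tuned for this model.
lora_config = LoraConfig(
    r=16,                    # adapter rank
    lora_alpha=32,           # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections in Mistral-style layers
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only adapter weights are trainable
```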

## Model Card Contact

[The Kaitchup](https://kaitchup.substack.com/)