DrNicefellow committed
Commit f06c62b
Parent(s): 64b447a
Update README.md

README.md CHANGED
@@ -5,7 +5,7 @@ license: apache-2.0
 
 ## Model Description
 
-This model is the
+This model is the 1st extracted standalone model from the [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1), using the [Mixtral Model Expert Extractor tool](https://github.com/MeNicefellow/Mixtral-Model-Expert-Extractor) I made. It is constructed by selecting the first expert from each Mixture of Experts (MoE) layer. The extraction of this model is experimental. It is expected to be worse than Mistral-7B.
 
 ## Model Architecture
 
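The extraction described in the commit — keeping only the first expert from each Mixture of Experts layer — can be sketched as a state-dict remapping. This is a hypothetical illustration, not the actual Expert Extractor tool: it assumes the public Mixtral checkpoint key names (`block_sparse_moe.experts.{e}.w1/w2/w3`) and the usual mapping onto Mistral's dense MLP names (`gate_proj`/`down_proj`/`up_proj`); the real tool may differ.

```python
def extract_expert(state_dict, expert_idx=0):
    """Collapse a Mixtral-style MoE state dict into a dense Mistral-style one,
    keeping only one expert per MoE layer. Sketch under assumed key names."""
    # Assumed mapping of Mixtral expert weights onto Mistral's dense MLP.
    mlp_map = {"w1": "gate_proj", "w2": "down_proj", "w3": "up_proj"}
    out = {}
    marker = f".block_sparse_moe.experts.{expert_idx}."
    for key, weight in state_dict.items():
        if ".block_sparse_moe.gate." in key:
            continue  # drop the router: it is meaningless with a single expert
        if ".block_sparse_moe.experts." in key:
            if marker not in key:
                continue  # drop every expert except the selected one
            prefix, rest = key.split(marker)
            w_name, suffix = rest.split(".", 1)  # e.g. "w1", "weight"
            out[f"{prefix}.mlp.{mlp_map[w_name]}.{suffix}"] = weight
        else:
            out[key] = weight  # attention/norm/embedding tensors pass through
    return out
```

Attention, normalization, and embedding weights are shared across experts in Mixtral, so only the MoE blocks need rewriting; the resulting key layout matches a plain Mistral-7B-style dense model.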