Update README.md
README.md CHANGED

@@ -9,9 +9,10 @@ tags:
 Brunhilde-2x7b-MOE-DPO-v.01.5 is a Mixture of Experts (MoE).
 * [NurtureAI/neural-chat-7b-v3-16k](https://huggingface.co/NurtureAI/neural-chat-7b-v3-16k)
 * [mncai/mistral-7b-dpo-v6](https://huggingface.co/mncai/mistral-7b-dpo-v6)
-```
 
-
+## Usage
+
+```
 !pip install -qU transformers bitsandbytes accelerate
 
 from transformers import AutoTokenizer
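The hunk ends at the `AutoTokenizer` import, so the rest of the card's usage code is not visible in this diff. Purely as an illustration of how the installed packages (`transformers`, `bitsandbytes`, `accelerate`) are typically wired together for a model like this, here is a minimal, assumed sketch; the repo id is a placeholder and this is not the card's actual code.

```python
# Assumed usage sketch (not taken from the card): load the MoE model in 4-bit
# with bitsandbytes and generate a reply from a chat-style prompt.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig

# Placeholder repo id: replace with the full Hub path of Brunhilde-2x7b-MOE-DPO-v.01.5.
model_id = "Brunhilde-2x7b-MOE-DPO-v.01.5"

# 4-bit quantization config so the 2x7B MoE fits on a single consumer GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # let accelerate place the layers
)

# Build a chat prompt with the tokenizer's chat template, then generate.
messages = [{"role": "user", "content": "Explain what a Mixture of Experts model is."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```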