Lewdiculous committed
Commit: a1831bc
Parent(s): 5fd6dd3
Update format.

README.md CHANGED
@@ -15,13 +15,11 @@ tags:
 
 # #Roleplay #Multimodal #Vision
 
+In this repository you can find **GGUF-IQ-Imatrix** quants for [ChaoticNeutrals/Nyanade_Stunna-Maid-7B-v0.2](https://huggingface.co/ChaoticNeutrals/Nyanade_Stunna-Maid-7B-v0.2), and you can get some basic SillyTavern presets [here](https://huggingface.co/Lewdiculous/Model-Requests/tree/main/data/presets/lewdicu-3.0.2-mistral-0.2) if needed.
+
 > [!TIP]
 > This is a **#multimodal** model that also has optional **#vision** capabilities. <br> Expand the relevant sections below and read the full card information if you also want to make use of that functionality.
 
-This repository hosts GGUF-IQ-Imatrix quants for [ChaoticNeutrals/Nyanade_Stunna-Maid-7B-v0.2](https://huggingface.co/ChaoticNeutrals/Nyanade_Stunna-Maid-7B-v0.2).
-
-Some recommended simple SillyTavern presets can be found [**here**](https://huggingface.co/Lewdiculous/Model-Requests/tree/main/data/presets/lewdicu-3.0.2-mistral-0.2) if needed.
-
 "Unhinged RP with the spice of the previous 0.420 remixes, 32k context and vision capabilities."
 
 ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/_xbYLtGQIwCyjlGlVQpVx.jpeg)
@@ -88,15 +86,16 @@ The idea is to preserve the most important information during quantization, whic
 
 # Required for vision functionality:
 
-
+> [!WARNING]
+> To use the multimodal capabilities of this model, such as **vision**, you also need to load the specified **mmproj** file; you can get it [here](https://huggingface.co/cjpais/llava-1.6-mistral-7b-gguf/blob/main/mmproj-model-f16.gguf) or from the **mmproj** folder uploaded in this repository.
 
-
+1: Make sure you are using the latest version of [KoboldCpp](https://github.com/LostRuins/koboldcpp).
 
-
+2: Load the **mmproj file** by using the corresponding section in the interface:
 
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/65d4cf2693a0a3744a27536c/3bAsQJsSp69dHbe7sxxem.png)
 
-
+2.1: For **CLI** users, you can load the **mmproj file** by adding the respective flag to your usual command:
 
 ```
 --mmproj your-mmproj-file.gguf
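
For CLI use, the flag above goes alongside your usual KoboldCpp arguments. A minimal sketch, assuming KoboldCpp is run from source and using an illustrative quant filename (substitute the files you actually downloaded):

```
# Minimal sketch, not from the original card: the model filename below is a placeholder.
# --mmproj loads the vision projector next to the main GGUF model;
# --contextsize 32768 matches the advertised 32k context.
python koboldcpp.py \
  --model Nyanade_Stunna-Maid-7B-v0.2-Q4_K_M-imat.gguf \
  --mmproj mmproj-model-f16.gguf \
  --contextsize 32768
```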