Upload README.md
README.md CHANGED
@@ -134,15 +134,11 @@ Refer to the Provided Files table below to see what files use which methods, and how.
 | Name | Quant method | Bits | Size | Max RAM required | Use case |
 | ---- | ---- | ---- | ---- | ---- | ----- |
 | [openbuddy-mixtral-8x7b-v15.1.Q2_K.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q2_K.gguf) | Q2_K | 2 | 15.67 GB | 18.17 GB | smallest, significant quality loss - not recommended for most purposes |
-| [openbuddy-mixtral-8x7b-v15.1.Q3_K_S.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q3_K_S.gguf) | Q3_K_S | 3 | 20.32 GB | 22.82 GB | very small, high quality loss |
 | [openbuddy-mixtral-8x7b-v15.1.Q3_K_M.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q3_K_M.gguf) | Q3_K_M | 3 | 20.39 GB | 22.89 GB | very small, high quality loss |
-| [openbuddy-mixtral-8x7b-v15.1.Q3_K_L.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q3_K_L.gguf) | Q3_K_L | 3 | 20.46 GB | 22.96 GB | small, substantial quality loss |
 | [openbuddy-mixtral-8x7b-v15.1.Q4_0.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q4_0.gguf) | Q4_0 | 4 | 26.47 GB | 28.97 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
 | [openbuddy-mixtral-8x7b-v15.1.Q4_K_M.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q4_K_M.gguf) | Q4_K_M | 4 | 26.47 GB | 28.97 GB | medium, balanced quality - recommended |
-| [openbuddy-mixtral-8x7b-v15.1.Q4_K_S.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q4_K_S.gguf) | Q4_K_S | 4 | 26.47 GB | 28.97 GB | small, greater quality loss |
 | [openbuddy-mixtral-8x7b-v15.1.Q5_0.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q5_0.gguf) | Q5_0 | 5 | 32.26 GB | 34.76 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
 | [openbuddy-mixtral-8x7b-v15.1.Q5_K_M.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q5_K_M.gguf) | Q5_K_M | 5 | 32.26 GB | 34.76 GB | large, very low quality loss - recommended |
-| [openbuddy-mixtral-8x7b-v15.1.Q5_K_S.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q5_K_S.gguf) | Q5_K_S | 5 | 32.26 GB | 34.76 GB | large, low quality loss - recommended |
 | [openbuddy-mixtral-8x7b-v15.1.Q6_K.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q6_K.gguf) | Q6_K | 6 | 38.41 GB | 40.91 GB | very large, extremely low quality loss |
 | [openbuddy-mixtral-8x7b-v15.1.Q8_0.gguf](https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF/blob/main/openbuddy-mixtral-8x7b-v15.1.Q8_0.gguf) | Q8_0 | 8 | 49.67 GB | 52.17 GB | very large, extremely low quality loss - not recommended |
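This revision removes the Q3_K_S, Q3_K_L, Q4_K_S and Q5_K_S rows from the Provided Files table, leaving ten quant files. As the remaining rows show, every "Max RAM required" figure is the file size plus 2.5 GB of overhead, with the whole model resident in system RAM.

As a quick reference, here is a minimal sketch of fetching one of the listed files and running it locally. It assumes the `huggingface-hub` and `llama-cpp-python` packages are installed; the repo ID and filename come from the table above, while the prompt, context size, and generation settings are arbitrary examples:

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one GGUF quant file from the repo listed in the table.
# Pick the variant whose "Max RAM required" fits your machine
# (e.g. Q4_K_M needs roughly 29 GB with no GPU offload).
model_path = hf_hub_download(
    repo_id="TheBloke/openbuddy-mixtral-8x7b-v15.1-GGUF",
    filename="openbuddy-mixtral-8x7b-v15.1.Q4_K_M.gguf",
)

# n_gpu_layers=0 keeps the whole model in system RAM, matching the
# table's RAM column; raising it offloads layers to VRAM instead.
llm = Llama(model_path=model_path, n_ctx=2048, n_gpu_layers=0)

out = llm("Hello, how are you?", max_tokens=64)  # arbitrary example prompt
print(out["choices"][0]["text"])
```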