AndreyBest committed
Commit ea7fab7 · Parent(s): 4c66790
Update README.md
README.md
CHANGED
@@ -240,21 +240,21 @@ Answer:
 | Name | Quant method | Size | Memory (RAM, vRAM) required (for full context of 32k tokens) |
 | ---- | ---- | ---- | ---- |
 | [granite-3b-code-instruct.Q2_K.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q2_K.gguf) | Q2_K | 1.34 GB | 4.68 GB |
-| [granite-3b-code-instruct.Q3_K_S.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q3_K_S.gguf) | Q3_K_S | 1.55 GB |
-| [granite-3b-code-instruct.Q3_K_M.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q3_K_M.gguf) | Q3_K_M | 1.73 GB |
-| [granite-3b-code-instruct.Q3_K_L.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q3_K_L.gguf) | Q3_K_L | 1.88 GB |
-| [granite-3b-code-instruct.Q4_0.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q4_0.gguf) | Q4_0 | 2.00 GB |
-| [granite-3b-code-instruct.Q4_K_S.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q4_K_S.gguf) | Q4_K_S | 2.01 GB |
-| [granite-3b-code-instruct.Q4_K_M.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q4_K_M.gguf) | Q4_K_M | 2.13 GB |
-| [granite-3b-code-instruct.Q4_K.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q4_K.gguf) | Q4_K | 2.13 GB |
-| [granite-3b-code-instruct.Q4_1.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q4_1.gguf) | Q4_1 | 2.21 GB |
-| [granite-3b-code-instruct.Q5_0.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q5_0.gguf) | Q5_0 | 2.42 GB |
-| [granite-3b-code-instruct.Q5_K_S.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q5_K_S.gguf) | Q5_K_S | 2.42 GB |
-| [granite-3b-code-instruct.Q5_K_M.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q5_K_M.gguf) | Q5_K_M | 2.49 GB |
-| [granite-3b-code-instruct.Q5_K.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q5_K.gguf) | Q5_K | 2.49 GB |
-| [granite-3b-code-instruct.Q5_1.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q5_1.gguf) | Q5_1 | 2.63 GB |
-| [granite-3b-code-instruct.Q6_K.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q6_K.gguf) | Q6_K | 2.86 GB |
-| [granite-3b-code-instruct.Q8_0.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q8_0.gguf) | Q8_0 | 3.71 GB |
+| [granite-3b-code-instruct.Q3_K_S.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q3_K_S.gguf) | Q3_K_S | 1.55 GB | ? |
+| [granite-3b-code-instruct.Q3_K_M.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q3_K_M.gguf) | Q3_K_M | 1.73 GB | ? |
+| [granite-3b-code-instruct.Q3_K_L.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q3_K_L.gguf) | Q3_K_L | 1.88 GB | ? |
+| [granite-3b-code-instruct.Q4_0.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q4_0.gguf) | Q4_0 | 2.00 GB | ? |
+| [granite-3b-code-instruct.Q4_K_S.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q4_K_S.gguf) | Q4_K_S | 2.01 GB | ? |
+| [granite-3b-code-instruct.Q4_K_M.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q4_K_M.gguf) | Q4_K_M | 2.13 GB | ? |
+| [granite-3b-code-instruct.Q4_K.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q4_K.gguf) | Q4_K | 2.13 GB | ? |
+| [granite-3b-code-instruct.Q4_1.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q4_1.gguf) | Q4_1 | 2.21 GB | ? |
+| [granite-3b-code-instruct.Q5_0.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q5_0.gguf) | Q5_0 | 2.42 GB | ? |
+| [granite-3b-code-instruct.Q5_K_S.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q5_K_S.gguf) | Q5_K_S | 2.42 GB | ? |
+| [granite-3b-code-instruct.Q5_K_M.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q5_K_M.gguf) | Q5_K_M | 2.49 GB | ? |
+| [granite-3b-code-instruct.Q5_K.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q5_K.gguf) | Q5_K | 2.49 GB | ? |
+| [granite-3b-code-instruct.Q5_1.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q5_1.gguf) | Q5_1 | 2.63 GB | ? |
+| [granite-3b-code-instruct.Q6_K.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q6_K.gguf) | Q6_K | 2.86 GB | ? |
+| [granite-3b-code-instruct.Q8_0.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.Q8_0.gguf) | Q8_0 | 3.71 GB | ? |
 | [granite-3b-code-instruct.f16.gguf](https://huggingface.co/SanctumAI/granite-3b-code-instruct-GGUF/blob/main/granite-3b-code-instruct.f16.gguf) | f16 | 6.97 GB | 4.68 GB |

 ## Disclaimer
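For context on how the files listed in the table are typically consumed, here is a minimal sketch that downloads one of the quantizations and loads it with `huggingface_hub` and `llama-cpp-python`. The choice of the Q4_K_M file, the 4096-token context window, and the example prompt are illustrative assumptions, not recommendations from this repository.

```python
# Minimal sketch: fetch one of the GGUF quantizations listed in the table
# and run a single chat turn with llama-cpp-python.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Assumption: the Q4_K_M quantization is picked here only as an example.
model_path = hf_hub_download(
    repo_id="SanctumAI/granite-3b-code-instruct-GGUF",
    filename="granite-3b-code-instruct.Q4_K_M.gguf",
)

# Assumption: a 4096-token context window; the memory figures in the table
# above are stated for the full 32k-token context.
llm = Llama(model_path=model_path, n_ctx=4096)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a Python function that checks whether a number is prime."}]
)
print(response["choices"][0]["message"]["content"])
```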