legraphista committed • Commit a9f4f84 • Parent: 259afdc

Upload README.md with huggingface_hub
README.md CHANGED
@@ -65,9 +65,9 @@ Link: [here](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IM
 | -------- | ---------- | --------- | ------ | ------------ | -------- |
 | [Llama-3-8B-Instruct-MopeyMule.Q8_0.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q8_0.gguf) | Q8_0 | 8.54GB | ✅ Available | ⚪ Static | 📦 No |
 | [Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No |
-| Llama-3-8B-Instruct-MopeyMule.Q4_K | Q4_K | - | ⏳ Processing | 🟢 IMatrix | - |
-| Llama-3-8B-Instruct-MopeyMule.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | - |
-| Llama-3-8B-Instruct-MopeyMule.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | - |
+| [Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf) | Q4_K | 4.92GB | ✅ Available | 🟢 IMatrix | 📦 No |
+| [Llama-3-8B-Instruct-MopeyMule.Q3_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q3_K.gguf) | Q3_K | 4.02GB | ✅ Available | 🟢 IMatrix | 📦 No |
+| [Llama-3-8B-Instruct-MopeyMule.Q2_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q2_K.gguf) | Q2_K | 3.18GB | ✅ Available | 🟢 IMatrix | 📦 No |
 
 
 ### All Quants

@@ -79,18 +79,18 @@ Link: [here](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IM
 | [Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No |
 | [Llama-3-8B-Instruct-MopeyMule.Q5_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q5_K.gguf) | Q5_K | 5.73GB | ✅ Available | ⚪ Static | 📦 No |
 | [Llama-3-8B-Instruct-MopeyMule.Q5_K_S.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q5_K_S.gguf) | Q5_K_S | 5.60GB | ✅ Available | ⚪ Static | 📦 No |
-| Llama-3-8B-Instruct-MopeyMule.Q4_K | Q4_K | - | ⏳ Processing | 🟢 IMatrix | - |
+| [Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf) | Q4_K | 4.92GB | ✅ Available | 🟢 IMatrix | 📦 No |
 | Llama-3-8B-Instruct-MopeyMule.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3-8B-Instruct-MopeyMule.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3-8B-Instruct-MopeyMule.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | - |
-| Llama-3-8B-Instruct-MopeyMule.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | - |
+| [Llama-3-8B-Instruct-MopeyMule.Q3_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q3_K.gguf) | Q3_K | 4.02GB | ✅ Available | 🟢 IMatrix | 📦 No |
 | Llama-3-8B-Instruct-MopeyMule.Q3_K_L | Q3_K_L | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3-8B-Instruct-MopeyMule.Q3_K_S | Q3_K_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3-8B-Instruct-MopeyMule.IQ3_M | IQ3_M | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3-8B-Instruct-MopeyMule.IQ3_S | IQ3_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3-8B-Instruct-MopeyMule.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3-8B-Instruct-MopeyMule.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | - |
-| Llama-3-8B-Instruct-MopeyMule.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | - |
+| [Llama-3-8B-Instruct-MopeyMule.Q2_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q2_K.gguf) | Q2_K | 3.18GB | ✅ Available | 🟢 IMatrix | 📦 No |
 | Llama-3-8B-Instruct-MopeyMule.Q2_K_S | Q2_K_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3-8B-Instruct-MopeyMule.IQ2_M | IQ2_M | - | ⏳ Processing | 🟢 IMatrix | - |
 | Llama-3-8B-Instruct-MopeyMule.IQ2_S | IQ2_S | - | ⏳ Processing | 🟢 IMatrix | - |
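The files marked ✅ Available above can be fetched directly. A minimal sketch of building the direct-download URL for one of the listed quants (the `/resolve/<revision>/` path is the standard Hugging Face counterpart to the `/blob/` links in the table; for actual downloads, `huggingface_hub.hf_hub_download` is the supported API and handles caching and retries):

```python
# Sketch: construct the direct-download URL for a quant from this repo.
# Repo id and filenames come from the table above; the resolve-URL layout
# is the standard Hugging Face pattern, not something specific to this repo.
REPO = "legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF"


def resolve_url(filename: str, revision: str = "main") -> str:
    """Return the direct-download URL for a file in this repo."""
    return f"https://huggingface.co/{REPO}/resolve/{revision}/{filename}"


print(resolve_url("Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf"))
```

In practice, `from huggingface_hub import hf_hub_download; hf_hub_download(repo_id=REPO, filename="Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf")` does the same resolution and downloads the file into the local cache.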