legraphista committed
Commit a9f4f84 β€’ 1 Parent(s): 259afdc

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md +6 -6
README.md CHANGED
@@ -65,9 +65,9 @@ Link: [here](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IM
 | -------- | ---------- | --------- | ------ | ------------ | -------- |
 | [Llama-3-8B-Instruct-MopeyMule.Q8_0.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q8_0.gguf) | Q8_0 | 8.54GB | βœ… Available | βšͺ Static | πŸ“¦ No
 | [Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf) | Q6_K | 6.60GB | βœ… Available | βšͺ Static | πŸ“¦ No
-| Llama-3-8B-Instruct-MopeyMule.Q4_K | Q4_K | - | ⏳ Processing | 🟒 IMatrix | -
-| Llama-3-8B-Instruct-MopeyMule.Q3_K | Q3_K | - | ⏳ Processing | 🟒 IMatrix | -
-| Llama-3-8B-Instruct-MopeyMule.Q2_K | Q2_K | - | ⏳ Processing | 🟒 IMatrix | -
+| [Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf) | Q4_K | 4.92GB | βœ… Available | 🟒 IMatrix | πŸ“¦ No
+| [Llama-3-8B-Instruct-MopeyMule.Q3_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q3_K.gguf) | Q3_K | 4.02GB | βœ… Available | 🟒 IMatrix | πŸ“¦ No
+| [Llama-3-8B-Instruct-MopeyMule.Q2_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q2_K.gguf) | Q2_K | 3.18GB | βœ… Available | 🟒 IMatrix | πŸ“¦ No
 
 
 ### All Quants
@@ -79,18 +79,18 @@ Link: [here](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IM
 | [Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q6_K.gguf) | Q6_K | 6.60GB | βœ… Available | βšͺ Static | πŸ“¦ No
 | [Llama-3-8B-Instruct-MopeyMule.Q5_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q5_K.gguf) | Q5_K | 5.73GB | βœ… Available | βšͺ Static | πŸ“¦ No
 | [Llama-3-8B-Instruct-MopeyMule.Q5_K_S.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q5_K_S.gguf) | Q5_K_S | 5.60GB | βœ… Available | βšͺ Static | πŸ“¦ No
-| Llama-3-8B-Instruct-MopeyMule.Q4_K | Q4_K | - | ⏳ Processing | 🟒 IMatrix | -
+| [Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf) | Q4_K | 4.92GB | βœ… Available | 🟒 IMatrix | πŸ“¦ No
 | Llama-3-8B-Instruct-MopeyMule.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟒 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟒 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟒 IMatrix | -
-| Llama-3-8B-Instruct-MopeyMule.Q3_K | Q3_K | - | ⏳ Processing | 🟒 IMatrix | -
+| [Llama-3-8B-Instruct-MopeyMule.Q3_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q3_K.gguf) | Q3_K | 4.02GB | βœ… Available | 🟒 IMatrix | πŸ“¦ No
 | Llama-3-8B-Instruct-MopeyMule.Q3_K_L | Q3_K_L | - | ⏳ Processing | 🟒 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.Q3_K_S | Q3_K_S | - | ⏳ Processing | 🟒 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.IQ3_M | IQ3_M | - | ⏳ Processing | 🟒 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.IQ3_S | IQ3_S | - | ⏳ Processing | 🟒 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟒 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟒 IMatrix | -
-| Llama-3-8B-Instruct-MopeyMule.Q2_K | Q2_K | - | ⏳ Processing | 🟒 IMatrix | -
+| [Llama-3-8B-Instruct-MopeyMule.Q2_K.gguf](https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/blob/main/Llama-3-8B-Instruct-MopeyMule.Q2_K.gguf) | Q2_K | 3.18GB | βœ… Available | 🟒 IMatrix | πŸ“¦ No
 | Llama-3-8B-Instruct-MopeyMule.Q2_K_S | Q2_K_S | - | ⏳ Processing | 🟒 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.IQ2_M | IQ2_M | - | ⏳ Processing | 🟒 IMatrix | -
 | Llama-3-8B-Instruct-MopeyMule.IQ2_S | IQ2_S | - | ⏳ Processing | 🟒 IMatrix | -
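As a usage note (not part of the commit itself): every quant in the tables above follows the same `Llama-3-8B-Instruct-MopeyMule.<QUANT>.gguf` filename pattern, so download URLs can be derived mechanically. A minimal sketch, assuming the Hub's standard `resolve/<revision>` raw-file layout (the download counterpart of the `blob/main` links in the README); in practice `huggingface_hub.hf_hub_download` handles this, including caching and resume.

```python
# Sketch: compose the raw-download URL for any quant listed in the tables above.
# Assumption: Hugging Face serves raw files at /<repo>/resolve/<revision>/<file>,
# mirroring the /blob/main/ browse links in this README.
REPO_ID = "legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF"

def quant_filename(quant: str) -> str:
    # Filename pattern used consistently throughout the quant tables.
    return f"Llama-3-8B-Instruct-MopeyMule.{quant}.gguf"

def resolve_url(quant: str, revision: str = "main") -> str:
    # Raw-file endpoint for direct download (vs. the /blob/ page for browsing).
    return f"https://huggingface.co/{REPO_ID}/resolve/{revision}/{quant_filename(quant)}"

print(resolve_url("Q4_K"))
# -> https://huggingface.co/legraphista/Llama-3-8B-Instruct-MopeyMule-IMat-GGUF/resolve/main/Llama-3-8B-Instruct-MopeyMule.Q4_K.gguf
```

The same filename feeds `hf_hub_download(repo_id=REPO_ID, filename=quant_filename("Q4_K"))` if you prefer the client library over a direct HTTP fetch.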