legraphista committed
Commit 60a9b2b • 1 Parent(s): efec167

Upload README.md with huggingface_hub

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -79,7 +79,7 @@ Link: [here](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/mai
  | [glm-4-9b-chat.Q5_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q5_K.gguf) | Q5_K | 7.14GB | ✅ Available | ⚪ Static | 📦 No
  | [glm-4-9b-chat.Q5_K_S.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q5_K_S.gguf) | Q5_K_S | 6.69GB | ✅ Available | ⚪ Static | 📦 No
  | [glm-4-9b-chat.Q4_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q4_K.gguf) | Q4_K | 6.25GB | ✅ Available | 🟢 IMatrix | 📦 No
- | glm-4-9b-chat.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟢 IMatrix | -
+ | [glm-4-9b-chat.Q4_K_S.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q4_K_S.gguf) | Q4_K_S | 5.75GB | ✅ Available | 🟢 IMatrix | 📦 No
  | glm-4-9b-chat.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟢 IMatrix | -
  | glm-4-9b-chat.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | -
  | [glm-4-9b-chat.Q3_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q3_K.gguf) | Q3_K | 5.06GB | ✅ Available | 🟢 IMatrix | 📦 No