granite-7b-lab-GGUF / config.json
feat: 4-bit quantized model (commit 1344fef)
{
  "model_type": "llama"
}
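
This config.json only declares the architecture family, so downstream tools treat the GGUF weights as a Llama-style model. A minimal sketch of how a script might read the file before handing the weights to a Llama-compatible loader (the local path is an assumption for illustration, not part of this repository):

```python
import json
from pathlib import Path

# Assumed local checkout path; adjust to wherever the repo was downloaded.
config_path = Path("granite-7b-lab-GGUF/config.json")

with config_path.open() as f:
    config = json.load(f)

# The repository's config.json contains a single key: "model_type".
if config.get("model_type") != "llama":
    raise ValueError(f"Unexpected model_type: {config.get('model_type')!r}")

print(f"model_type: {config['model_type']}")
```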