bartowski/magnum-v4-72b-GGUF

Text Generation · GGUF · English · chat · Inference Endpoints · imatrix · conversational
License: apache-2.0
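
These are llama.cpp-compatible GGUF quantizations of a chat model, so any llama.cpp-based runtime can serve them. As a minimal sketch, here is one way to run a downloaded quant through the llama-cpp-python bindings; the quant choice, context size, and GPU-layer count below are illustrative assumptions, not settings from this repo:

    # Minimal sketch: chat with a local quant via llama-cpp-python.
    # Assumes `pip install llama-cpp-python` and one .gguf file from this
    # repo already on disk; path and parameters are illustrative.
    from llama_cpp import Llama

    llm = Llama(
        model_path="magnum-v4-72b-Q4_K_M.gguf",  # any quant from the listing below
        n_ctx=4096,          # context window (illustrative)
        n_gpu_layers=-1,     # offload all layers to GPU when available
    )

    reply = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Hello!"}],
        max_tokens=128,
    )
    print(reply["choices"][0]["message"]["content"])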

magnum-v4-72b-GGUF (branch: main) · 1 contributor · History: 24 commits
Latest commit: Update metadata with huggingface_hub · bartowski · ca33bb8 (verified) · 16 days ago
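
A note on the first four entries below: quants too large for a single upload (Hugging Face enforces a 50 GB per-file limit) are stored as folders holding the model split into several GGUF shards. llama.cpp and its bindings load the remaining shards automatically when given the path to the first one; the shard naming inside these folders (for example, something like magnum-v4-72b-Q6_K-00001-of-00002.gguf) is an assumption about this repo's layout, since the listing shows only the folder names.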

Name                        Size     LFS  Commit message                                          Updated
magnum-v4-72b-Q5_K_M/       -        -    Upload folder using huggingface_hub                     16 days ago
magnum-v4-72b-Q5_K_S/       -        -    Upload folder using huggingface_hub                     16 days ago
magnum-v4-72b-Q6_K/         -        -    Upload folder using huggingface_hub                     16 days ago
magnum-v4-72b-Q8_0/         -        -    Upload folder using huggingface_hub                     16 days ago
.gitattributes              3.33 kB  -    Upload magnum-v4-72b.imatrix with huggingface_hub       16 days ago
README.md                   8.62 kB  -    Update metadata with huggingface_hub                    16 days ago
magnum-v4-72b-IQ1_M.gguf    23.7 GB  LFS  Upload magnum-v4-72b-IQ1_M.gguf with huggingface_hub    16 days ago
magnum-v4-72b-IQ2_M.gguf    29.3 GB  LFS  Upload magnum-v4-72b-IQ2_M.gguf with huggingface_hub    16 days ago
magnum-v4-72b-IQ2_XS.gguf   27.1 GB  LFS  Upload magnum-v4-72b-IQ2_XS.gguf with huggingface_hub   16 days ago
magnum-v4-72b-IQ2_XXS.gguf  25.5 GB  LFS  Upload magnum-v4-72b-IQ2_XXS.gguf with huggingface_hub  16 days ago
magnum-v4-72b-IQ3_M.gguf    35.5 GB  LFS  Upload magnum-v4-72b-IQ3_M.gguf with huggingface_hub    16 days ago
magnum-v4-72b-IQ3_XXS.gguf  31.8 GB  LFS  Upload magnum-v4-72b-IQ3_XXS.gguf with huggingface_hub  16 days ago
magnum-v4-72b-IQ4_XS.gguf   39.7 GB  LFS  Upload magnum-v4-72b-IQ4_XS.gguf with huggingface_hub   16 days ago
magnum-v4-72b-Q2_K.gguf     29.8 GB  LFS  Upload magnum-v4-72b-Q2_K.gguf with huggingface_hub     16 days ago
magnum-v4-72b-Q2_K_L.gguf   31 GB    LFS  Upload magnum-v4-72b-Q2_K_L.gguf with huggingface_hub   16 days ago
magnum-v4-72b-Q3_K_L.gguf   39.5 GB  LFS  Upload magnum-v4-72b-Q3_K_L.gguf with huggingface_hub   16 days ago
magnum-v4-72b-Q3_K_M.gguf   37.7 GB  LFS  Upload magnum-v4-72b-Q3_K_M.gguf with huggingface_hub   16 days ago
magnum-v4-72b-Q3_K_S.gguf   34.5 GB  LFS  Upload magnum-v4-72b-Q3_K_S.gguf with huggingface_hub   16 days ago
magnum-v4-72b-Q3_K_XL.gguf  40.6 GB  LFS  Upload magnum-v4-72b-Q3_K_XL.gguf with huggingface_hub  16 days ago
magnum-v4-72b-Q4_0.gguf     41.4 GB  LFS  Upload magnum-v4-72b-Q4_0.gguf with huggingface_hub     16 days ago
magnum-v4-72b-Q4_K_M.gguf   47.4 GB  LFS  Upload magnum-v4-72b-Q4_K_M.gguf with huggingface_hub   16 days ago
magnum-v4-72b-Q4_K_S.gguf   43.9 GB  LFS  Upload magnum-v4-72b-Q4_K_S.gguf with huggingface_hub   16 days ago
magnum-v4-72b.imatrix       25.2 MB  LFS  Upload magnum-v4-72b.imatrix with huggingface_hub       16 days ago
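
To pull one of these files programmatically, the same huggingface_hub client that produced the commits above can download them. A minimal sketch follows; the Q4_K_M and Q6_K choices and the models/ directory are illustrative, not recommendations from this repo:

    # Minimal sketch: download quants from this repo with huggingface_hub.
    # Assumes `pip install huggingface_hub`; chosen quants are illustrative.
    from huggingface_hub import hf_hub_download, snapshot_download

    # A quant stored as a single .gguf file:
    path = hf_hub_download(
        repo_id="bartowski/magnum-v4-72b-GGUF",
        filename="magnum-v4-72b-Q4_K_M.gguf",
        local_dir="models",
    )
    print(path)

    # A quant stored as a folder of split shards (e.g. Q6_K):
    snapshot_download(
        repo_id="bartowski/magnum-v4-72b-GGUF",
        allow_patterns=["magnum-v4-72b-Q6_K/*"],
        local_dir="models",
    )

The magnum-v4-72b.imatrix file is the importance matrix used to calibrate these quantizations (it feeds llama.cpp's llama-quantize step via its --imatrix option); you only need it to re-quantize the model yourself, not to run any of the quants listed above.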