batmac/gpt2-gguf
Tags: GGUF · Inference Endpoints
Branch: main · 1 contributor · History: 3 commits
Latest commit: batmac · "Upload README.md with huggingface_hub" · c1da8c3 (verified) · 6 months ago
All .gguf files below are tracked with Git LFS and were added in the commit "Upload folder using huggingface_hub", 6 months ago.

File                       Size
.gitattributes             1.84 kB   (Upload folder using huggingface_hub, 6 months ago)
README.md                  19 Bytes  (Upload README.md with huggingface_hub, 6 months ago)
ggml-model-IQ3_M.gguf      94.2 MB
ggml-model-IQ3_S.gguf      90.1 MB
ggml-model-IQ3_XS.gguf     89.2 MB
ggml-model-IQ3_XXS.gguf    83 MB
ggml-model-IQ4_NL.gguf     107 MB
ggml-model-IQ4_XS.gguf     103 MB
ggml-model-Q2_K.gguf       81.2 MB
ggml-model-Q3_K.gguf       97.7 MB
ggml-model-Q3_K_L.gguf     102 MB
ggml-model-Q3_K_M.gguf     97.7 MB
ggml-model-Q3_K_S.gguf     90.1 MB
ggml-model-Q4_0.gguf       107 MB
ggml-model-Q4_1.gguf       114 MB
ggml-model-Q4_K.gguf       113 MB
ggml-model-Q4_K_M.gguf     113 MB
ggml-model-Q4_K_S.gguf     107 MB
ggml-model-Q5_0.gguf       122 MB
ggml-model-Q5_1.gguf       130 MB
ggml-model-Q5_K.gguf       127 MB
ggml-model-Q5_K_M.gguf     127 MB
ggml-model-Q5_K_S.gguf     122 MB
ggml-model-Q6_K.gguf       138 MB
ggml-model-Q8_0.gguf       178 MB
ggml-model-f16.gguf        330 MB
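Any file in the listing can be fetched directly from the Hub. A minimal sketch that builds the direct-download ("resolve") URL for one of the quantized files, assuming the Hub's standard `https://huggingface.co/<repo_id>/resolve/<revision>/<filename>` URL pattern; the `huggingface_hub` client's `hf_hub_download` wraps the same endpoint with local caching:

```python
# Build the direct download URL for a file in this repo, following the
# Hub's resolve-URL pattern: https://huggingface.co/<repo>/resolve/<rev>/<file>.
REPO_ID = "batmac/gpt2-gguf"
REVISION = "main"  # branch shown in the listing; a commit hash also works

def gguf_url(filename: str, repo_id: str = REPO_ID, revision: str = REVISION) -> str:
    """Return the direct-download URL for one file in the repo."""
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

url = gguf_url("ggml-model-Q4_K_M.gguf")
print(url)
# Equivalent with the client library (downloads and caches the file):
#   from huggingface_hub import hf_hub_download
#   path = hf_hub_download(repo_id=REPO_ID, filename="ggml-model-Q4_K_M.gguf")
```

The resulting GGUF file can then be loaded by llama.cpp or any other GGUF-aware runtime.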
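The size column also gives a rough effective bit-width for each quantization. Taking the f16 file (330 MB, about 16 bits per weight) as the baseline, the ratio of a quantized file's size to the f16 size, times 16, approximates its bits per weight. This is only a sketch: GGUF files also carry metadata, and some tensors (e.g. embeddings) may be kept at higher precision, so the true per-tensor bit-widths differ.

```python
F16_MB = 330.0  # ggml-model-f16.gguf size from the listing (~16-bit baseline)

# Sizes (MB) of a few quantized files, copied from the listing above.
sizes_mb = {"Q2_K": 81.2, "Q4_K_M": 113.0, "Q8_0": 178.0}

# Approximate bits per weight: scale the 16-bit baseline by the size ratio.
bits_per_weight = {name: 16.0 * mb / F16_MB for name, mb in sizes_mb.items()}

for name, bpw in bits_per_weight.items():
    print(f"{name}: ~{bpw:.1f} bits/weight")
```

The estimates track the names: Q2_K lands near 4 bits, Q4_K_M near 5.5, Q8_0 near 8.6, each slightly above its nominal width because of the extra metadata and mixed-precision tensors.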