BeaverLegacy/Coomand-R-35B-v1-GGUF
Tags: GGUF · Not-For-All-Audiences · Inference Endpoints
License: cc-by-nc-4.0
Files and versions
Branch: main · 1 contributor · History: 43 commits
Latest commit: Update README.md by TheDrummer (bb4fd0b, verified, 7 months ago)
| File | Size | LFS | Last commit message | Last updated |
|------|------|-----|---------------------|--------------|
| .gitattributes | 3.71 kB | | Rename Coomand-R-35B-v1-Q2_K.gguf to Coomand-R-35B-v1-OLD_Q2_K.gguf | 7 months ago |
| Coomand-R-35B-v1-OLD_Q2_K.gguf | 13.8 GB | LFS | Rename Coomand-R-35B-v1-Q2_K.gguf to Coomand-R-35B-v1-OLD_Q2_K.gguf | 7 months ago |
| Coomand-R-35B-v1-OLD_Q3_K_M.gguf | 17.6 GB | LFS | Rename Coomand-R-35B-v1-Q3_K_M.gguf to Coomand-R-35B-v1-OLD_Q3_K_M.gguf | 7 months ago |
| Coomand-R-35B-v1-OLD_Q4_K_M.gguf | 21.5 GB | LFS | Rename Coomand-R-35B-v1-Q4_K_M.gguf to Coomand-R-35B-v1-OLD_Q4_K_M.gguf | 7 months ago |
| Coomand-R-35B-v1-OLD_Q5_K_M.gguf | 25 GB | LFS | Rename Coomand-R-35B-v1-Q5_K_M.gguf to Coomand-R-35B-v1-OLD_Q5_K_M.gguf | 7 months ago |
| Coomand-R-35B-v1-OLD_Q6_K.gguf | 28.7 GB | LFS | Rename Coomand-R-35B-v1-Q6_K.gguf to Coomand-R-35B-v1-OLD_Q6_K.gguf | 7 months ago |
| Coomand-R-35B-v1-OLD_Q8_0.gguf | 37.2 GB | LFS | Rename Coomand-R-35B-v1-Q8_0.gguf to Coomand-R-35B-v1-OLD_Q8_0.gguf | 7 months ago |
| Coomand-R-35B-v1-Q2_K.gguf | 13.8 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| Coomand-R-35B-v1-Q3_K_M.gguf | 17.6 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| Coomand-R-35B-v1-Q4_K_M.gguf | 21.5 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| Coomand-R-35B-v1-Q5_K_M.gguf | 25 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| Coomand-R-35B-v1-Q6_K.gguf | 28.7 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| Coomand-R-35B-v1-Q8_0.gguf | 37.2 GB | LFS | Upload folder using huggingface_hub | 7 months ago |
| README.md | 13.7 kB | | Update README.md | 7 months ago |
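
The quantized weights above are GGUF files tracked with Git LFS, uploaded via huggingface_hub. As a reference for fetching a single quant locally, the snippet below is a minimal sketch using huggingface_hub's hf_hub_download; it assumes huggingface_hub is installed and uses the repo id and a filename taken from the listing above. Pick a different quantization (e.g. Q2_K or Q5_K_M) to trade file size against quality.

```python
# Minimal sketch: download one quantization from this repo with huggingface_hub.
# Assumes `pip install huggingface_hub`; repo id and filename are taken from the
# file listing above. Swap the filename for another quant to fit your hardware.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="BeaverLegacy/Coomand-R-35B-v1-GGUF",
    filename="Coomand-R-35B-v1-Q4_K_M.gguf",  # 21.5 GB per the listing
)
print(f"GGUF file saved to: {gguf_path}")
```

The returned path points into the local Hugging Face cache and can be passed directly to any GGUF-compatible runtime, for example llama.cpp's -m flag or llama-cpp-python's Llama(model_path=...).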