ikawrakow/various-2bit-sota-gguf
GGUF · Inference Endpoints · License: apache-2.0
Files and versions
1 contributor · History: 11 commits
Latest commit: Update README.md (ffe00e8) by ikawrakow, 11 months ago
| File                               | Size      | LFS | Last commit                                   | Updated       |
|------------------------------------|-----------|-----|-----------------------------------------------|---------------|
| .gitattributes                     | 1.56 kB   |     | Adding first set of models                    | 11 months ago |
| README.md                          | 384 Bytes |     | Update README.md                              | 11 months ago |
| llama-v2-13b-2.17bpw.gguf          | 3.54 GB   | LFS | Adding first set of models                    | 11 months ago |
| llama-v2-13b-2.39bpw.gguf          | 3.89 GB   | LFS | Adding 2.31-bpw base quantized models         | 11 months ago |
| llama-v2-70b-2.12bpw.gguf          | 18.3 GB   | LFS | Adding more                                   | 11 months ago |
| llama-v2-70b-2.36bpw.gguf          | 20.3 GB   | LFS | Adding 2.31-bpw base quantized models         | 11 months ago |
| llama-v2-7b-2.20bpw.gguf           | 1.85 GB   | LFS | Adding first set of models                    | 11 months ago |
| llama-v2-7b-2.42bpw.gguf           | 2.03 GB   | LFS | Adding 2.31-bpw base quantized models         | 11 months ago |
| mistral-7b-2.20bpw.gguf            | 1.99 GB   | LFS | Adding first set of models                    | 11 months ago |
| mistral-7b-2.43bpw.gguf            | 2.2 GB    | LFS | Adding 2.31-bpw base quantized models         | 11 months ago |
| mixtral-8x7b-2.10bpw.gguf          | 12.3 GB   | LFS | Adding Mixtral-8x7b                           | 11 months ago |
| mixtral-8x7b-2.34bpw.gguf          | 13.7 GB   | LFS | Adding 2.31-bpw base quantized models         | 11 months ago |
| mixtral-instruct-8x7b-2.10bpw.gguf | 12.3 GB   | LFS | Adding Mixtral-instruct-8x7b                  | 11 months ago |
| nous-hermes-2-10.7b-2.18bpw.gguf   | 2.92 GB   | LFS | Adding Nous-Hermes-2-SOLAR-10.7B 2-bit quants | 11 months ago |
| nous-hermes-2-10.7b-2.70bpw.gguf   | 3.62 GB   | LFS | Adding Nous-Hermes-2-SOLAR-10.7B 2-bit quants | 11 months ago |
| nous-hermes-2-34b-2.16bpw.gguf     | 9.31 GB   | LFS | Adding Nous-Hermes-2-Yi-34B 2-bit quants      | 11 months ago |
| nous-hermes-2-34b-2.69bpw.gguf     | 11.6 GB   | LFS | Adding Nous-Hermes-2-Yi-34B 2-bit quants      | 11 months ago |
| rocket-3b-2.31bpw.gguf             | 808 MB    | LFS | Adding Rocket-3b 2-bit quants                 | 11 months ago |
| rocket-3b-2.76bpw.gguf             | 967 MB    | LFS | Adding Rocket-3b 2-bit quants                 | 11 months ago |
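
The GGUF files above can also be fetched programmatically instead of through the web UI. Below is a minimal sketch, assuming Python with the `huggingface_hub` package installed and a llama.cpp build recent enough to read the 2-bit quantization formats in this repo; the chosen filename (`mistral-7b-2.20bpw.gguf`) is taken from the table, and the llama.cpp invocation in the comments is illustrative rather than tied to a specific version.

```python
# Sketch: download one of the 2-bit GGUF models from this repo.
# Assumes `pip install huggingface_hub` has been run.
from huggingface_hub import hf_hub_download

# The filename must match the file listing exactly; the call returns
# the local path of the cached download.
model_path = hf_hub_download(
    repo_id="ikawrakow/various-2bit-sota-gguf",
    filename="mistral-7b-2.20bpw.gguf",
)
print(model_path)

# The resulting .gguf file can then be passed to a llama.cpp binary, e.g.
#   ./llama-cli -m <model_path> -p "Hello"
# (exact binary name and flags depend on the llama.cpp version you build).
```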