DeusImperator's Collections
Mid-range models exl2 quants
Updated Sep 22
EXL2 quants of mid-range (20-40B) LLMs, usually around 4-5 bpw.
DeusImperator/magnum-v3-34b_exl2_4.6bpw • Updated Aug 27
DeusImperator/magnum-v3-34b_exl2_4.6bpw_rpcal_mk2 • Updated Aug 30 • 6
DeusImperator/magnum-32b-v2_exl2_4.7bpw_rpcal_mk2 • Text Generation • Updated Aug 27
DeusImperator/magnum-32b-v2_exl2_4.7bpw • Text Generation • Updated Aug 27 • 3
DeusImperator/Star-Command-R-32B-v1_exl2_4.5bpw • Updated Sep 3 • 11 • 1
DeusImperator/Qwen2.5-32B-Instruct_exl2_4.7bpw_rpcal_mk2 • Text Generation • Updated Sep 22 • 7
DeusImperator/Mistral-Small-Instruct-2409_exl2_6.8bpw_rpcal_mk2 • Text Generation • Updated Sep 19 • 5
DeusImperator/Big-Tiger-Gemma-27B-v1_exl2_5.6bpw • Updated Aug 27 • 5
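
Any of the repos above can be pulled and run with the exllamav2 runtime. A minimal sketch, assuming the exllamav2 and huggingface_hub packages are installed (exact class names can vary slightly between exllamav2 releases):

```python
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Fetch one of the quants listed above into the local HF cache.
model_dir = snapshot_download("DeusImperator/Star-Command-R-32B-v1_exl2_4.5bpw")

# Point the config at the downloaded EXL2 weights.
config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

# Load the model, splitting layers across available GPU memory.
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

print(generator.generate_simple("Write a short greeting.", settings, 128))
```

The snapshot_download step is optional; pointing config.model_dir at an already-cloned local copy of any repo in this collection works the same way.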