Aeala / Alpaca-elina-65b-4bit
Tags: Text Generation · Transformers · PyTorch · llama · text-generation-inference · Inference Endpoints
Discussion #1: Alpaca-elina-65b-2bit on 24GB VRAM?
Opened by Zicara on May 5, 2023
Zicara (May 5, 2023):
Could a 2-bit quantization of this model (accepting the degraded output quality) fit in 24GB of VRAM?
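As a rough back-of-the-envelope check (a sketch only: it counts raw weight storage for an assumed 65B-parameter model and ignores the KV cache, activations, and the per-group scale/zero-point overhead that real quantization formats add), the weight memory can be estimated as:

```python
def weight_memory_gib(num_params: float, bits_per_weight: float) -> float:
    """Estimate raw weight storage in GiB for a quantized model.

    Note: real formats (e.g. GPTQ-style group quantization) store extra
    scales/zero-points per group, so actual usage is somewhat higher.
    """
    return num_params * bits_per_weight / 8 / 1024**3

# 65B parameters, weights only (no KV cache or runtime overhead)
two_bit = weight_memory_gib(65e9, 2)   # ~15.1 GiB
four_bit = weight_memory_gib(65e9, 4)  # ~30.3 GiB
print(f"2-bit: {two_bit:.1f} GiB, 4-bit: {four_bit:.1f} GiB")
```

Under these assumptions, 2-bit weights (~15 GiB) would nominally leave headroom on a 24GB card for the KV cache and runtime buffers, whereas 4-bit weights (~30 GiB) clearly would not fit.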
Zicara changed discussion status to closed (Sep 29, 2023).