How much VRAM does this model need?
#10
by maximxls
I have a 3090 Ti (24 GB VRAM) and was wondering what I can run locally. How much VRAM is needed to run inference without parameter offloading? Does finetuning need more, and if so, by how much? I guess I can at least run the 3B variant, but can I finetune it? Below is my rough back-of-envelope estimate, as shown in the sketch after this paragraph.
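A minimal sketch of the usual rule of thumb (my own assumption, not an official figure for this model): weight memory is roughly parameter count times bytes per parameter (2 bytes for fp16/bf16, 1 for int8, about 0.5 for 4-bit), plus some overhead for activations and the KV cache; full finetuning with Adam typically needs several times the inference footprint unless you use something like LoRA/QLoRA. The sizes and overhead factors here are illustrative guesses.

```python
def estimate_vram_gb(params_billion: float,
                     bytes_per_param: float = 2.0,
                     overhead_factor: float = 1.2) -> float:
    """Very rough estimate: weight memory times an overhead factor
    for activations / KV cache (inference) or gradients + optimizer
    states (full finetuning). Not exact, just a sanity check."""
    weights_gb = params_billion * 1e9 * bytes_per_param / 1024**3
    return weights_gb * overhead_factor

# Inference for an assumed 3B model in bf16: ~6-7 GB
print(f"3B inference (bf16): ~{estimate_vram_gb(3):.1f} GB")

# Full finetuning with Adam often needs roughly 4x the weight memory
# (weights + gradients + fp32 optimizer states), so ~20+ GB for 3B.
print(f"3B full finetune (rough): ~{estimate_vram_gb(3, overhead_factor=4.0):.1f} GB")
```

If these assumed numbers are in the right ballpark, a 24 GB card should handle 3B inference comfortably, and full finetuning of the 3B variant would be tight but possibly feasible with mixed precision or parameter-efficient methods.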
christopher changed discussion status to closed