GPU Requirement
#9
by abrehmaaan - opened
I have an Amazon EC2 instance with these specs:
Instance Size: g5.2xlarge
GPU: 1
GPU Memory (GiB): 24
vCPUs: 8
Memory (GiB): 32
Instance Storage (GB): 1 x 450 NVMe SSD
Network Bandwidth (Gbps): Up to 10
EBS Bandwidth (Gbps): Up to 3.5
Can I run this Llama-2-7B-GPTQ version on it? Does the model require more than these specs? Are there any other open-source LLMs that can run on this hardware?
@TheBloke Please reply
Yes, definitely. You could probably run a 13B GPTQ model as well.
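For rough intuition on why this fits (a back-of-envelope sketch, not from the thread): at 4-bit GPTQ quantization, the weights of a 7B model take only a few GB, well under the 24 GiB of the g5.2xlarge's A10G GPU. The arithmetic below assumes 4-bit weights and deliberately ignores KV-cache, activation, and CUDA-context overhead, so treat the numbers as lower bounds:

```python
# Rough lower-bound VRAM estimate for quantized model weights.
# Assumption: uniform 4-bit quantization; real usage adds several GB
# of overhead for the KV cache, activations, and CUDA context.

def weight_vram_gb(n_params_billion: float, bits: int = 4) -> float:
    """Approximate GPU memory (GB) needed just for the weights."""
    return n_params_billion * 1e9 * bits / 8 / 1e9

print(weight_vram_gb(7))   # 7B at 4-bit  -> 3.5 GB
print(weight_vram_gb(13))  # 13B at 4-bit -> 6.5 GB
```

Both fit comfortably in 24 GiB, which is why a 13B GPTQ model is also feasible on this instance.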