Computer specifications

#1
by JOHNNY2020 - opened

Will this run on a CPU? If so, what are the RAM and processor requirements?
Can it be run offline?

Cognitive Computations org
edited Sep 3, 2023

Like any Llama model, this should run with llama.cpp. You would need about 32GB of RAM for 4-bit quantization (a bit less would probably be enough; 24GB would likely do).
The CPU is not terribly important as long as it is a reasonably recent one; you will be limited by your RAM speed more than by your CPU.
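As a rough back-of-the-envelope sketch of where the 24–32GB figure comes from: weight memory is roughly parameter count times bits per weight divided by 8, plus some headroom for the KV cache and runtime. The parameter count and overhead below are assumptions for illustration, not figures stated in this thread.

```python
def estimate_ram_gb(n_params_billion: float, bits_per_weight: int,
                    overhead_gb: float = 4.0) -> float:
    """Rough RAM estimate for running a quantized model on CPU.

    Weight memory = params * bits / 8 bytes; overhead_gb is an assumed
    allowance for KV cache and runtime buffers.
    """
    weights_gb = n_params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb + overhead_gb

# Hypothetical 33B-parameter model at 4-bit quantization:
print(round(estimate_ram_gb(33, 4), 1))  # → 20.5
```

This lands in the "a bit less than 32GB, probably 24GB would do" range mentioned above; larger contexts or less aggressive quantization push it higher.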

JOHNNY2020 changed discussion status to closed

Thank you !
