Quick question: is this a 2048 or 4096 context-size model in Ooba, using ctransformers?
by Goldenblood56
It defaults to 4096, but I don't know if that is an error or not.
This is a Llama 2 model, so it's 4096. If I put 2048 in the README, that was a README error.
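For reference, here is a minimal sketch (assuming the ctransformers Python API rather than Ooba's loader UI) of loading a Llama 2 model with the context window set explicitly to 4096. The model path is a placeholder, not the actual file from this repo:

```python
from ctransformers import AutoModelForCausalLM

# Load a Llama 2 model and set the context window explicitly.
# "path/to/llama-2-model.bin" is a hypothetical placeholder path.
llm = AutoModelForCausalLM.from_pretrained(
    "path/to/llama-2-model.bin",
    model_type="llama",
    context_length=4096,  # Llama 2's native context size
)

# Simple generation call to confirm the model loads and responds.
print(llm("Q: What is the capital of France?\nA:", max_new_tokens=32))
```

In Ooba itself, the equivalent is setting the context length field for the ctransformers loader to 4096 before loading the model.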