context window size?
#10
by ichigoberry · opened
I couldn't find info on the context window size. Is there a size we can rely on?
As far as I can tell, this is only made for an 8k context. Granted, the vocabulary is massive at 256k, which helps reduce the number of tokens code takes up, but still...
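For what it's worth, the effect of the large vocabulary is easy to measure. Here's a minimal sketch using the transformers library; the repo ID is a placeholder, since the thread doesn't name the exact checkpoint, so substitute the actual model ID:

```python
from transformers import AutoTokenizer

# Placeholder repo ID -- replace with the model this discussion is about.
tokenizer = AutoTokenizer.from_pretrained("org/model-id")

snippet = "def fib(n):\n    return n if n < 2 else fib(n - 1) + fib(n - 2)\n"
token_ids = tokenizer.encode(snippet)

# A larger vocabulary generally means fewer tokens per line of code,
# so more source fits inside the same 8k-token window.
print(f"vocab size: {tokenizer.vocab_size}")
print(f"tokens used for snippet: {len(token_ids)}")
```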
Hi @ichigoberry, all the models have the same 8k-token context size, which is also mentioned in this doc. Thank you.
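If you want to verify this programmatically rather than relying on the docs, one way (assuming the transformers library and, again, a placeholder repo ID) is to read the trained context length from the model config:

```python
from transformers import AutoConfig

# Placeholder repo ID -- replace with the actual checkpoint.
config = AutoConfig.from_pretrained("org/model-id")

# Many decoder-only models expose their trained context length here;
# for this model it should report 8192.
print(getattr(config, "max_position_embeddings", None))
```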