Context window size
#5 · opened by nickaubert
Does anyone know the size of the context window for the new 8x22B model?
I believe Mistral-7B-v0.1 has an 8k context, while Mixtral 8x7B has a 32k context. It will be interesting to see whether this new 8x22B model has a larger context as well.
It has a 65k context: 65,536 tokens.
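If you want to verify this yourself, here's a minimal sketch that reads the context window straight from the repo's config via the transformers library. The model id `mistralai/Mixtral-8x22B-v0.1` and the reliance on the `max_position_embeddings` field are my assumptions (that field is standard for Mistral-family configs):

```python
# Minimal sketch: read the context window from the model's config.json
# without downloading the weights. Assumes the model id
# "mistralai/Mixtral-8x22B-v0.1" and that the config exposes
# max_position_embeddings, as Mistral-family configs typically do.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("mistralai/Mixtral-8x22B-v0.1")

# max_position_embeddings is the maximum sequence length the model was
# configured for, i.e. its context window in tokens.
print(config.max_position_embeddings)  # expected: 65536
```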
65k is interesting! That's one of the largest context windows for an open-weights LLM.
nickaubert changed discussion status to closed