Only 8k context?

#3 opened by Karlota

Hello, I'm using the ColabKobold GPU notebook to run this model with SillyTavern. When connecting to the API, the settings show that the context limit is 8k, just like the original OpenChat. Any suggestions would help.
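In case it helps to check what the backend itself is advertising: a minimal sketch, assuming the KoboldAI United API's `/api/v1/config/max_context_length` endpoint and its `{"value": ...}` payload shape (the URL below is a placeholder for your Colab tunnel). SillyTavern generally picks up whatever limit the backend reports, so this shows how to read that value and try to raise it; whether a higher value actually works depends on the model and how the notebook loaded it.

```python
# Sketch only: assumes a ColabKobold/KoboldAI United backend; replace the URL
# with the tunnel address printed by the Colab notebook.
import requests

API_BASE = "https://your-colabkobold-url.trycloudflare.com/api/v1"  # placeholder

# Read the context limit the backend currently reports to clients like SillyTavern
current = requests.get(f"{API_BASE}/config/max_context_length").json()
print("Reported max context:", current.get("value"))

# Attempt to raise the limit (takes effect only if the loaded model supports it)
requests.put(f"{API_BASE}/config/max_context_length", json={"value": 16384})
```

If the value snaps back to 8192, the notebook is likely loading the model with an 8k limit, and the setting would need to be changed where the model is loaded rather than from the API.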
