Great work, but why only 2048 context length?

#4
opened by SamuelAzran

8k is the new 2k ;) 8k allows many more use cases that simply cannot work without the extra context.
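
For reference, here is a minimal sketch of how one could check a model's configured context window with the transformers library; the model ID below is just a placeholder, not necessarily the checkpoint discussed here.

```python
# Minimal sketch: inspect a model's configured context window.
# The model ID is a placeholder; substitute the checkpoint you are using.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("openchat/openchat")  # placeholder model ID
print(config.max_position_embeddings)  # e.g. 2048 for a 2k-context base model
```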

OpenChat org

Sorry for the inconvenience; we plan to experiment with longer-context base models in the future.
