Has this been converted with the recent llama.cpp patches applied?
#1
by FlareRebellion - opened
I'm referring to: https://github.com/ggerganov/llama.cpp/pull/8676
Without this patch, coherence seems to break down at larger context sizes.
Yes!