Update README.md #17
opened by dkleine
- changed to 4k context length (Phi-3-mini-4k-instruct)
The information about the context length in the model card is contradictory. Does this GGUF model have a context length of 4k or 128k tokens?
nguyenbh changed pull request status to merged