Wrong "max_position_embeddings" value?
#1 opened by AaronFeng753
On the model card:
Model Details
...
Context length: 128K
But config.json shows:
"max_position_embeddings": 8192,
cc @shivi (model just got released though lol)
Hi, our implementation is based on the Llama implementation, which statically materializes a buffer whose size scales with max_position_embeddings; allocating it for 128k context would not be feasible. The model itself would definitely support 128k context with a better implementation.
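To put rough numbers on that, here is a back-of-the-envelope sketch. It assumes the buffer in question is a square (max_position_embeddings × max_position_embeddings) float32 mask, as in some transformers Llama versions; the exact shape and dtype depend on the library version:

```python
# Rough memory cost of a statically materialized
# (n_positions x n_positions) attention-mask buffer.
# Assumes float32 (4 bytes per element).
def mask_buffer_gib(n_positions: int, bytes_per_elem: int = 4) -> float:
    return n_positions ** 2 * bytes_per_elem / 2 ** 30

print(f"{mask_buffer_gib(8192):.2f} GiB")    # 0.25 GiB at 8k positions: fine
print(f"{mask_buffer_gib(131072):.2f} GiB")  # 64.00 GiB at 128k: not feasible
```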
Hope that clarifies things. Closing the issue for now; feel free to reopen if other questions come up.
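If you want to try longer contexts in the meantime, one option is to raise the limit at load time. A minimal sketch, assuming a standard transformers setup; "org/model" is a placeholder for the actual repo, and whether quality holds at 128k depends on the model's position-embedding scheme:

```python
from transformers import AutoConfig, AutoModelForCausalLM

# Placeholder repo id; substitute the actual model.
repo = "org/model"

config = AutoConfig.from_pretrained(repo)
config.max_position_embeddings = 131072  # override the 8192 default

model = AutoModelForCausalLM.from_pretrained(repo, config=config)
```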
shivi changed discussion status to closed