Inference error in transformers 4.42.1
#58 opened by kang1
The related issue is: https://github.com/huggingface/transformers/issues/31678
GLM-4 now supports transformers 4.44.0; you can use it with our latest model, glm-4-9b-chat.
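The version constraints stated in this thread (4.42.1 broken, 4.41.2 and 4.44.0 working) can be captured in a small guard before loading the model. This is a sketch, not part of the thread: the function name is mine, and it assumes every release from 4.42.0 up to (but not including) 4.44.0 is affected, which the thread does not explicitly confirm.

```python
from packaging.version import Version


def glm4_transformers_ok(tf_version: str) -> bool:
    """Return False for the transformers range reported in this thread
    to break GLM-4 inference (assumed: >= 4.42.0 and < 4.44.0)."""
    v = Version(tf_version)
    return not (Version("4.42.0") <= v < Version("4.44.0"))


# Example: check the installed transformers before loading GLM-4.
# import transformers
# if not glm4_transformers_ok(transformers.__version__):
#     raise RuntimeError("pin transformers==4.41.2 or upgrade to >= 4.44.0")
```

Using `packaging.version.Version` rather than string comparison matters here, since e.g. `"4.9" > "4.44"` lexicographically.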
zRzRzRzRzRzRzR changed discussion status to closed
Solution: Downgrade transformers to 4.41.2.
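The downgrade workaround from this thread, as a pip pin (run in the environment where inference fails):

```shell
# Pin transformers to the last release reported working in this thread
pip install "transformers==4.41.2"

# Confirm the installed version
python -c "import transformers; print(transformers.__version__)"
```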
> GLM-4 is now support 4.44.0, you can using with our latest model glm-4-9b-chat
This does not resolve the problem, nor does the sentence make grammatical sense.