Inference error in transformers 4.42.1

#58
by kang1 - opened
Knowledge Engineering Group (KEG) & Data Mining at Tsinghua University org

GLM-4 now supports transformers 4.44.0; you can use it with our latest model, glm-4-9b-chat.
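
For reference, here is a minimal sketch of chat inference with glm-4-9b-chat on transformers >= 4.44.0. The repo id THUDM/glm-4-9b-chat, the CUDA device, and the sample prompt are assumptions for illustration, not details from this thread:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # assumption: a CUDA GPU is available

# trust_remote_code is needed because GLM-4 ships custom model/tokenizer code
tokenizer = AutoTokenizer.from_pretrained("THUDM/glm-4-9b-chat", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "THUDM/glm-4-9b-chat",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).to(device).eval()

# Build a chat-formatted prompt and generate a reply
inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Hello, what can you do?"}],
    add_generation_prompt=True,
    tokenize=True,
    return_tensors="pt",
    return_dict=True,
).to(device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=128)
    reply = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
print(reply)
```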

zRzRzRzRzRzRzR changed discussion status to closed

Solution: Downgrade transformers to 4.41.2.
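
If you use that workaround, a quick sanity check can confirm the interpreter actually sees the pinned version before running inference. This is only a sketch; it assumes the pin was installed, e.g. with `pip install transformers==4.41.2`:

```python
# Sketch: verify the pinned transformers version is the one the interpreter loads.
from importlib.metadata import version

installed = version("transformers")
if installed != "4.41.2":
    raise RuntimeError(
        f"Expected transformers 4.41.2 (pinned to avoid the 4.42.x inference error), "
        f"found {installed}"
    )
print(f"transformers {installed} OK")
```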

GLM-4 now supports transformers 4.44.0; you can use it with our latest model, glm-4-9b-chat.

This does not resolve the problem.