Update tokenization_chatglm.py

#72

https://github.com/huggingface/transformers/blob/8f2b6d5e3dcf40ab0d01f3c8117d1df09e465616/src/transformers/tokenization_utils_base.py#L43

TensorType should be imported from transformers.utils instead of torch. Otherwise, the tokenizer cannot be used on its own in an environment without PyTorch installed, for example when we only need to count tokens.
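A minimal sketch of the suggested change in tokenization_chatglm.py; the exact form of the original import and the model repo id in the usage example are assumptions, not taken from this PR:

```python
# Before (assumed): importing TensorType from torch makes the tokenizer
# depend on PyTorch even for pure token counting
# from torch import TensorType

# After: TensorType is a framework-agnostic enum exported by transformers.utils,
# which is also where tokenization_utils_base.py imports it from
from transformers.utils import TensorType
```

With that change, counting tokens works in a torch-free environment:

```python
from transformers import AutoTokenizer

# Hypothetical repo id for illustration; trust_remote_code loads tokenization_chatglm.py
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True)
num_tokens = len(tokenizer.encode("Hello, world!"))
```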

What is tokenization_chatglm.py used for?

zRzRzRzRzRzRzR changed pull request status to merged
