TypeError: BFloat16 is not supported on MPS

#11
by hiepsiga

Hi there,

I appreciate your project.

I tried to install the project and run the example code below:

from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained("THUDM/LongWriter-glm4-9b", trust_remote_code=True)
# device_map="auto" resolves to MPS on this machine, which is where the error below comes from
model = AutoModelForCausalLM.from_pretrained("THUDM/LongWriter-glm4-9b", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
model = model.eval()
query = "Write a 10000-word China travel guide"
response, history = model.chat(tokenizer, query, history=[], max_new_tokens=32768, temperature=0.5)
print(response)

But it fails with TypeError: BFloat16 is not supported on MPS.

I think it might have something to do with the version of macOS I'm running, which is Monterey 12.7.2. It could also be related to the Intel x86 CPU, since we still use iMacs with Intel chips. Because of those limitations, I could only install torch 2.2.2, since 2.3 and above no longer support Intel x86 Macs (https://github.com/pytorch/pytorch/issues/114602).
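For what it's worth, here is a quick probe to confirm whether bfloat16 works on MPS at all in this build (a minimal sketch using only standard torch APIs):

import torch

# Check whether the MPS backend is compiled in and usable on this machine.
print("MPS built:", torch.backends.mps.is_built())
print("MPS available:", torch.backends.mps.is_available())

# Try allocating a single bfloat16 tensor on MPS; on an unsupported setup
# this should raise the same "BFloat16 is not supported on MPS" error.
try:
    torch.zeros(1, dtype=torch.bfloat16, device="mps")
    print("bfloat16 on MPS: OK")
except Exception as e:
    print("bfloat16 on MPS failed:", e)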

Is there any way I can get past this?
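In the meantime, the only workaround I can think of is to skip MPS entirely and load the model in float32 on the CPU. This is an untested sketch (keeping the rest of the example unchanged), and a 9B-parameter model in float32 needs roughly 36 GB of RAM, so it may not be practical:

from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained("THUDM/LongWriter-glm4-9b", trust_remote_code=True)
# Pin every module to the CPU and use float32, which the CPU backend supports,
# instead of letting device_map="auto" place the weights on MPS in bfloat16.
model = AutoModelForCausalLM.from_pretrained(
    "THUDM/LongWriter-glm4-9b",
    torch_dtype=torch.float32,
    trust_remote_code=True,
    device_map={"": "cpu"},
)
model = model.eval()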

Below is the full traceback of the error:

Traceback (most recent call last):
  File "/Users/hongquan2512/Downloads/LongWriter-main/THUDM/LongWriter-glm4-9b/demo.py", line 4, in <module>
    model = AutoModelForCausalLM.from_pretrained("THUDM/LongWriter-glm4-9b", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hongquan2512/miniforge3/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 559, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hongquan2512/miniforge3/lib/python3.12/site-packages/transformers/modeling_utils.py", line 4225, in from_pretrained
    ) = cls._load_pretrained_model(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hongquan2512/miniforge3/lib/python3.12/site-packages/transformers/modeling_utils.py", line 4728, in _load_pretrained_model
    new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
                                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hongquan2512/miniforge3/lib/python3.12/site-packages/transformers/modeling_utils.py", line 993, in _load_state_dict_into_meta_model
    set_module_tensor_to_device(model, param_name, param_device, **set_module_kwargs)
  File "/Users/hongquan2512/miniforge3/lib/python3.12/site-packages/accelerate/utils/modeling.py", line 329, in set_module_tensor_to_device
    new_value = value.to(device)
                ^^^^^^^^^^^^^^^^
TypeError: BFloat16 is not supported on MPS

Kind regards,
