Error with Anaconda and PyCharm

#205
by yiwens - opened

Hi,

I'm running PyCharm with the conda interpreter, and I have CUDA installed and working properly. In the conda environment I installed the PyTorch, Hugging Face Accelerate, and Transformers packages, and my access token works as well.

Then I try to run the default example from the https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct page, and I get the following error:
File "C:\Users\ethan\anaconda3\envs\CUDA\Lib\site-packages\transformers\pipelines\text_generation.py", line 233, in preprocess
prefix + prompt_text,
~~~~~~~^~~~~~~~~~~~~
TypeError: can only concatenate str (not "dict") to str

Can anyone help?

Thanks,
Ethan

The TypeError happens because the pipeline's preprocess step received a dict where it expected a string (prefix + prompt_text), which typically means the installed transformers version does not yet accept the chat messages list directly. One workaround is to build the prompt string yourself:

import torch
from transformers import pipeline

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"

pipe = pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},  # load the weights in bfloat16
    device=0,  # GPU index 0; use "cpu" to run on the CPU instead
)

messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]

# Combine messages into a single string prompt
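# NOTE: this plain "role: content" join is only a quick workaround; it does not
# apply the Llama 3 chat template (see the alternative sketch after this block).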
prompt = "\n".join([f"{msg['role']}: {msg['content']}" for msg in messages])

# Get the EOS token ID
eos_token_id = pipe.tokenizer.eos_token_id

outputs = pipe(
    prompt,
    max_new_tokens=256,
    eos_token_id=eos_token_id,
    do_sample=True,
    temperature=0.6,
    top_p=0.9,
)

assistant_response = outputs[0]["generated_text"]
print(assistant_response)

Could you please try the above code?
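If upgrading transformers is an option, the manual join above isn't needed at all: newer releases let the text-generation pipeline take the messages list directly, and the tokenizer can build a properly formatted Llama 3 prompt via its chat template. A minimal sketch reusing the pipe and messages objects defined above, assuming a release recent enough to support chat inputs and apply_chat_template (exact minimum version not checked here):

# Option 1: pass the chat messages straight to the pipeline (newer transformers only).
chat_outputs = pipe(
    messages,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.6,
    top_p=0.9,
)
# With chat input, generated_text holds the conversation, ending with the assistant's reply.
print(chat_outputs[0]["generated_text"])

# Option 2: let the tokenizer apply the Llama 3 chat template, then generate from the string.
templated_prompt = pipe.tokenizer.apply_chat_template(
    messages,
    tokenize=False,              # return the formatted prompt as a string
    add_generation_prompt=True,  # append the assistant header so the model answers next
)
templated_outputs = pipe(
    templated_prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.6,
    top_p=0.9,
)
# With a string prompt, generated_text includes the prompt followed by the model's reply.
print(templated_outputs[0]["generated_text"])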
