Garbled replies when using vLLM

#11 opened by CRebellion

'''
Processed prompts: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 2.83it/s, est. speed input: 59.49 toks/s, output: 45.32 toks/s]Prompt: '<|im_start|>system\n你是一个智能机器人<|im_end|>\n<|im_start|>user\n你好<|im_end|>\n<|im_start|>assistant\n', Generated text: '致致致致致致致致致致致致致致致致'
'''

Code used to instantiate vLLM:

'''
from vllm import LLM, SamplingParams

sampling_params = SamplingParams(temperature=0.1, top_p=0.95)
model = LLM(model=model_path, tokenizer=model_path, trust_remote_code=True, tensor_parallel_size=3, gpu_memory_utilization=0.9)
'''
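
For context, here is a minimal sketch of how the generation call that produced the log above is typically written, assuming the `<|im_start|>`-style prompt in the log was built with the tokenizer's chat template (`model_path`, `model`, and `sampling_params` refer to the snippet above; everything else is illustrative):

'''
from transformers import AutoTokenizer

# Assumption: model and sampling_params are the objects created in the snippet above.
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

messages = [
    {"role": "system", "content": "你是一个智能机器人"},
    {"role": "user", "content": "你好"},
]
# Build the chat-formatted prompt shown in the log from the model's chat template.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

outputs = model.generate([prompt], sampling_params)
print(outputs[0].outputs[0].text)
'''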
