Garbled output in responses when using vLLM
#11 opened 2 months ago by CRebellion
The tokenizer appends the default eos_token_id, while the build_inputs function adds an eos_token as well!
#9 opened 9 months ago by suzhu001
Can I load weights of InternLM into LlamaForCausalLM?
#8 opened 9 months ago by czczup
The prompt still seems to have some issues
#6 opened 10 months ago by dafen
Prompt template?
#4 opened 10 months ago by Yhyu13