vLLM does not support Qwen2ForRewardModel
#4
by vinf - opened
I get the error `Model architectures ['Qwen2ForRewardModel'] are not supported for now`. How can I solve this? Any help would be appreciated.
Sorry, but vLLM does not support this architecture at the moment. We hope it will be supported in the future.
Zhenru changed discussion status to closed
Is there any other way to deploy this model besides using transformers?
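Until vLLM adds support, plain transformers remains the most direct option. Below is a minimal, hedged sketch of scoring a conversation with a Qwen2 reward model via `AutoModel` with `trust_remote_code=True` (the `Qwen2ForRewardModel` class ships with the checkpoint, not with transformers). The model ID, dtype, and the assumption that the reward is the first element of the model output are placeholders; check the model card for the exact usage.

```python
# Sketch: running a Qwen2 reward model with transformers, since vLLM
# rejects the Qwen2ForRewardModel architecture. Names below are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "Qwen/Qwen2.5-Math-RM-72B"  # assumption: replace with your checkpoint


def load_reward_model(model_id: str = MODEL_ID):
    """Load tokenizer and reward model; trust_remote_code is required
    because the model class is defined in the checkpoint repository."""
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModel.from_pretrained(
        model_id,
        device_map="auto",
        torch_dtype=torch.bfloat16,
        trust_remote_code=True,
    )
    return tokenizer, model


def score(tokenizer, model, messages):
    """Return a scalar reward for one chat transcript."""
    text = tokenizer.apply_chat_template(messages, tokenize=False)
    inputs = tokenizer(text, return_tensors="pt").to(model.device)
    with torch.no_grad():
        outputs = model(**inputs)
    # assumption: the reward score is the first element of the model output
    return outputs[0].item()


if __name__ == "__main__":
    tok, mdl = load_reward_model()
    msgs = [
        {"role": "user", "content": "What is 2 + 2?"},
        {"role": "assistant", "content": "2 + 2 = 4."},
    ]
    print(score(tok, mdl, msgs))
```

For serving rather than one-off scoring, this loading code could be wrapped in a small HTTP service (e.g. FastAPI), keeping the model resident in memory between requests; that avoids vLLM entirely at the cost of its batching and throughput optimizations.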