Converting back to Mistral/vLLM format

#4
by RonanMcGovern - opened

Many thanks for having made this version.

Is there a way to convert back to the mistral/vllm format?

The reason I ask is that I have a fine-tuned transformers model that I want to serve with vLLM. Thanks.

Any update on this? We're very interested too.

Unofficial Mistral Community org

I wrote this script, which seems to work: https://github.com/spring-anth/transform_pixtral/blob/main/convert_hf_transformers_pixtral_model_to_vllm_compatible_version.py. With it I can now host my fine-tuned model with vLLM. If you find any mistakes or have improvement suggestions, please let me know :)
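For anyone who wants the general idea without reading the full script: the conversion is essentially a rename of the checkpoint's parameter keys from the transformers naming scheme to the Mistral/vLLM one. Here is a minimal sketch of that remapping step. Note that the rule table below is illustrative only -- the actual key names and mappings must be taken from the linked script or from the two checkpoint formats themselves, and `hf_key_to_mistral` is a hypothetical helper name.

```python
import re

# Hypothetical remapping rules (pattern -> replacement). These example key
# names are assumptions for illustration; the real table comes from comparing
# the transformers and Mistral checkpoints, as the linked script does.
KEY_RULES = [
    (r"^language_model\.model\.layers\.(\d+)\.self_attn\.q_proj\.",
     r"layers.\1.attention.wq."),
    (r"^language_model\.model\.layers\.(\d+)\.self_attn\.k_proj\.",
     r"layers.\1.attention.wk."),
    (r"^language_model\.model\.embed_tokens\.",
     r"tok_embeddings."),
]

def hf_key_to_mistral(key: str) -> str:
    """Rename one transformers-style key to the (assumed) Mistral-style key.

    Keys that match no rule are passed through unchanged.
    """
    for pattern, replacement in KEY_RULES:
        if re.match(pattern, key):
            return re.sub(pattern, replacement, key)
    return key

# In the real conversion you would load the safetensors state dict, apply a
# renamer like this to every key, and save the result as
# consolidated.safetensors alongside the Mistral-format params/config files.
renamed = hf_key_to_mistral("language_model.model.layers.0.self_attn.q_proj.weight")
```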
