Could you tell me how to convert a Phi-3 model (safetensors) to ONNX?
Hi!
I am a researcher at UW, and I have a model fine-tuned from Phi-3-mini-128k-instruct. How can I convert it to ONNX?
Your help would mean a lot to me.
Thanks!
You can use ONNX Runtime GenAI's model builder to quickly convert your fine-tuned Phi-3-mini-128k-instruct
model into an optimized and quantized ONNX model. This example should work for your scenario.
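For reference, here is a minimal sketch of invoking the model builder on a local fine-tuned checkpoint. The folder names, precision, and execution provider are placeholders/examples, and the flag names reflect the model builder README at the time of writing, so double-check them against your installed onnxruntime-genai version:

```python
# A minimal sketch, assuming onnxruntime-genai is installed and the fine-tuned
# checkpoint (config.json + *.safetensors) lives in ./phi3-mini-128k-finetuned.
import subprocess
import sys

subprocess.run(
    [
        sys.executable, "-m", "onnxruntime_genai.models.builder",
        "-i", "./phi3-mini-128k-finetuned",  # input: local Hugging Face checkpoint
        "-o", "./phi3-mini-128k-onnx",       # output: folder for the ONNX model
        "-p", "int4",                        # precision, e.g. int4, fp16, fp32
        "-e", "cpu",                         # execution provider, e.g. cpu, cuda
    ],
    check=True,
)
```

The output folder should contain the ONNX model along with the tokenizer files and a genai_config.json so it can be run with ONNX Runtime GenAI.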
Hi @kvaishnavi,
Thank you for your help!
What about more than two safetensors files? How do I convert the many .safetensors shards split from one large model into a final ONNX file?
Thanks!
If your fine-tuned model can be loaded with Hugging Face's AutoModelForCausalLM.from_pretrained
method, then the model builder can produce the final ONNX model from any number of .safetensors
files.
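As a quick sanity check (the folder path below is a placeholder), you can verify that the sharded checkpoint loads before running the builder:

```python
# If this load succeeds, the model builder can consume the same folder,
# no matter how many .safetensors shards the weights are split into.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "./phi3-mini-128k-finetuned",  # folder with config.json and sharded *.safetensors
    trust_remote_code=True,        # Phi-3 checkpoints may ship custom modeling code
)
print(type(model).__name__)
```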
It does work!
Thank you very much for your quick feedback!
Hi,
BTW, in a real application, which one should I use: model.onnx or model.onnx.data? What's the difference between them?
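For context: model.onnx holds the graph and model.onnx.data holds the weights stored as ONNX external data, so the two files travel together. A minimal sketch, assuming plain ONNX Runtime and placeholder paths; you point the runtime at model.onnx and keep the .data file next to it:

```python
# model.onnx holds the graph; model.onnx.data holds the weights as external data.
# Load model.onnx and keep the .data file in the same folder; the runtime
# resolves the external-data file from that folder automatically.
import onnxruntime as ort

session = ort.InferenceSession("./phi3-mini-128k-onnx/model.onnx")
print([i.name for i in session.get_inputs()])
```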
I get it!
Thank you again!