Inference from finetuned model
I used your notebook to fine-tune the model, but I don't know how to use the trained checkpoint for inference.
I would appreciate any help from you.
Thanks
Try using the adapter model, as shown here.
Here is the adapter model card.
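The general pattern looks something like this (a minimal sketch; the model and adapter names below are only placeholders, swap in the ones from the notebook):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Placeholder names -- replace with the base model and adapter used in the notebook
base_model_id = "bigscience/bloom-560m"
adapter_id = "your-username/your-adapter"

base_model = AutoModelForCausalLM.from_pretrained(base_model_id)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attaches the PEFT/LoRA weights
tokenizer = AutoTokenizer.from_pretrained(base_model_id)

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```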
Thanks Ngadou. I want to try loading the model from a local file using the PATH variable, but I receive an error:
ValueError: Can't find weights for ./checkpoint-500 in ./checkpoint-500 or in the Hugging Face Hub. Please check that the file adapter_model.bin is present at ./checkpoint-500.
./checkpoint-500 is the directory where the model files are stored after training; I have uploaded an image of the files.
Do I need to upload the files to a Hugging Face space, or can I use them locally?
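For reference, the loading call looks roughly like this (simplified; "base-model-name" is just a placeholder for the model from the notebook):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

PATH = "./checkpoint-500"

base_model = AutoModelForCausalLM.from_pretrained("base-model-name")  # placeholder
model = PeftModel.from_pretrained(base_model, PATH)  # this is the call that raises the ValueError
```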
I'm not really sure whether you are trying to load a particular checkpoint from your training. Please refer to the links in my previous comment to see how I used a PEFT adapter model.
As for the weights, I think you get them from the base_model.
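Something like this should work, assuming the folder you point to actually contains `adapter_config.json` and `adapter_model.bin` (if it doesn't, saving the trained PEFT model with `save_pretrained` should produce them):

```python
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

PATH = "./checkpoint-500"  # must contain adapter_config.json and adapter_model.bin
# If those files are missing, save the trained adapter first, e.g.:
#   trainer.model.save_pretrained(PATH)

peft_config = PeftConfig.from_pretrained(PATH)

# The weights themselves come from the base model; the adapter only stores the fine-tuned deltas
base_model = AutoModelForCausalLM.from_pretrained(peft_config.base_model_name_or_path)
tokenizer = AutoTokenizer.from_pretrained(peft_config.base_model_name_or_path)

model = PeftModel.from_pretrained(base_model, PATH)
model.eval()
```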
Thanks, I had the directory messed up.
However, I fine-tune the model on Colab but can't run inference on my local machine; it gives a matrix multiplication error (mat1 x mat2 dimensions don't match).
Have you ever encountered anything like this?
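I'm wondering whether the adapter simply doesn't match the base model I load locally. This is roughly what I'm planning to check (just a guess at what matters):

```python
from peft import PeftConfig
from transformers import AutoConfig

PATH = "./checkpoint-500"  # local adapter directory

peft_config = PeftConfig.from_pretrained(PATH)
base_config = AutoConfig.from_pretrained(peft_config.base_model_name_or_path)

# If these don't match the base model actually loaded locally, the adapter matrices
# would have the wrong shapes, which could explain the mat1 x mat2 error.
print("adapter was trained on:", peft_config.base_model_name_or_path)
print("base model hidden size:", getattr(base_config, "hidden_size", "n/a"))
```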