Code used to generate the model

#10
by Proryanator - opened

Hey there! I've been trying to convert the same base model to CoreML using exporters (without any custom configs). I can convert the model, but I'm not able to use it.

I used this:

python -m exporters.coreml --model=meta-llama/Llama-2-7b-chat-hf llama-2-7b-chat-hf.mlpackage --feature=text-generation

My guess, after comparing my generated model against this one, is that the model's output may be the issue here. My generated model shows this as the output:

MultiArray (Float32 1 × 128 × 32000)

Whereas this one shows:

MultiArray (Float32)

Does this matter much? I'm really curious how you converted this model, since I'd like to reproduce it for my own projects and other models.
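For context, here's a minimal sketch of how a fixed-shape (1 × 128 × 32000) logits output would typically be consumed for text generation: since the sequence dimension is padded to 128, you read the logits at the last real token position and take the argmax. Everything here (the prompt length, the token id) is made up for illustration; a stand-in array replaces the actual CoreML model output.

```python
import numpy as np

# Stand-in for the model's fixed-shape output: (batch, seq_len, vocab)
batch, seq_len, vocab = 1, 128, 32000
logits = np.zeros((batch, seq_len, vocab), dtype=np.float32)

# Hypothetical: 5 real tokens were fed in; the remaining positions are padding.
prompt_len = 5

# Pretend the model assigned high probability to token id 1234
# at the last real position.
logits[0, prompt_len - 1, 1234] = 10.0

# Next-token prediction = argmax over the vocab at the last real position,
# not at index seq_len - 1 (which would land in the padding).
next_token = int(np.argmax(logits[0, prompt_len - 1]))
print(next_token)  # 1234
```

So a fixed 1 × 128 × 32000 output is still usable for greedy decoding, but you have to track the real prompt length yourself and re-run the model as the sequence grows, whereas a flexible-shape output ("MultiArray (Float32)" with no fixed dimensions) avoids the padding bookkeeping.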
