Transformers doesn't support it yet?
Getting this error while trying to make a Space for it.
ValueError: The checkpoint you are trying to load has model type `bunny-qwen` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
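For context, a minimal loading sketch that hits this error (the model id is a placeholder):

```python
# Minimal sketch; "owner/the-model" is a placeholder model id.
# trust_remote_code=True lets Transformers run custom model code shipped in
# the repo, but the repo still has to ship those files and point to them
# via "auto_map" in config.json — otherwise this ValueError is raised.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "owner/the-model",  # placeholder
    trust_remote_code=True,
)
```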
Files missing? Bunny is from https://github.com/BAAI-DCAI/Bunny, and shouldn't it be bunny-qwen2, like in the Bunny Qwen models? But it's customized, I guess, so ....
It probably needs its own version of these:
```json
],
"auto_map": {
  "AutoConfig": "configuration_llava_qwen2.LlavaQwen2Config",
  "AutoModelForCausalLM": "modeling_llava_qwen2.LlavaQwen2ForCausalLM"
},
```
Dolphin-Vision-72B has them, and that seems to work fine.
I added a copy to the model myself, and it will now load into memory. It just won't run inference.
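Roughly this kind of minimal text-only call is what I mean, assuming the remote code loads (the model id is a placeholder; the real model is multimodal and would need its image pipeline on top):

```python
# Rough smoke test, assuming the checkpoint loads via remote code.
# "owner/bunny-model" is a placeholder id; real Bunny use also needs images.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "owner/bunny-model"  # placeholder
tok = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype=torch.float16,
    device_map="auto",
)

inputs = tok("Hello", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=20)
print(tok.decode(out[0], skip_special_tokens=True))
```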
It's bunny, not llava.
It needs the same kind of files, just for bunny instead. Those files give Transformers the info it needs to start without throwing the unknown-model-type error, which this model does throw, I think because it lacks similar files.
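As a sketch, the same wiring can also be done explicitly in code instead of through config.json. The bunny-side module and class names below are my assumptions, mirroring the llava_qwen2 naming above; the real files would have to come from the model repo:

```python
# Explicit registration, equivalent to what the "auto_map" entries provide.
# The bunny module/class names are hypothetical, patterned on llava_qwen2.
from transformers import AutoConfig, AutoModelForCausalLM

from configuration_bunny_qwen import BunnyQwenConfig  # hypothetical module
from modeling_bunny_qwen import BunnyQwenForCausalLM  # hypothetical module

AutoConfig.register("bunny-qwen", BunnyQwenConfig)
AutoModelForCausalLM.register(BunnyQwenConfig, BunnyQwenForCausalLM)
```

With that in place, AutoModelForCausalLM.from_pretrained would recognize the bunny-qwen model type instead of raising the unknown-architecture error.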
Something else interesting: the llava config files only mention 7B even though they're in the 72B repo, which is confusing.
Edit: in case it's confusing, this is the account I use on mobile. I'm also @saishf.
I'm getting the same error: "The checkpoint you are trying to load has model type bunny-qwen but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date."
I updated the files.