loading as llama model
#4
opened by KnutJaegersberg
It seems to load as DeciLMForCausalLM. Is it possible to load it as a Llama model for easy compatibility?
Sorry, our custom architecture is not supported by modeling_llama.py.
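Since the repository ships its own architecture rather than reusing modeling_llama.py, the usual route is to let transformers pull in the model's custom modeling code. A minimal sketch, assuming a transformers version with remote-code support (the repo id below is illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code=True tells transformers to use the modeling code
# bundled with the Hub repo (e.g. DeciLMForCausalLM) instead of trying
# to map the checkpoint onto a built-in class like LlamaForCausalLM.
repo_id = "Deci/DeciLM-6b"  # assumed repo id for illustration

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)
```

Tools that only accept the built-in Llama classes will still not load the checkpoint this way; they need explicit support for the custom architecture.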
KnutJaegersberg changed discussion status to closed