
adapter config issue?

#1
by jonabur - opened

The adapter config lists the base model as:
"base_model_name_or_path": "meta-llama/Llama-2-7b",
which doesn't work for me, because that checkpoint isn't in HF (Transformers) format.

But your instructions refer to meta-llama/Llama-2-7b-hf.

Should the adapter config be updated to meta-llama/Llama-2-7b-hf?
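As a local workaround while waiting for the fix, one can patch the field in a downloaded copy of `adapter_config.json`. This is a minimal sketch assuming a local file path; the config contents shown are abbreviated, not the repo's full config.

```python
import json
import os
import tempfile

# Hypothetical local copy of the adapter config (abbreviated for illustration).
cfg = {"base_model_name_or_path": "meta-llama/Llama-2-7b", "peft_type": "LORA"}

path = os.path.join(tempfile.mkdtemp(), "adapter_config.json")
with open(path, "w") as f:
    json.dump(cfg, f)

# Repoint the adapter at the HF-format base checkpoint.
with open(path) as f:
    cfg = json.load(f)
cfg["base_model_name_or_path"] = "meta-llama/Llama-2-7b-hf"
with open(path, "w") as f:
    json.dump(cfg, f, indent=2)

print(cfg["base_model_name_or_path"])
```

PEFT reads `base_model_name_or_path` when resolving the base model, so editing the local JSON avoids the non-HF checkpoint lookup.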

MaLA-LM org

Apologies for the typo. The adapter config is now updated. Thanks.

jisx changed discussion status to closed
