Update modeling_mpt.py
Added `inputs_embeds` parameter.
Hi, `inputs_embeds` is not implemented inside the modeling code, so it is not sufficient to just add the parameter. Could you please explain why you need this parameter?
@daking So that the model can be trained using LoRA; PEFT is not able to run without it.
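For context, the reason PEFT needs this parameter: Hugging Face-style models conventionally accept either `input_ids` or precomputed `inputs_embeds`, and prompt-tuning methods in PEFT inject learned embeddings through the latter path. A minimal sketch of that convention (a toy illustration, not the actual MPT or HF code; `TinyModel` and its embedding table are made up):

```python
# Hypothetical sketch of the usual inputs_embeds convention:
# the caller supplies exactly one of input_ids or inputs_embeds.

class TinyModel:
    def __init__(self, vocab_size):
        # Toy embedding table: token id -> 2-d vector (list of floats).
        self.wte = {i: [float(i), float(i) * 2] for i in range(vocab_size)}

    def forward(self, input_ids=None, inputs_embeds=None):
        # Exactly one of the two inputs must be provided.
        if (input_ids is None) == (inputs_embeds is None):
            raise ValueError("Specify exactly one of input_ids or inputs_embeds")
        if inputs_embeds is None:
            # Normal path: look token ids up in the embedding table.
            inputs_embeds = [self.wte[i] for i in input_ids]
        # The rest of the transformer would consume inputs_embeds here;
        # PEFT-style methods bypass the lookup by passing embeddings directly.
        return inputs_embeds

model = TinyModel(vocab_size=4)
model.forward(input_ids=[0, 2])            # uses the embedding table
model.forward(inputs_embeds=[[9.0, 9.0]])  # skips the lookup entirely
```

The key point is that simply adding the argument is not enough: the forward pass has to branch on it before the embedding lookup, which is the part the maintainer notes is missing.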
Also, I have a question that is not related to this issue: is mosaicml/mpt-7b the same as togethercomputer/RedPajama-INCITE-Base-7B-v0.1, or what are the differences between the two? Edit: mpt-7b is better, but I'm not sure how to train it with LoRA since that isn't supported. It would be good to add support.
They are completely different models. mosaicml/mpt-7b is trained and released by MosaicML, and togethercomputer/RedPajama-INCITE-Base-7B-v0.1 is trained and released by togethercomputer. Could you please move the PEFT/LoRA discussion and any issues you have to https://github.com/mosaicml/llm-foundry/issues/64?