pr_1.5b_500 / generation_config.json
mogaio · Upload MixFormerSequentialForCausalLM (#1) · commit efd1bce
69 Bytes
{
  "_from_model_config": true,
  "transformers_version": "4.35.0"
}
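
A minimal sketch of how a file like this is typically consumed with the transformers library. The repo id "mogaio/pr_1.5b_500" is an assumption inferred from the path above, not confirmed; the "_from_model_config" flag generally indicates the generation defaults were derived from the model's config rather than set explicitly.

# load_generation_config.py -- illustrative sketch, repo id is assumed
from transformers import GenerationConfig

# Load generation_config.json from the Hub (repo id inferred from the path above).
gen_config = GenerationConfig.from_pretrained("mogaio/pr_1.5b_500")

# Or parse a local copy of the JSON shown above from the current directory.
local_config = GenerationConfig.from_pretrained(".", config_file_name="generation_config.json")

# Inspect the parsed values, e.g. transformers_version == "4.35.0".
print(gen_config.to_dict())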