Adding `safetensors` variant of this model
This is an automated PR created with https://huggingface.co/spaces/safetensors/convert
This new file is equivalent to pytorch_model.bin
but safe in the sense that
no arbitrary code can be put into it.
These files also happen to load much faster than their PyTorch counterparts:
https://colab.research.google.com/github/huggingface/notebooks/blob/main/safetensors_doc/en/speed.ipynb
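For context on the safety claim: a .bin checkpoint is a pickle, which can execute arbitrary code on load, while a safetensors file is just a JSON header plus raw tensor bytes. A minimal stdlib-only sketch of the layout (the tensor name here is made up for the demo):

```python
import json
import struct

# Hand-build a minimal .safetensors payload in memory: an 8-byte
# little-endian header length, a JSON header describing each tensor
# (dtype, shape, byte offsets into the data section), then the raw
# tensor bytes. There is no pickle step, so nothing executable can
# hide in the file.
header = {"wte.weight": {"dtype": "F32", "shape": [2, 2],
                         "data_offsets": [0, 16]}}
header_bytes = json.dumps(header).encode("utf-8")
data = struct.pack("<4f", 1.0, 2.0, 3.0, 4.0)  # four float32 values
blob = struct.pack("<Q", len(header_bytes)) + header_bytes + data

# Reading the metadata back needs only json + struct -- no code execution.
(n,) = struct.unpack("<Q", blob[:8])
parsed = json.loads(blob[8:8 + n].decode("utf-8"))
print(parsed["wte.weight"]["shape"])  # → [2, 2]
```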
The widgets on your model page will run using this file even before this PR is merged,
making sure the file actually works.
If you find any issues, please report them here: https://huggingface.co/spaces/safetensors/convert/discussions
Feel free to ignore this PR.
Friendly ping to @lysandre and @joaogante. Is it safe to merge this PR?
@osanseviero This is our own bot, so it should be! I see no limitations of gpt2 + safetensors from the transformers side.
Verified that the two checkpoints contained the same layers with identical values, merging!
The safetensors Space also checks this, but for super widely used checkpoints like this one I find it important to double check 😀
In particular an inference test is super important, as safetensors tied weights are not managed the same way as PyTorch's bin weights.
See #26292 and #26422 which were necessary after merging safetensors weights.
If your script would have prevented these from happening, I'd love to automate it!
Thanks