Siglip: add _no_split_modules
#20 opened by eddtsoi
No description provided.
Adding _no_split_modules so the model can be loaded with device_map="auto".
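For context, the change would look roughly like the sketch below; the exact class body and module list are assumptions here, not the final diff.

# Sketch of the kind of change made in modeling_siglip.py: declare which
# submodules must never be split across devices, so that accelerate can
# build a valid device map when device_map="auto" is used.
from transformers import PreTrainedModel, SiglipConfig

class SiglipPreTrainedModel(PreTrainedModel):
    config_class = SiglipConfig
    base_model_prefix = "siglip"
    # Assumed module names; the actual diff may list a different set.
    _no_split_modules = ["SiglipTextEmbeddings", "SiglipEncoderLayer"]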
Thanks for your contribution, but maybe it's not necessary to add _no_split_modules in the model code. Users can set no_split_module_classes in load_checkpoint_and_dispatch, like:
import torch
from accelerate import load_checkpoint_and_dispatch
model = load_checkpoint_and_dispatch(model, model_name, dtype=torch.bfloat16,
    device_map="auto", no_split_module_classes=['SiglipTextEmbeddings', 'SiglipEncoderLayer'])
@finalf0 Thank you for your comment!
I was trying to deploy it on AWS SageMaker with DJL, which was configured to load the model without the no_split_module_classes option. Let me know if I missed any details.
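For comparison, a managed runtime typically loads the model along these lines (the model id and dtype below are placeholders); splitting only works here if _no_split_modules is already defined on the model class, since there is no hook to pass no_split_module_classes through:

import torch
from transformers import AutoModel

# Placeholder checkpoint; the runtime only exposes device_map="auto"
# and has no option to forward no_split_module_classes to accelerate.
model = AutoModel.from_pretrained(
    "google/siglip-base-patch16-224",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)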
I checked the Siglip implementation in transformers 4.44.2; they also added these properties. Maybe importing them from the library is a better way?
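If upgrading is an option, one quick check is to read the attribute off the released class (assuming transformers>=4.44.2 is installed):

from transformers import SiglipModel

# If this prints a non-empty list, the upstream class already defines
# _no_split_modules and device_map="auto" should work without this patch.
print(SiglipModel._no_split_modules)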