Ship modeling code without `einops` and open a PR to upstream modeling code to be loaded without `trust_remote_code`

#16
by michaelfeil - opened

Hey there,

I'm not happy about the extra dependencies that have to ship with `trust_remote_code` modeling code.

Please upstream your modeling code into the transformers library so that it no longer needs to be loaded via `trust_remote_code`. Arguably, `einops` is not really needed and can be replaced with default PyTorch ops.
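For context, the kind of replacement being asked for is usually mechanical. Here is a hedged sketch (the actual patterns in the Nomic modeling file may differ; `split_heads` and the shapes below are illustrative assumptions) showing how a typical attention-style `einops` rearrange maps onto plain `view`/`transpose` calls:

```python
import torch

# Hypothetical example: with einops this head-split would be
#   rearrange(x, "b s (h d) -> b h s d", h=num_heads)
# The same result with default PyTorch ops:
def split_heads(x: torch.Tensor, num_heads: int) -> torch.Tensor:
    b, s, hidden = x.shape
    head_dim = hidden // num_heads
    # view splits the hidden dim into (heads, head_dim);
    # transpose swaps the seq and head axes to get (b, h, s, d)
    return x.view(b, s, num_heads, head_dim).transpose(1, 2)

x = torch.randn(2, 16, 64)          # (batch, seq, hidden)
out = split_heads(x, num_heads=8)   # (batch, heads, seq, head_dim)
print(out.shape)                    # torch.Size([2, 8, 16, 8])
```

Each `rearrange`/`repeat` call in the modeling file would need an equivalent translation like this, which is tedious but does not change behavior.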

Thanks for understanding!

https://github.com/michaelfeil/infinity/issues/197
https://github.com/michaelfeil/infinity/issues/185

Nomic AI org

I'll see if I have time this week for the transformers upstreaming, but if you have a quick PR for it, we'd love contributions to remove the einops dependency :) https://github.com/nomic-ai/contrastors/blob/main/src/contrastors/models/huggingface/modeling_hf_nomic_bert.py

Is this related to the fact that I cannot deploy it on a dedicated inference endpoint?
I get the following error when doing so:

(screenshot of the error attached)

Nomic AI org
edited May 21

I don't have any experience with deploying to a dedicated endpoint via Huggingface but it looks like that might be the case?

If you would like a dedicated endpoint, we do have two options:

Let me know if these fit your needs! Happy to chat as well

@zpn This is hilarious - I am the maintainer of an open-source repo, and you are simply refusing to write compatible modeling code (for convenience). I understand that you are looking into monetization strategies, but an issue thread is the wrong place for that. Unless this is fixed, I will start to discourage usage of Nomic's models.
Best Michael Feil

Nomic AI org

Hey @michaelfeil , I'm sorry you feel this way. To add some more context: I am the sole maintainer of the package and have been heads down on new features. I am not refusing to write the code; I simply haven't had time to get to it yet, and it fell lower on my priority list.

As always, you are welcome to make the PR yourself as well if you don't want to wait for me to get around to this! The codebase and Transformers are both open-source.

Please bear with me.

Sure, I'm just trying to pass along feedback from frustrated devs who end up spending a decent amount of weekend time fixing modeling code!

Nomic AI org

@michaelfeil please don't be mean to folks maintaining open-source software - we developed (and open-sourced) our own training libraries and architecture implementations to make Nomic Embed possible, so unfortunately there's quite a bit of work involved in integrating it nicely with Hugging Face at every possible touchpoint.

You should channel your energy into a pull request to fix it rather than berating the team in public channels about it not working out of the box for your integrations into third-party libraries like infinity. In fact, I'll even review the PR myself!

:)

michaelfeil changed discussion status to closed

Understood, closing.

Hey all, just wanted to let you know that starting from 0.0.37, `einops` is an optional dependency in infinity.

michaelfeil changed discussion status to open
michaelfeil changed discussion status to closed
Nomic AI org

What precisely is blocking use in infinity of the remote code past einops support? If you can outline a path to get it working there, we'll see what we can do on our end!

andriym changed discussion status to open

No, I fixed it entirely - you should be able to run it now.

zpn changed discussion status to closed
