Hey everyone 🤗! We (finegrain) have created some custom ComfyUI nodes to use our refiners micro-framework inside comfy! 🎉
We only support our new Box Segmenter at the moment, but we're thinking of adding more nodes since there seems to be demand for them. We leverage the new (beta) Comfy Registry to host our nodes; they are available at https://registry.comfy.org/publishers/finegrain/nodes/comfyui-refiners. You can install them by running:
comfy node registry-install comfyui-refiners
Or by downloading the archive (click "Download Latest") and unzipping it into your ComfyUI custom_nodes folder. We are eager to hear your feedback, your suggestions for new nodes, and how you'll use them! 🙏
We (finegrain) trained this new model in partnership with Nfinite, using some of their synthetic data, and the resulting model is remarkably accurate 🚀. It's all open source under the MIT license (finegrain/finegrain-box-segmenter), complete with a test set tailored for e-commerce (finegrain/finegrain-product-masks-lite). Have fun experimenting with it!
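If you want to poke at the weights and the test set outside of Comfy, here's a minimal sketch using huggingface_hub and datasets. The weights filename below is an assumption (list the repo files to confirm it), and actual inference goes through Refiners or the nodes above.

```python
# Minimal sketch: fetch the Box Segmenter weights and the e-commerce test set
# from the Hugging Face Hub. Inference itself is done through Refiners
# (or the ComfyUI nodes above).
from huggingface_hub import hf_hub_download
from datasets import load_dataset

weights_path = hf_hub_download(
    repo_id="finegrain/finegrain-box-segmenter",
    filename="model.safetensors",  # assumption: check the repo for the real filename
)
print(f"weights downloaded to {weights_path}")

# Load the product-masks test set and inspect its splits / columns.
masks = load_dataset("finegrain/finegrain-product-masks-lite")
print(masks)
```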
Under the hood, it's a pipeline of models (currently exposed via an API) that lets you erase any object from your image just by naming it or selecting it! Not only will the object disappear, but so will its effects on the scene, like shadows and reflections. It's built on top of Refiners, our micro-framework for simple foundation model adaptation (feel free to star it on GitHub if you like it: https://github.com/finegrain-ai/refiners).
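To give an idea of what "exposed via an API" looks like in practice, here's a purely illustrative sketch of an "erase by name" HTTP call. The endpoint URL, field names, and response format are hypothetical placeholders, not the actual Finegrain API; check the API documentation for the real interface.

```python
# Purely illustrative sketch of calling an "erase by name" style HTTP API.
# Endpoint, fields, and response format are hypothetical placeholders.
import requests

with open("living_room.jpg", "rb") as image_file:
    response = requests.post(
        "https://api.example.com/erase",  # hypothetical endpoint
        files={"image": image_file},
        data={"prompt": "the floor lamp"},  # name the object you want gone
        timeout=120,
    )
response.raise_for_status()

# Assume (hypothetically) the API returns the edited image bytes directly.
with open("living_room_erased.jpg", "wb") as out_file:
    out_file.write(response.content)
```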
Unfortunately, it doesn't look like it can easily be preloaded with models, so you'll have to bring your own (a quick download sketch follows the list). Here is a quick selection of models you can use:
- openai-community/gpt2
- qualcomm/ResNet50
- qualcomm/VIT
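If you need a local copy of one of these first, here's one way to grab a full repo with huggingface_hub; this is just a sketch, so adapt it to however the tool expects models to be provided.

```python
# Sketch: fetch one of the suggested models locally with huggingface_hub.
# snapshot_download pulls every file in the repo; pass allow_patterns to
# narrow it down if you only need specific files.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="openai-community/gpt2")
print(f"model files available under {local_dir}")
```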