Base models
Will you be updating the base models?
Alternatively, what do I change to add some on my own? I found JibMix v5 on here and would like to add it.
Will you be updating the base models?
Come to think of it, I forgot to convert and add models for FLUX. I don't have any particular plans, but I'll add them when I can.
What do I change to add some on my own alternatively?
https://huggingface.co/spaces/John6666/flux-lora-the-explorer/blob/main/env.py#L13
Just add it to the list and you're done. If it's at the top of the list, it will be the default.
If you don't want to add it to the list, you can also enter it directly into the base model selection dropdown in the GUI and it will be loaded.
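For illustration, the entries in env.py are just Hugging Face repo IDs in a plain Python list; adding a model might look like the sketch below (the variable name `models` is an assumption based on the linked file, and the JibMix repo ID is hypothetical):

```python
# env.py (sketch) -- selectable base models; variable name assumed from the linked file
models = [
    "black-forest-labs/FLUX.1-dev",       # first entry becomes the GUI default
    "John6666/jib-mix-flux-v5-fp8-flux",  # hypothetical repo ID for JibMix v5
]
```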
Hmm, they haven't tagged it as diffusers, so I'm getting an error. It's JibMix v5.
Oh... I've updated it and added it, but if there's an error with that, it might be for a different reason.
Sorry, I didn’t see how active you’ve been the last hour. Haven’t tried it yet, but your version looks much more correct.
This space has pretty much everything except inpainting now. Is that hard to implement?
Alternatively, if you specify the file name like this instead of the repo, you can use files that are not in the Diffusers format. At the moment, this isn't possible with GGUF or NF4...
https://huggingface.co/datasets/John6666/flux1-backup-202411/blob/main/thirstTrapGirlTiktok_v10.safetensors
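For context, one way this kind of single-file loading can work in Diffusers (a sketch, not necessarily what the Space does internally; the donor repo ID is a placeholder) is to load the checkpoint as the transformer and borrow the remaining components from a stock Flux repo:

```python
import torch
from diffusers import FluxPipeline, FluxTransformer2DModel

# Sketch: load a non-Diffusers .safetensors checkpoint as the transformer,
# then take the VAE, text encoders, and scheduler from a standard Flux repo.
url = ("https://huggingface.co/datasets/John6666/flux1-backup-202411"
       "/blob/main/thirstTrapGirlTiktok_v10.safetensors")
transformer = FluxTransformer2DModel.from_single_file(url, torch_dtype=torch.bfloat16)
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",  # placeholder donor for the other components
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
```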
inpainting
Was it in the Advanced tab?
There's an input for i2i, yes, but not inpainting with masking.
inpaint with masking
I see. It's late here, so I'll have a look at it tomorrow.
I've turned off ControlNet because the Diffusers implementation is prone to bugs, but if it can be done with plain Inpaint, it should be feasible.
Getting this error after trying some of the new models by adding them to env.py:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 624, in process_events
    response = await route_utils.call_process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 323, in call_process_api
    output = await app.get_blocks().process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 2018, in process_api
    result = await self.call_function(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1567, in call_function
    prediction = await anyio.to_thread.run_sync(  # type: ignore
  File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2441, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 943, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 846, in wrapper
    response = f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 846, in wrapper
    response = f(*args, **kwargs)
  File "/home/user/app/app.py", line 128, in change_base_model
    raise gr.Error(f"Model load Error: {repo_id} {e}") from e
gradio.exceptions.Error: "Model load Error: John6666/acorn-is-spinning-flux-aisfluxdedistilled-fp8-flux Pipeline <class 'diffusers.pipelines.flux.pipeline_flux.FluxPipeline'> expected {'tokenizer', 'vae', 'text_encoder', 'tokenizer_2', 'scheduler', 'transformer', 'text_encoder_2'}, but only {'text_encoder', 'tokenizer_2', 'tokenizer', 'scheduler', 'vae'} were passed."
Thanks for the report. Maybe a different implementation is needed for the De-Distilled version.
That one may no longer be strictly Flux.
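If it helps to narrow this down, a small diagnostic sketch (hypothetical, untested against that repo) is to load the two components the error says are missing one at a time and see which load actually raises:

```python
import torch
from diffusers import FluxTransformer2DModel
from transformers import T5EncoderModel

repo = "John6666/acorn-is-spinning-flux-aisfluxdedistilled-fp8-flux"

# Try each reportedly-missing component on its own to isolate the failure.
for name, cls in [("transformer", FluxTransformer2DModel),
                  ("text_encoder_2", T5EncoderModel)]:
    try:
        cls.from_pretrained(repo, subfolder=name, torch_dtype=torch.bfloat16)
        print(f"{name}: loaded OK")
    except Exception as e:
        print(f"{name}: failed -> {e}")
```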
The inpainting masks don't seem difficult to implement, apart from the time it takes to build the GUI. 😃
Let me know if you have any good inpaint GUI Spaces that would be helpful.
I'm planning to adapt DiffuseCraft's implementation for now; a rough sketch of the inference side is below.
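For what it's worth, the masking side really is simple in Diffusers; a minimal FluxInpaintPipeline sketch (model ID, prompt, and image URLs are placeholders) could look like:

```python
import torch
from diffusers import FluxInpaintPipeline
from diffusers.utils import load_image

pipe = FluxInpaintPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

init_image = load_image("https://example.com/input.png")  # placeholder
mask_image = load_image("https://example.com/mask.png")   # white = area to repaint

result = pipe(
    prompt="a red brick wall",
    image=init_image,
    mask_image=mask_image,
    strength=0.85,             # how strongly the masked region is re-noised
    num_inference_steps=28,
).images[0]
result.save("inpainted.png")
```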
I've only duplicated this one, which has been working fine for its intended purpose. But I'd love to have it with a LoRA gallery like the one here.
Sham786/flux-inpainting-with-lora
I see the original has a runtime error now for some reason.