Negative Prompt in the Inference API

#21
by Ashed00 - opened

I need to know whether we can use a negative prompt with the Inference API. If so, how? Thank you in advance.

I don't know whether we can use a negative prompt in the Inference API, but we can use one through the Hugging Face Spaces API.

The following Space (https://huggingface.co/spaces/cagliostrolab/animagine-xl-3.1) can be used from Python or JavaScript.

Open the link and press "Use via API" at the bottom to see the detailed usage.
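For example, the negative prompt is passed as one of the prediction arguments. Below is a minimal Python sketch using gradio_client; the endpoint name and the argument list are assumptions based on the pattern shown on the "Use via API" page, so copy the exact call from that page.

```python
# Minimal sketch: calling the Space with a prompt and a negative prompt.
# NOTE: the endpoint name and the full argument list are assumptions --
# the Space's "Use via API" page lists the exact parameters in order.
from gradio_client import Client

client = Client("cagliostrolab/animagine-xl-3.1")

result = client.predict(
    "1girl, solo, upper body, smile",                  # prompt
    "lowres, bad anatomy, bad hands, jpeg artifacts",  # negative prompt
    # ...remaining parameters (seed, size, sampler, etc.) as listed
    # on the "Use via API" page...
    api_name="/run",  # assumed endpoint name; confirm on the API page
)
print(result)
```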

Oh, great! However, this error keeps popping up. Can you help with this, @rit3776?

gradio_client.exceptions.AppError: The upstream Gradio app has raised an exception but has not enabled verbose error reporting. To enable, set show_error=True in launch().

Can you show me the full text of the code you executed?
I'm not a professional, but I will help you.

Thank you for responding. However, I have fixed the error. To see the actual error message, I had to pass verbose=True in the Client call. I was getting the error because I had not specified the model type (one of 'Standard v3.0', 'Standard v3.1', 'Light v3.1', 'Heavy v3.1'). Specifying it solved the issue.
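For anyone hitting the same thing, the fix looked roughly like this (a sketch only; the position of the model-type argument is an assumption, so match it against the Space's "Use via API" page):

```python
# Sketch of the fix: enable verbose output on the client and supply the
# model type explicitly instead of leaving it out.
# The position of the model-type argument is an assumption -- check the
# Space's "Use via API" page for the exact signature.
from gradio_client import Client

client = Client("cagliostrolab/animagine-xl-3.1", verbose=True)

result = client.predict(
    "1girl, solo, upper body, smile",                  # prompt
    "lowres, bad anatomy, bad hands, jpeg artifacts",  # negative prompt
    "Standard v3.1",                                   # model type (this was the missing argument)
    # ...remaining parameters as listed on the API page...
    api_name="/run",  # assumed endpoint name
)
```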

Ashed00 changed discussion status to closed
