---
license: openrail++
---

[`latent-consistency/lcm-sdxl`](https://huggingface.co/latent-consistency/lcm-sdxl) compiled on an AWS Inf2 instance. ***INF2/TRN1 ONLY***

***How to use***

```python
from optimum.neuron import NeuronStableDiffusionXLPipeline

# Load the precompiled Neuron artifacts from the Hub
pipe = NeuronStableDiffusionXLPipeline.from_pretrained("Jingya/lcm-sdxl-neuronx")

num_images_per_prompt = 2
prompt = ["a close-up picture of an old man standing in the rain"] * num_images_per_prompt

# LCM needs only a few denoising steps
images = pipe(prompt=prompt, num_inference_steps=4, guidance_scale=8.0).images
```

If you are using a later Neuron compiler version, you can compile the checkpoint yourself with the following lines via [`🤗 optimum-neuron`](https://huggingface.co/docs/optimum-neuron/index) (the compilation takes approximately an hour):

```python
from optimum.neuron import NeuronStableDiffusionXLPipeline

model_id = "stabilityai/stable-diffusion-xl-base-1.0"
unet_id = "latent-consistency/lcm-sdxl"
num_images_per_prompt = 1
# Static input shapes are required for Neuron compilation
input_shapes = {"batch_size": 1, "height": 1024, "width": 1024, "num_images_per_prompt": num_images_per_prompt}
compiler_args = {"auto_cast": "matmul", "auto_cast_type": "bf16"}

# Export the SDXL pipeline, swapping in the LCM UNet
stable_diffusion = NeuronStableDiffusionXLPipeline.from_pretrained(
    model_id, unet_id=unet_id, export=True, **compiler_args, **input_shapes
)
save_directory = "lcm_sdxl_neuron/"
stable_diffusion.save_pretrained(save_directory)

# Push the compiled artifacts to the Hub
stable_diffusion.push_to_hub(save_directory, repository_id="Jingya/lcm-sdxl-neuronx", use_auth_token=True)
```

Feel free to open a pull request and contribute to this repo, thanks 🤗!