Commit f581f61 (parent: 68fd13e) by AlekseyCalvin: Update README.md
Model trained with [AI Toolkit by Ostris](https://github.com/ostris/ai-toolkit).<br>
Fine-tuned using the **Google Colab Notebook** version of **ai-toolkit**.<br>
I used an A100 via Colab Pro.<br>
However, training SD3.5 may also work on free-tier Colab, or with lower VRAM in general:<br>
Especially if one were to use: *a lower rank (try 4 or 8), a smaller dataset (which reduces caching/bucketing/pre-loading overhead), a batch size of 1, the adamw8bit optimizer, 512 resolution, maybe the `low_vram: true` setting, and possibly an alternate quantization variant.* <br>
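The low-VRAM tweaks above map onto an ai-toolkit YAML config roughly like this. This is a hedged sketch, not a tested recipe: the key names follow ai-toolkit's published example configs, but the dataset path is a placeholder and the exact values are assumptions to verify against Ostris' SD3.5 example config.

```yaml
# Hypothetical low-VRAM overrides for an ai-toolkit SD3.5 LoRA run.
# Key names follow ai-toolkit's example configs; values and the
# dataset path below are placeholders, not a verified recipe.
network:
  type: "lora"
  linear: 8          # lower rank: try 4 or 8
  linear_alpha: 8
train:
  batch_size: 1
  optimizer: "adamw8bit"
datasets:
  - folder_path: "/content/dataset"   # assumed Colab dataset path
    resolution: [512]                 # 512 only, instead of multi-resolution
model:
  name_or_path: "stabilityai/stable-diffusion-3.5-large"
  quantize: true                      # quantization variant, per the example config
  low_vram: true
```

These keys would replace the corresponding sections of a full config; the rest of the example config (trigger word, save/sample settings, steps) stays as in Ostris' template.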
Generally, VRAM expenditure during training tends to be lower than for Flux. So, try it! I certainly will.<br>
**To use on Colab**, modify a Flux template Notebook from [here](https://github.com/ostris/ai-toolkit/tree/main/notebooks) with parameters from Ostris' example config for SD3.5 [here](https://github.com/ostris/ai-toolkit/blob/main/config/examples/train_lora_sd35_large_24gb.yaml)!
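Concretely, a Colab cell sequence might look like the following. An untested sketch: the clone-and-run pattern follows ai-toolkit's README, but editing the example config in place (rather than writing a fresh one) is an assumption.

```shell
# Hypothetical Colab cell sequence (untested sketch):
# fetch ai-toolkit and launch SD3.5 LoRA training.
git clone https://github.com/ostris/ai-toolkit.git
cd ai-toolkit
pip install -r requirements.txt
# Edit config/examples/train_lora_sd35_large_24gb.yaml first
# (dataset path, trigger word, any low-VRAM overrides), then:
python run.py config/examples/train_lora_sd35_large_24gb.yaml
```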
  ```