Update README.md

---
license: openrail
---

Results from an experiment using Condition Embedding Perturbation (https://arxiv.org/pdf/2405.20494).

The code change is very simple: add noise during the training loop, between the text_encoder forward pass and the UNet prediction, based on this snippet:

```python
# Add Gaussian noise to the text-conditioning embeddings, scaled by embedding_perturbation / sqrt(hidden dim).
perturbation_deviation = embedding_perturbation / math.sqrt(encoder_hidden_states.shape[2])
perturbation_delta = torch.randn_like(encoder_hidden_states) * perturbation_deviation
encoder_hidden_states = encoder_hidden_states + perturbation_delta
```

`embedding_perturbation` is the tunable lambda.
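
For context, here is a minimal sketch of where that perturbation sits in a typical diffusers-style SD1.5 training step. The surrounding loop, names, and signature are illustrative assumptions, not the actual EveryDream2Trainer code; the three perturbation lines above are the only real change.

```python
import math
import torch
import torch.nn.functional as F

def training_step(batch, text_encoder, unet, vae, noise_scheduler, embedding_perturbation=1.0):
    # Encode images to latents and add scheduler noise, as in a standard SD1.5 fine-tuning loop.
    latents = vae.encode(batch["pixel_values"]).latent_dist.sample() * 0.18215
    noise = torch.randn_like(latents)
    timesteps = torch.randint(0, noise_scheduler.config.num_train_timesteps,
                              (latents.shape[0],), device=latents.device)
    noisy_latents = noise_scheduler.add_noise(latents, noise, timesteps)

    # Text encoder forward pass.
    encoder_hidden_states = text_encoder(batch["input_ids"])[0]

    # Condition Embedding Perturbation: noise the embeddings before they condition the UNet.
    perturbation_deviation = embedding_perturbation / math.sqrt(encoder_hidden_states.shape[2])
    perturbation_delta = torch.randn_like(encoder_hidden_states) * perturbation_deviation
    encoder_hidden_states = encoder_hidden_states + perturbation_delta

    # UNet prediction and the usual epsilon-prediction loss.
    model_pred = unet(noisy_latents, timesteps, encoder_hidden_states).sample
    return F.mse_loss(model_pred.float(), noise.float())
```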

Models are SD1.5, trained for 30 epochs (UNet only) with AdamW8bit, a constant 2e-6 LR, and batch size 12. The EveryDream2Trainer config JSON is included.

The runs were launched as follows:

```bash
#!/bin/bash
# python train.py --config train_ff7_emb_pert000.json --project_name "ff7r embedding_perturbation 0.0" --embedding_perturbation 0.0
python train.py --config train_ff7_emb_pert000.json --project_name "ff7r embedding_perturbation 1.0" --embedding_perturbation 1.0
python train.py --config train_ff7_emb_pert000.json --project_name "ff7r embedding_perturbation 1.7" --embedding_perturbation 1.7
python train.py --config train_ff7_emb_pert000.json --project_name "ff7r embedding_perturbation 3.5" --embedding_perturbation 3.5
```
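
For a sense of scale on those values, a quick worked computation (assuming the stock SD1.5 text encoder, CLIP ViT-L/14, so `encoder_hidden_states.shape[2]` is 768):

```python
import math

# Per-element standard deviation of the added noise for each swept lambda,
# assuming a 768-dim text-embedding hidden size (stock SD1.5 text encoder).
hidden_dim = 768
for lam in (1.0, 1.7, 3.5):
    print(f"embedding_perturbation={lam}: noise std per element ~= {lam / math.sqrt(hidden_dim):.4f}")
# embedding_perturbation=1.0: noise std per element ~= 0.0361
# embedding_perturbation=1.7: noise std per element ~= 0.0613
# embedding_perturbation=3.5: noise std per element ~= 0.1263
```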