Update README.md
README.md CHANGED
@@ -44,7 +44,7 @@ This model was trained with the 2 Billion sample English subset of LAION-5B (htt
## Training Procedure

- Training with batch size 32k for
+ Training with batch size 32k for 12B sample of laion2B-en, see https://wandb.ai/rom1504/open-clip/reports/clip-B-32-roberta-base--VmlldzoyOTM0NDQ3

Model is B/32 on visual side, roberta base initialized with pretrained weights on text side.
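For readers who want to try a checkpoint like the one described in this diff, below is a minimal sketch of loading a ViT-B/32 + RoBERTa-base CLIP model with the open_clip library. The architecture name `roberta-ViT-B-32` and pretrained tag `laion2b_s12b_b32k` are assumptions inferred from the details above (B/32 visual tower, roberta-base text tower, laion2B-en, 12B samples, batch size 32k), not confirmed by the diff; check `open_clip.list_pretrained()` for the exact identifiers.

```python
# Minimal sketch: zero-shot classification with a ViT-B/32 + RoBERTa-base CLIP model.
# Model name and pretrained tag are assumptions, see the note above.
import torch
from PIL import Image
import open_clip

model, _, preprocess = open_clip.create_model_and_transforms(
    "roberta-ViT-B-32",             # assumed architecture name
    pretrained="laion2b_s12b_b32k",  # assumed pretrained tag
)
tokenizer = open_clip.get_tokenizer("roberta-ViT-B-32")

image = preprocess(Image.open("example.jpg")).unsqueeze(0)  # any local image
text = tokenizer(["a diagram", "a dog", "a cat"])

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    # Normalize and turn cosine similarities into label probabilities
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print("Label probs:", probs)
```

As a rough sanity check on the schedule described above: at a global batch size of 32,768, seeing 12B samples corresponds to roughly 12e9 / 32,768 ≈ 366k optimizer steps.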