Update README.md
README.md CHANGED
@@ -28,12 +28,4 @@ print(output[0]['generated_text'].replace(' ', ''))
 
 **Note**: Please use the `BertTokenizer` for the model vocabulary. DO NOT use the original `BartTokenizer`.
 
-## Training Details
-
-- Optimiser: SGD 0.03 + Adaptive Gradient Clipping 0.1
-- Dataset: 172937863 sentences, pad or truncate to 64 tokens
-- Batch size: 640
-- Number of epochs: 7 epochs + 61440 steps
-- Time: 44.0 hours on Google Cloud TPU v4-16
-
-WandB link: [`1j7zs802`](https://wandb.ai/ayaka/bart-base-cantonese/runs/1j7zs802)
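For context, the `BertTokenizer` note retained in the hunk above describes how the checkpoint should be loaded. Below is a minimal, untested sketch of that usage, assuming the model is published as `Ayaka/bart-base-cantonese` (inferred from the WandB link) and that the `generated_text` print line in the hunk header comes from a `Text2TextGenerationPipeline`; the input sentence is only an illustration, not taken from the README.

```python
# Sketch only: load the checkpoint with BertTokenizer, as the README note requires.
# "Ayaka/bart-base-cantonese" is an assumed repo id inferred from the WandB link.
from transformers import BartForConditionalGeneration, BertTokenizer, Text2TextGenerationPipeline

MODEL_ID = 'Ayaka/bart-base-cantonese'

tokenizer = BertTokenizer.from_pretrained(MODEL_ID)            # not BartTokenizer
model = BartForConditionalGeneration.from_pretrained(MODEL_ID)
pipe = Text2TextGenerationPipeline(model=model, tokenizer=tokenizer)

# Illustrative Cantonese input; the pipeline returns a list of dicts with 'generated_text'.
output = pipe('今日天氣好好', max_length=20)
print(output[0]['generated_text'].replace(' ', ''))  # drop the spaces inserted between CJK tokens
```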