## Model variants

This repository contains the checkpoints for the `base` version of the model. The model was trained for one epoch (1.05M steps) on the [Thoroughly Cleaned Italian mC4 Corpus](https://huggingface.co/datasets/gsarti/clean_mc4_it) (~41B words, ~275GB) using 🤗 Datasets and the `google/t5-v1_1-small` improved configuration. The training procedure is made available [on GitHub](https://github.com/gsarti/t5-flax-gcp).
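As a rough illustration of what the T5 v1.1 configuration implies in terms of model size, the sketch below estimates the parameter count of a v1.1 encoder-decoder from its architecture dimensions. The dimension values used here are the standard `google/t5-v1_1-small` ones (an assumption, not taken from this README), and the count ignores the small relative-position-bias tables:

```python
def t5_v1_1_params(d_model, d_ff, n_heads, d_kv, n_layers, vocab=32128):
    """Approximate parameter count for a T5 v1.1 encoder-decoder.

    Omits the small shared relative-position-bias tables; accurate
    to well under 0.1% for the standard configurations.
    """
    inner = n_heads * d_kv
    attn = 3 * d_model * inner + inner * d_model  # q, k, v and output projections
    ffn = 2 * d_model * d_ff + d_ff * d_model     # gated-GELU FFN: wi_0, wi_1, wo
    enc_layer = attn + ffn + 2 * d_model          # plus two RMSNorm scale vectors
    dec_layer = 2 * attn + ffn + 3 * d_model      # self-attn + cross-attn + FFN
    embeddings = 2 * vocab * d_model              # v1.1 unties lm_head from the embedding
    return embeddings + n_layers * (enc_layer + dec_layer) + 2 * d_model

# Standard t5-v1_1-small dimensions (assumed, not stated in this README)
small = t5_v1_1_params(d_model=512, d_ff=1024, n_heads=6, d_kv=64, n_layers=8)
print(f"{small / 1e6:.1f}M")  # → 77.0M
```

This is how the often-quoted ~77M figure for the `small` variant arises; plugging in the dimensions of the other variants reproduces the rest of the table below.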

The following table summarizes the parameters for all available models: