The model architecture is the same as the original OpenLLama model; 12 layers, 7…

## Training Data

The models are trained on the Vietnamese version of Wikipedia.

The generated corpus files are 1.5GB in total, containing approximately 1.3M sentences.
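Corpus figures like the ones above (total size on disk and sentence count) can be measured with a short script. This is a minimal sketch, not the authors' actual pipeline: the file paths are placeholders, and the regex splitter is a naive stand-in for a proper Vietnamese sentence segmenter.

```python
import os
import re

def corpus_stats(paths):
    """Return (total_bytes, sentence_count) for a list of UTF-8 text files.

    Sentences are approximated by splitting on ., !, ? followed by
    whitespace -- an illustrative heuristic, not a real segmenter.
    """
    total_bytes = 0
    sentences = 0
    for path in paths:
        total_bytes += os.path.getsize(path)
        with open(path, encoding="utf-8") as f:
            text = f.read()
        # Count non-empty chunks after splitting on sentence punctuation.
        sentences += len([s for s in re.split(r"[.!?]+\s+", text) if s.strip()])
    return total_bytes, sentences
```

Running this over the generated corpus files would yield the size and sentence totals quoted in the section above.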