AryaSuprana committed
Commit • 7b7518c
1 Parent(s): 66f2d9e
Create README.md
README.md
ADDED
@@ -0,0 +1,13 @@
+---
+language: "ban"
+datasets:
+- WikiBali
+- Suara Saking Bali
+widget:
+- text: "Kalsium silih <mask> datu kimia antuk simbol Ca miwah wilangan atom 20."
+  example_title: "Conto 1"
+- text: "Tabuan inggih <mask> silih tunggil soroh beburon sane madue kampid."
+  example_title: "Conto 2"
+---
+
+BRATA (Basa Bali Used for Pretraining RoBERTa) is a language model for Basa Bali (Balinese) pretrained with the RoBERTa-base-uncased configuration. The pretraining datasets were collected by extracting WikiBali (the Balinese Wikipedia) and several sources from the Suara Saking Bali website. The model was pretrained on Google Colab Pro with a Tesla P100-PCIE-16GB GPU, for 200 epochs with a batch size of 2. The lowest training loss can be seen in the Training metrics (Metrics) tab.
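
Below is a minimal usage sketch (not part of the committed README) showing how the widget examples in the front matter above could be queried with the Transformers fill-mask pipeline. The repository id `AryaSuprana/BRATA` is an assumption; substitute the actual model id on the Hub.

```python
# Minimal sketch: query the BRATA masked language model with the fill-mask pipeline.
# NOTE: the repo id "AryaSuprana/BRATA" is assumed, not confirmed by this commit.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="AryaSuprana/BRATA")

# One of the widget examples from the README front matter.
results = fill_mask(
    "Kalsium silih <mask> datu kimia antuk simbol Ca miwah wilangan atom 20."
)
for r in results:
    # Each result contains the predicted token and its probability score.
    print(f"{r['token_str']!r}: {r['score']:.4f}")
```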