Update README.md
README.md CHANGED
@@ -8,6 +8,10 @@ license: gpl-3.0
 inference: false
 ---
 
+**Deprecation Notice**
+This model is deprecated. New Filipino Transformer models trained on much larger corpora are now available.
+Use [`jcblaise/roberta-tagalog-base`](https://huggingface.co/jcblaise/roberta-tagalog-base) or [`jcblaise/roberta-tagalog-large`](https://huggingface.co/jcblaise/roberta-tagalog-large) instead for better performance.
+
 # BERT Tagalog Base Cased (Whole Word Masking)
 Tagalog version of BERT trained on a large preprocessed text corpus scraped and sourced from the internet. This model is part of a larger research project. We open-source the model to allow greater usage within the Filipino NLP community. This particular version uses whole word masking.
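
Since the card disables the hosted widget (`inference: false`) and points users at the RoBERTa replacements, here is a minimal migration sketch using the Hugging Face `transformers` library. The masked-LM head is an assumption on our part; substitute the Auto class that matches your downstream task.

```python
# Minimal migration sketch: load the recommended RoBERTa Tagalog model
# in place of this deprecated checkpoint. Assumes the `transformers`
# library is installed. The masked-LM head is an assumption; swap in
# the Auto class that fits your task (e.g. sequence classification).
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "jcblaise/roberta-tagalog-base"  # or "jcblaise/roberta-tagalog-large"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
```

The same two calls work for the deprecated checkpoint if you need to reproduce earlier results; only the model ID changes.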