jcblaise committed
Commit a7ff472
1 Parent(s): e5dcd63

Update README.md

Files changed (1)
  1. README.md +5 -10
README.md CHANGED
@@ -9,6 +9,10 @@ license: gpl-3.0
  inference: false
  ---
 
+ **Deprecation Notice**
+ This model is deprecated. New Filipino Transformer models trained on much larger corpora are available.
+ Use [`jcblaise/roberta-tagalog-base`](https://huggingface.co/jcblaise/roberta-tagalog-base) or [`jcblaise/roberta-tagalog-large`](https://huggingface.co/jcblaise/roberta-tagalog-large) instead for better performance.
+
  # DistilBERT Tagalog Base Cased
  Tagalog version of DistilBERT, distilled from [`bert-tagalog-base-cased`](https://huggingface.co/jcblaise/bert-tagalog-base-cased). This model is part of a larger research project. We open-source the model to allow greater usage within the Filipino NLP community.
 
@@ -32,15 +36,6 @@ Finetuning scripts and other utilities we use for our projects can be found in o
  All model details and training setups can be found in our papers. If you use our model or find it useful in your projects, please cite our work:
 
  ```
- @inproceedings{localization2020cruz,
-   title={{Localization of Fake News Detection via Multitask Transfer Learning}},
-   author={Cruz, Jan Christian Blaise and Tan, Julianne Agatha and Cheng, Charibeth},
-   booktitle={Proceedings of The 12th Language Resources and Evaluation Conference},
-   pages={2589--2597},
-   year={2020},
-   url={https://www.aclweb.org/anthology/2020.lrec-1.315}
- }
-
  @article{cruz2020establishing,
    title={Establishing Baselines for Text Classification in Low-Resource Languages},
    author={Cruz, Jan Christian Blaise and Cheng, Charibeth},
@@ -60,4 +55,4 @@ All model details and training setups can be found in our papers. If you use our
  Data used to train this model as well as other benchmark datasets in Filipino can be found in my website at https://blaisecruz.com
 
  ## Contact
- If you have questions, concerns, or if you just want to chat about NLP and low-resource languages in general, you may reach me through my work email at jan_christian_cruz@dlsu.edu.ph
+ If you have questions, concerns, or if you just want to chat about NLP and low-resource languages in general, you may reach me through my work email at me@blaisecruz.com
 
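Since the commit points users to the replacement checkpoints, a minimal usage sketch may help. It assumes the standard Hugging Face `transformers` API; the model IDs come from the deprecation notice above, while the fill-mask head and the example sentence are illustrative choices, not part of the model card:

```python
# Minimal sketch (assumption: standard Hugging Face transformers API).
# The model IDs come from the deprecation notice; fill-mask is just one
# way to smoke-test the checkpoint.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "jcblaise/roberta-tagalog-base"  # or "jcblaise/roberta-tagalog-large"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# RoBERTa-style tokenizers typically use "<mask>"; read it from the
# tokenizer rather than hard-coding it.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill_mask(f"Magandang {tokenizer.mask_token} sa inyong lahat!"):
    print(prediction["token_str"], prediction["score"])
```

For a downstream task, swap `AutoModelForMaskedLM` for the appropriate head (e.g. `AutoModelForSequenceClassification`), as with any `transformers` checkpoint.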