robzchhangte committed
Commit: e7d40c3
Parent: 48fbb5f

Edited Readme file

Files changed (1):
  1. README.md +21 -0
README.md CHANGED
@@ -49,3 +49,24 @@ for prediction in predictions:
  print(prediction["sequence"].replace("[CLS]", "").replace("[SEP]", "").strip(), "| Score:", prediction["score"])

  ```
+
+ **Citation**
+ @article{10.1145/3666003,
+ author = {Lalramhluna, Robert and Dash, Sandeep and Pakray, Partha},
+ title = {MizBERT: A Mizo BERT Model},
+ year = {2024},
+ issue_date = {July 2024},
+ publisher = {Association for Computing Machinery},
+ address = {New York, NY, USA},
+ volume = {23},
+ number = {7},
+ issn = {2375-4699},
+ url = {https://doi.org/10.1145/3666003},
+ doi = {10.1145/3666003},
+ abstract = {This research investigates the utilization of pre-trained BERT transformers within the context of the Mizo language. BERT, an abbreviation for Bidirectional Encoder Representations from Transformers, symbolizes Google’s forefront neural network approach to Natural Language Processing (NLP), renowned for its remarkable performance across various NLP tasks. However, its efficacy in handling low-resource languages such as Mizo remains largely unexplored. In this study, we introduce MizBERT, a specialized Mizo language model. Through extensive pre-training on a corpus collected from diverse online platforms, MizBERT has been tailored to accommodate the nuances of the Mizo language. Evaluation of MizBERT’s capabilities is conducted using two primary metrics: masked language modeling and perplexity, yielding scores of 76.12\% and 3.2565, respectively. Additionally, its performance in a text classification task is examined. Results indicate that MizBERT outperforms both the Multilingual BERT model and the Support Vector Machine algorithm, achieving an accuracy of 98.92\%. This underscores MizBERT’s proficiency in understanding and processing the intricacies inherent in the Mizo language.},
+ journal = {ACM Trans. Asian Low-Resour. Lang. Inf. Process.},
+ month = {jun},
+ articleno = {99},
+ numpages = {14},
+ keywords = {Mizo, BERT, pre-trained language model}
+ }
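
The hunk above only shows the last lines of the README's fill-mask example. For context, below is a minimal sketch of the kind of usage that print loop belongs to; the model ID `robzchhangte/MizBERT` and the example masked sentence are illustrative assumptions, not part of this commit.

```python
# Minimal sketch of a Hugging Face fill-mask pipeline ending in the print loop
# shown in the diff context above. Assumptions (not from this commit): the model ID
# "robzchhangte/MizBERT" and the example Mizo sentence.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="robzchhangte/MizBERT")

# Any text containing a single [MASK] token works; this sentence is a hypothetical placeholder.
predictions = fill_mask("Mizoram chu hmun [MASK] tak a ni.")

for prediction in predictions:
    # Strip BERT's special tokens before printing, as in the README example.
    print(prediction["sequence"].replace("[CLS]", "").replace("[SEP]", "").strip(),
          "| Score:", prediction["score"])
```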