---
library_name: transformers
language:
- ms
---

# Finetune MLM Malaysian Mistral 191M on MNLI

Original model: https://huggingface.co/mesolitica/malaysian-mistral-191M-MLM-512, trained by https://github.com/aisyahrzk (https://twitter.com/aisyahhhrzk).

**You must load this checkpoint with the model class defined in https://github.com/mesolitica/malaya/blob/master/session/llm2vec/classifier.py**
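A minimal loading sketch, assuming the malaya repository is cloned locally so that `classifier.py` is importable; the class name `Model` and the repository id below are placeholders, not taken from this card:

```python
# Loading sketch. Assumptions (not from this card): the malaya repo is cloned
# next to this script, classifier.py exposes a sequence-classification class
# (imported here under the placeholder name `Model`), and REPO_ID should be
# replaced with this model's Hugging Face repository id.
import sys

sys.path.insert(0, "malaya/session/llm2vec")  # after: git clone https://github.com/mesolitica/malaya

from transformers import AutoTokenizer
from classifier import Model  # placeholder name, check classifier.py for the real class

REPO_ID = "mesolitica/..."  # replace with this model's repository id

tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
model = Model.from_pretrained(REPO_ID)

# Score a premise/hypothesis pair.
inputs = tokenizer(
    "Saya suka makan nasi lemak.",
    "Saya gemar makan nasi lemak.",
    return_tensors="pt",
)
print(model(**inputs).logits.softmax(-1))
```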
## Dataset

1. Source code at https://github.com/mesolitica/malaya/tree/master/session/similarity/hf-t5
2. Prepared dataset at https://huggingface.co/datasets/mesolitica/semisupervised-corpus/tree/main/similarity, which can be fetched as sketched below.
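Since the exact file layout under the `similarity/` folder is not listed here, one way to inspect and fetch the prepared files is via `huggingface_hub` (a sketch; the folder prefix is taken from the link above, the filenames should be checked first):

```python
# Sketch for fetching the prepared dataset files; list the repository first
# and adjust, since the exact filenames and formats are assumptions.
from huggingface_hub import hf_hub_download, list_repo_files

repo_id = "mesolitica/semisupervised-corpus"
files = [
    f for f in list_repo_files(repo_id, repo_type="dataset")
    if f.startswith("similarity/")
]
print(files)  # inspect the available files

local_paths = [hf_hub_download(repo_id, f, repo_type="dataset") for f in files]
```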
## Accuracy

```
              precision    recall  f1-score   support

           0    0.84488   0.90914   0.87583      7165
           1    0.92182   0.86519   0.89261      8872

    accuracy                        0.88483     16037
   macro avg    0.88335   0.88717   0.88422     16037
weighted avg    0.88744   0.88483   0.88511     16037
```
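The table above follows scikit-learn's `classification_report` layout (with `digits=5`); a generic sketch of how such a report is produced, where `y_true` and `y_pred` stand in for the held-out labels and the model's predictions:

```python
# Generic evaluation sketch, not the original evaluation script:
# y_true / y_pred are stand-ins for the test labels and model predictions.
from sklearn.metrics import classification_report

y_true = [0, 1, 1, 0, 1]  # placeholder labels
y_pred = [0, 1, 0, 0, 1]  # placeholder predictions
print(classification_report(y_true, y_pred, digits=5))
```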