xlmr-large-all-CLS-BT / test_eval_en.txt
Default classification report:
              precision    recall  f1-score   support

           F     0.8941    0.8780    0.8860       500
           T     0.8802    0.8960    0.8880       500

    accuracy                         0.8870      1000
   macro avg     0.8871    0.8870    0.8870      1000
weighted avg     0.8871    0.8870    0.8870      1000
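
The table above follows the layout of scikit-learn's classification_report. Below is a minimal sketch of how a report in this form could be regenerated from saved predictions; the file names and the one-label-per-line loading helper are assumptions for illustration, not files in this repository.

# Sketch: regenerating a report of this shape with scikit-learn.
# File names and the one-label-per-line format are assumptions for illustration.
from sklearn.metrics import classification_report

def load_labels(path):
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]

y_true = load_labels("test_gold_en.txt")  # hypothetical gold-label file
y_pred = load_labels("test_pred_en.txt")  # hypothetical prediction file

# digits=4 matches the four decimal places shown above.
print(classification_report(y_true, y_pred, digits=4))
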
ADJ
Accuracy = 0.875
Weighted Recall = 0.875
Weighted Precision = 0.87578125
Weighted F1 = 0.8747086247086249
Macro Recall = 0.8730650154798762
Macro Precision = 0.8765625
Macro F1 = 0.8741258741258742
ADV
Accuracy = 0.7333333333333333
Weighted Recall = 0.7333333333333333
Weighted Precision = 0.7642857142857142
Weighted F1 = 0.7357142857142857
Macro Recall = 0.75
Macro Precision = 0.7410714285714286
Macro F1 = 0.7321428571428572
NOUN
Accuracy = 0.8958333333333334
Weighted Recall = 0.8958333333333334
Weighted Precision = 0.8958405073461891
Weighted F1 = 0.8958337069811766
Macro Recall = 0.8958390128416673
Macro Precision = 0.8958333333333333
Macro F1 = 0.8958329596854901
VERB
Accuracy = 0.8926174496644296
Weighted Recall = 0.8926174496644296
Weighted Precision = 0.8926174496644296
Weighted F1 = 0.8926174496644297
Macro Recall = 0.8926174496644296
Macro Precision = 0.8926174496644296
Macro F1 = 0.8926174496644297
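
The per-tag blocks above (ADJ, ADV, NOUN, VERB) list accuracy plus weighted and macro precision/recall/F1 for the T/F decision within each POS tag. A sketch of how such figures could be recomputed with scikit-learn, assuming each test item carries a POS tag together with its gold and predicted label; the (tag, gold, pred) tuple structure and the function name are illustrative assumptions.

# Sketch: per-POS-tag metrics in the order listed above (accuracy, then
# weighted recall/precision/F1, then macro recall/precision/F1).
# The (tag, gold, pred) tuple structure is an assumed representation.
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def report_per_tag(items):
    tags = sorted({tag for tag, _, _ in items})
    for tag in tags:
        gold = [g for t, g, _ in items if t == tag]
        pred = [p for t, _, p in items if t == tag]
        print(tag)
        print("Accuracy =", accuracy_score(gold, pred))
        for avg in ("weighted", "macro"):
            label = avg.capitalize()
            print(f"{label} Recall =", recall_score(gold, pred, average=avg))
            print(f"{label} Precision =", precision_score(gold, pred, average=avg))
            print(f"{label} F1 =", f1_score(gold, pred, average=avg))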