xlmr-large-all-CLS-B / test_eval_ar.txt
Default classification report:
              precision    recall  f1-score   support

           F     0.8337    0.7520    0.7907       500
           T     0.7741    0.8500    0.8103       500

    accuracy                         0.8010      1000
   macro avg     0.8039    0.8010    0.8005      1000
weighted avg     0.8039    0.8010    0.8005      1000
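
The report above follows the layout of scikit-learn's classification_report over the binary T/F labels. A minimal sketch of how such a report could be reproduced (the label lists below are illustrative placeholders, not the actual evaluation data or the original evaluation script):

    from sklearn.metrics import classification_report

    # Hypothetical gold and predicted labels; the real test set has
    # 500 F and 500 T instances (support column above).
    y_true = ["T", "F", "T", "F"]
    y_pred = ["T", "F", "F", "F"]

    # digits=4 matches the four decimal places used in the report above.
    print(classification_report(y_true, y_pred, digits=4))
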
ADJ
Accuracy = 0.7448979591836735
Weighted Recall = 0.7448979591836735
Weighted Precision = 0.7757787325456499
Weighted F1 = 0.742094640053824
Macro Recall = 0.7557651991614256
Macro Precision = 0.7675438596491229
Macro F1 = 0.7435897435897436
ADV
Accuracy = 0.8
Weighted Recall = 0.8
Weighted Precision = 0.8
Weighted F1 = 0.8
Macro Recall = 0.6875
Macro Precision = 0.6875
Macro F1 = 0.6875
NOUN
Accuracy = 0.8036437246963563
Weighted Recall = 0.8036437246963563
Weighted Precision = 0.8076160740590006
Weighted F1 = 0.8028229150154391
Macro Recall = 0.8029016393442623
Macro Precision = 0.8080270067516879
Macro F1 = 0.8026530923228354
VERB
Accuracy = 0.8115577889447236
Weighted Recall = 0.8115577889447236
Weighted Precision = 0.8118021152438288
Weighted F1 = 0.8115518407552622
Macro Recall = 0.8116650251281663
Macro Precision = 0.8117043847241867
Macro F1 = 0.8115565993068314
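
The ADJ, ADV, NOUN, and VERB blocks above give accuracy plus weighted and macro precision/recall/F1 on the test items of each part-of-speech subset (note that weighted recall equals accuracy for single-label classification, as the numbers above show). A minimal sketch of how these figures could be computed with scikit-learn, assuming per-tag gold and predicted label lists (the helper name and example labels are illustrative, not from the original script):

    from sklearn.metrics import accuracy_score, precision_recall_fscore_support

    def report_subset(y_true, y_pred):
        # Accuracy over the subset; equals the weighted recall reported above.
        acc = accuracy_score(y_true, y_pred)
        # Weighted averages weight each class by its support.
        w_p, w_r, w_f1, _ = precision_recall_fscore_support(
            y_true, y_pred, average="weighted", zero_division=0)
        # Macro averages give every class equal weight.
        m_p, m_r, m_f1, _ = precision_recall_fscore_support(
            y_true, y_pred, average="macro", zero_division=0)
        print(f"Accuracy = {acc}")
        print(f"Weighted Recall = {w_r}")
        print(f"Weighted Precision = {w_p}")
        print(f"Weighted F1 = {w_f1}")
        print(f"Macro Recall = {m_r}")
        print(f"Macro Precision = {m_p}")
        print(f"Macro F1 = {m_f1}")

    # Example call on a hypothetical VERB subset:
    report_subset(["T", "F", "T"], ["T", "T", "T"])
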