xlmr-large-all-CLS-BT / test_eval_ru.txt
Default classification report:
              precision    recall  f1-score   support

           F     0.7375    0.7640    0.7505       500
           T     0.7552    0.7280    0.7413       500

    accuracy                         0.7460      1000
   macro avg     0.7463    0.7460    0.7459      1000
weighted avg     0.7463    0.7460    0.7459      1000
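
The report above follows scikit-learn's classification_report layout. For reference only, a minimal sketch of how such a report is typically produced; the variable names y_true and y_pred and the toy F/T labels are assumptions, not details taken from the repository's evaluation script:

# Minimal sketch (assumed workflow, not the actual evaluation code):
# reproduce a report in the layout shown above with scikit-learn.
from sklearn.metrics import classification_report

y_true = ["F", "T", "F", "T", "T"]   # hypothetical gold labels
y_pred = ["F", "T", "T", "T", "F"]   # hypothetical model predictions

# digits=4 matches the four decimal places used in the report above.
print(classification_report(y_true, y_pred, digits=4))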
ADJ
Accuracy = 0.7333333333333333
Weighted Recall = 0.7333333333333333
Weighted Precision = 0.7472096530920059
Weighted F1 = 0.7370370370370372
Macro Recall = 0.7320574162679425
Macro Precision = 0.7194570135746606
Macro F1 = 0.7222222222222223
ADV
Accuracy = 0.4375
Weighted Recall = 0.4375
Weighted Precision = 0.5113636363636364
Weighted F1 = 0.42647058823529416
Macro Recall = 0.4833333333333333
Macro Precision = 0.4818181818181818
Macro F1 = 0.43529411764705883
NOUN
Accuracy = 0.7491408934707904
Weighted Recall = 0.7491408934707904
Weighted Precision = 0.749099279228316
Weighted F1 = 0.7491112344331189
Macro Recall = 0.7487943262411347
Macro Precision = 0.7489120151371806
Macro F1 = 0.7488443030940755
VERB
Accuracy = 0.7553763440860215
Weighted Recall = 0.7553763440860215
Weighted Precision = 0.7561659946236561
Weighted F1 = 0.7553286126409651
Macro Recall = 0.7557030098013704
Macro Precision = 0.7559027777777778
Macro F1 = 0.755360433604336
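
The per-tag blocks above (ADJ, ADV, NOUN, VERB) give accuracy plus weighted and macro precision/recall/F1 computed on the subset of test items with that POS tag. A minimal sketch of that computation, assuming predictions are kept in a pandas DataFrame; the column names pos, gold and pred and the placeholder rows are assumptions:

# Sketch: per-POS-tag metrics in the format above, assuming each test example
# carries a POS tag alongside its gold and predicted label (hypothetical columns).
import pandas as pd
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

df = pd.DataFrame({
    "pos":  ["ADJ", "ADJ", "NOUN", "VERB"],   # placeholder rows
    "gold": ["F", "T", "T", "F"],
    "pred": ["F", "F", "T", "F"],
})

for tag, group in df.groupby("pos"):
    acc = accuracy_score(group["gold"], group["pred"])
    w_p, w_r, w_f1, _ = precision_recall_fscore_support(
        group["gold"], group["pred"], average="weighted", zero_division=0)
    m_p, m_r, m_f1, _ = precision_recall_fscore_support(
        group["gold"], group["pred"], average="macro", zero_division=0)
    print(tag)
    print(f"Accuracy = {acc}")
    print(f"Weighted Recall = {w_r}")
    print(f"Weighted Precision = {w_p}")
    print(f"Weighted F1 = {w_f1}")
    print(f"Macro Recall = {m_r}")
    print(f"Macro Precision = {m_p}")
    print(f"Macro F1 = {m_f1}")

Note that weighted recall equals accuracy by definition, which is consistent with the numbers reported for every tag above.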