---
license: openrail++
datasets:
  - ukr-detect/ukr-toxicity-dataset
language:
  - uk
widget:
  - text: Ти неймовірна!
---

# Binary toxicity classifier for Ukrainian

This is an instance of xlm-roberta-base fine-tuned on the Ukrainian toxicity classification dataset.

The evaluation metrics for binary toxicity classification on a test set are:

| Metric    | Value |
|-----------|-------|
| F1-score  | 0.99  |
| Precision | 0.99  |
| Recall    | 0.99  |
| Accuracy  | 0.99  |

## How to use

```python
from transformers import pipeline

# Load the fine-tuned Ukrainian toxicity classifier from the Hugging Face Hub
classifier = pipeline(
    "text-classification",
    model="ukr-detect/ukr-toxicity-classifier",
)
```
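The pipeline can then be called directly on Ukrainian text. A minimal sketch below uses the widget example from the metadata above; the exact label names in the output depend on the model's configuration.

```python
# Classify a single sentence ("You are incredible!" in Ukrainian).
result = classifier("Ти неймовірна!")
print(result)  # e.g. [{"label": ..., "score": ...}]
```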