---
library_name: transformers
license: mit
datasets:
- Jinyan1/COLING_2025_MGT_en
language:
- en
metrics:
- f1
base_model:
- microsoft/deberta-v3-base
---

# 🏆 Winning model for the COLING 2025 Workshop on Detecting AI Generated Content (DAIGenC)

## Model description

A **binary classification model** for detecting machine-generated text fragments that achieved **first place** in the monolingual subtask of the [COLING 2025 GenAI Detection Task](https://genai-content-detection.gitlab.io). The model is a fine-tuned version of DeBERTa-v3-base trained in multi-task mode with a shared encoder and three parallel classification heads. Only one head is used for inference.
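The shared-encoder, multi-head layout can be sketched as follows. This is an illustrative PyTorch sketch, not the released implementation: the hidden size, head shapes, random pooled input, and head ordering are all assumptions.

```python
import torch
import torch.nn as nn

class MultiTaskClassifier(nn.Module):
    """Hypothetical sketch: shared encoder output feeding three parallel heads."""
    def __init__(self, hidden_size=768, num_labels=2, num_heads=3):
        super().__init__()
        # Each head is a linear classifier over the shared pooled representation;
        # the shared DeBERTa-v3-base encoder itself is stubbed out here.
        self.heads = nn.ModuleList(
            nn.Linear(hidden_size, num_labels) for _ in range(num_heads)
        )

    def forward(self, pooled_hidden):
        # All heads are trained jointly; at inference only one head is read out.
        return [head(pooled_hidden) for head in self.heads]

model = MultiTaskClassifier()
pooled = torch.randn(4, 768)           # stand-in for a batch of 4 pooled encoder outputs
logits_per_head = model(pooled)        # one logits tensor per head
inference_logits = logits_per_head[0]  # the single head used at inference
```

The point of the sketch is the shape of the computation: three heads share one encoder during training, but prediction only ever reads one of them.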
## Usage

```python
import torch

# ... (tokenizer and model initialisation)
model.eval()

inputs = tokenizer(
    ['Hello, Thanks for sharing your health concern with us. I have gone through your query and here are your answers: 1. If you have regular cycles, there is no further need to use any medication to regulate cycles. 2. Establishment of regular ovulation and timing of intercourse properly is necessary. 3. If you want to conceive quickly, you have to get further evaluation and plan management. Hope this helps.',
     'He might have small intestinal TB rather than stomach TB. Amoebas also involves small intestine/some part of large intestine. If he has taken medicines for both diseases in form of a Complete Course, he should be fine. U can go for an oral+iv contrast CT scan of him. Now, the diagnosis of a lax cardiac can be confirmed by an upper GI endoscopy with manometry (if available). Lax cardiac may cause acidity with reflux.'],
    max_length=512,
    truncation=True,
    padding="max_length",
    # ...
)

# ... (forward pass over `inputs`)
torch.softmax(
    # ...
).detach().cpu()[:, 1].tolist()
```
## Limitations and bias

This model is limited by its training dataset, which consists of machine-generated and human-written texts from particular sources and domains collected over a certain period. It may therefore not fit all use cases or domains. In addition, the model may produce false positives in some cases; their rate can be adjusted via the classification threshold.
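Adjusting the threshold is a one-line change on the machine-class probabilities (index `1` in the usage snippet): raising it trades recall for fewer false positives. The probabilities and the 0.5 / 0.8 cut-offs below are illustrative assumptions, not values from the model.

```python
# Hypothetical machine-class probabilities, as produced by the usage snippet above.
machine_probs = [0.92, 0.41, 0.55, 0.08]

# Default 0.5 cut-off: anything more likely machine than human is flagged.
default_labels = [1 if p >= 0.5 else 0 for p in machine_probs]

# Stricter 0.8 cut-off: fewer false positives, at the cost of missed detections.
strict_labels = [1 if p >= 0.8 else 0 for p in machine_probs]
```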
## Quality

Quality on the declared test set of the competition:

| Model | Main Score (F1 Macro) | Auxiliary Score (F1 Micro) |
| --------------------- | --------------------- | -------------------------- |
| MTL DeBERTa-v3-base (*our*) | **0.8307** | **0.8311** |
| Single-task DeBERTa-v3-base | 0.7852 | 0.7891 |
| *baseline* | 0.7342 | 0.7343 |
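Both columns of the table are standard scikit-learn metrics; a minimal sketch of how they are computed, using toy labels that are assumptions rather than competition data:

```python
from sklearn.metrics import f1_score

# Toy predictions (0 = human, 1 = machine), not the competition data.
y_true = [0, 0, 1, 1, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0]

macro_f1 = f1_score(y_true, y_pred, average="macro")  # main score
micro_f1 = f1_score(y_true, y_pred, average="micro")  # auxiliary score
```

Macro-F1 averages the per-class F1 scores, so the rarer class counts as much as the common one; micro-F1 aggregates all decisions and, for single-label classification, equals accuracy.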
## Training procedure

This model was fine-tuned on the training split of the English version of the competition dataset [MGT Detection Task 1](https://huggingface.co/datasets/Jinyan1/COLING_2025_MGT_en). Classes: `0 - human`, `1 - machine`. The model was fine-tuned in two stages on a single NVIDIA RTX 3090 GPU with the hyperparameters described in [our paper](https://arxiv.org/abs/2411.11736).

## Your Own Fine-Tune

If you would like to fine-tune this architecture on your own data, domains, or base models, our training and inference code with full instructions is available on [GitHub](https://github.com/Advacheck-OU/ai-detector-coling2025).
# Citation