---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: distilbert-base-uncased-xsum-factuality
  results: []
---

# distilbert-base-uncased-xsum-factuality

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset (the Trainer did not record it).
It achieves the following results on the evaluation set:
- Loss: 0.6850
- Accuracy: 0.6332
- F1: 0.6212
- Precision: 0.6526
- Recall: 0.6332

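For quick experimentation, here is a minimal inference sketch. The repository id and the expected input format are assumptions: the card does not record whether the classifier takes the summary alone or a source/summary pair, and the default `LABEL_0`/`LABEL_1` names stand in for the real labels.

```python
from transformers import pipeline

# Hypothetical repo id (the output directory name); prefix it with the
# owning namespace on the Hub.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-xsum-factuality",
)

# Input format is an assumption; the card does not say whether the model
# expects the summary alone or a source/summary pair.
summary = "The minister announced the policy on Monday."
print(classifier(summary))
# -> e.g. [{'label': 'LABEL_1', 'score': 0.62}]  (label names are defaults)
```
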
## Model description

Judging by the model name, this is DistilBERT with a sequence-classification head, fine-tuned to judge the factuality of XSum summaries. The early validation rows in the results table (accuracy 0.5, precision 0.25) are consistent with a binary label set over a roughly balanced evaluation split, but the exact task definition was not recorded by the Trainer.

## Intended uses & limitations

Presumably intended to flag factually inconsistent abstractive summaries. Its outputs should be treated as weak signals rather than reliable judgments: the best evaluation accuracy is about 0.63 against a 0.5 chance baseline, and the training data, label semantics, and expected input format are not documented here.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 1e-06
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 7

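The list above maps directly onto `TrainingArguments` in Transformers 4.35. The sketch below is a reconstruction under that assumption; the datasets and preprocessing are not recorded in this card, so only the values from the list are grounded. The stated Adam betas and epsilon are the Trainer's optimizer defaults, and the 20-step evaluation cadence is read off the results table.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

# num_labels=2 is an assumption based on the chance-level (0.5) early metrics.
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

args = TrainingArguments(
    output_dir="distilbert-base-uncased-xsum-factuality",
    learning_rate=1e-6,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=7,
    evaluation_strategy="steps",
    eval_steps=20,     # matches the 20-step cadence of the results table
    logging_steps=20,  # the training-loss column is also logged every 20 steps
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the default optimizer.

# Wiring it up requires the (unrecorded) datasets:
# Trainer(model=model, args=args, train_dataset=..., eval_dataset=...,
#         tokenizer=tokenizer, compute_metrics=compute_metrics).train()
```
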
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.7275        | 0.13  | 20   | 0.6961          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.672         | 0.27  | 40   | 0.6959          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.6743        | 0.4   | 60   | 0.6958          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.7083        | 0.53  | 80   | 0.6954          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.7069        | 0.67  | 100  | 0.6950          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.7094        | 0.8   | 120  | 0.6944          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.6825        | 0.93  | 140  | 0.6939          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.6965        | 1.07  | 160  | 0.6934          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.6848        | 1.2   | 180  | 0.6924          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.6991        | 1.33  | 200  | 0.6916          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.6803        | 1.47  | 220  | 0.6916          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.6991        | 1.6   | 240  | 0.6918          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.718         | 1.73  | 260  | 0.6910          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.6908        | 1.87  | 280  | 0.6905          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.7071        | 2.0   | 300  | 0.6903          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.6866        | 2.13  | 320  | 0.6902          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.7129        | 2.27  | 340  | 0.6897          | 0.5      | 0.3333 | 0.25      | 0.5    |
| 0.6852        | 2.4   | 360  | 0.6895          | 0.4985   | 0.3327 | 0.2496    | 0.4985 |
| 0.686         | 2.53  | 380  | 0.6888          | 0.4985   | 0.3479 | 0.4804    | 0.4985 |
| 0.7026        | 2.67  | 400  | 0.6888          | 0.5030   | 0.3501 | 0.5508    | 0.5030 |
| 0.709         | 2.8   | 420  | 0.6882          | 0.5015   | 0.3494 | 0.5231    | 0.5015 |
| 0.7102        | 2.93  | 440  | 0.6877          | 0.5150   | 0.4151 | 0.5472    | 0.5150 |
| 0.7141        | 3.07  | 460  | 0.6877          | 0.5135   | 0.4142 | 0.5418    | 0.5135 |
| 0.6761        | 3.2   | 480  | 0.6874          | 0.5195   | 0.4375 | 0.5467    | 0.5195 |
| 0.6923        | 3.33  | 500  | 0.6872          | 0.5165   | 0.4355 | 0.5386    | 0.5165 |
| 0.6735        | 3.47  | 520  | 0.6873          | 0.5195   | 0.4375 | 0.5467    | 0.5195 |
| 0.6907        | 3.6   | 540  | 0.6871          | 0.5195   | 0.4375 | 0.5467    | 0.5195 |
| 0.7049        | 3.73  | 560  | 0.6872          | 0.5090   | 0.4114 | 0.5267    | 0.5090 |
| 0.6839        | 3.87  | 580  | 0.6868          | 0.5195   | 0.4375 | 0.5467    | 0.5195 |
| 0.6914        | 4.0   | 600  | 0.6867          | 0.5374   | 0.4734 | 0.5729    | 0.5374 |
| 0.6785        | 4.13  | 620  | 0.6867          | 0.5210   | 0.4385 | 0.5508    | 0.5210 |
| 0.6806        | 4.27  | 640  | 0.6864          | 0.5329   | 0.4701 | 0.5626    | 0.5329 |
| 0.6832        | 4.4   | 660  | 0.6863          | 0.5734   | 0.5362 | 0.6079    | 0.5734 |
| 0.676         | 4.53  | 680  | 0.6863          | 0.5479   | 0.4940 | 0.5835    | 0.5479 |
| 0.6957        | 4.67  | 700  | 0.6861          | 0.5644   | 0.5219 | 0.5998    | 0.5644 |
| 0.6786        | 4.8   | 720  | 0.6860          | 0.5838   | 0.5497 | 0.6204    | 0.5838 |
| 0.6845        | 4.93  | 740  | 0.6860          | 0.5689   | 0.5255 | 0.6086    | 0.5689 |
| 0.6917        | 5.07  | 760  | 0.6858          | 0.6108   | 0.5895 | 0.6397    | 0.6108 |
| 0.6941        | 5.2   | 780  | 0.6856          | 0.6213   | 0.6031 | 0.6485    | 0.6213 |
| 0.6904        | 5.33  | 800  | 0.6855          | 0.6332   | 0.6176 | 0.6593    | 0.6332 |
| 0.6722        | 5.47  | 820  | 0.6854          | 0.6332   | 0.6176 | 0.6593    | 0.6332 |
| 0.6947        | 5.6   | 840  | 0.6853          | 0.6362   | 0.6239 | 0.6568    | 0.6362 |
| 0.706         | 5.73  | 860  | 0.6852          | 0.6347   | 0.6225 | 0.6547    | 0.6347 |
| 0.6733        | 5.87  | 880  | 0.6852          | 0.6392   | 0.6266 | 0.6611    | 0.6392 |
| 0.6925        | 6.0   | 900  | 0.6851          | 0.6437   | 0.6306 | 0.6676    | 0.6437 |
| 0.6782        | 6.13  | 920  | 0.6851          | 0.6377   | 0.6252 | 0.6589    | 0.6377 |
| 0.7056        | 6.27  | 940  | 0.6851          | 0.6377   | 0.6252 | 0.6589    | 0.6377 |
| 0.6972        | 6.4   | 960  | 0.6850          | 0.6332   | 0.6212 | 0.6526    | 0.6332 |
| 0.7065        | 6.53  | 980  | 0.6850          | 0.6332   | 0.6212 | 0.6526    | 0.6332 |
| 0.6754        | 6.67  | 1000 | 0.6850          | 0.6317   | 0.6199 | 0.6505    | 0.6317 |
| 0.6751        | 6.8   | 1020 | 0.6850          | 0.6332   | 0.6212 | 0.6526    | 0.6332 |
| 0.6904        | 6.93  | 1040 | 0.6850          | 0.6332   | 0.6212 | 0.6526    | 0.6332 |

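One detail the table does pin down: the recall column equals the accuracy column at every step, which is exactly what weighted-average recall yields, so the metrics were most likely computed with scikit-learn's `average="weighted"`. A `compute_metrics` sketch under that assumption:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # zero_division=0 reproduces the early rows, where the model predicts a
    # single class (precision 0.25 = weighted average of 0.5 and 0.0).
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```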

### Framework versions

- Transformers 4.35.0
- PyTorch 2.0.1
- Datasets 2.14.6
- Tokenizers 0.14.1