---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-base-cased-finetuned-ner-cadec-no-iob
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bert-base-cased-finetuned-ner-cadec-no-iob

This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unspecified dataset; the model name and the per-entity metrics (ADR, Disease, Drug, Finding, Symptom) suggest the CADEC corpus of patient-reported adverse drug events, with entity labels used without IOB prefixes.
It achieves the following results on the evaluation set:
- Loss: 0.4487
- Precision: 0.6037
- Recall: 0.6491
- F1: 0.6256
- Accuracy: 0.9313
- Adr Precision: 0.5441
- Adr Recall: 0.6103
- Adr F1: 0.5753
- Disease Precision: 0.5
- Disease Recall: 0.375
- Disease F1: 0.4286
- Drug Precision: 0.8649
- Drug Recall: 0.8889
- Drug F1: 0.8767
- Finding Precision: 0.2903
- Finding Recall: 0.2812
- Finding F1: 0.2857
- Symptom Precision: 0.4839
- Symptom Recall: 0.5172
- Symptom F1: 0.5000
- Macro Avg F1: 0.5333
- Weighted Avg F1: 0.6256

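A minimal inference sketch, assuming the checkpoint is hosted under a repository id matching the model name above (hypothetical id; adjust to wherever the weights actually live). Because the model name indicates entity labels without IOB prefixes, `aggregation_strategy="simple"` is used so the pipeline merges consecutive word pieces that share a predicted label:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Hypothetical repository id -- replace with the actual location of this checkpoint.
model_id = "bert-base-cased-finetuned-ner-cadec-no-iob"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# "simple" groups consecutive word pieces with the same predicted label,
# which matches a prefix-free (no-IOB) tag set.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(ner("Took Lipitor for two weeks and developed severe muscle pain."))
```
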
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a minimal `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 35

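A sketch of how these settings map onto `transformers.TrainingArguments` for a `Trainer`-based run (dataset loading, tokenization, and the metric callback are omitted; the output directory name is illustrative). The Adam betas/epsilon and the linear schedule listed above are the `Trainer` defaults, so only the explicitly chosen values need to be set:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-cased-finetuned-ner-cadec-no-iob",  # illustrative name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=35,
    lr_scheduler_type="linear",  # default, shown for completeness
)
```
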
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy | Adr Precision | Adr Recall | Adr F1 | Disease Precision | Disease Recall | Disease F1 | Drug Precision | Drug Recall | Drug F1 | Finding Precision | Finding Recall | Finding F1 | Symptom Precision | Symptom Recall | Symptom F1 | Macro Avg F1 | Weighted Avg F1 |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|:-------------:|:----------:|:------:|:-----------------:|:--------------:|:----------:|:--------------:|:-----------:|:-------:|:-----------------:|:--------------:|:----------:|:-----------------:|:--------------:|:----------:|:------------:|:---------------:|
| No log        | 1.0   | 125  | 0.2340          | 0.5044    | 0.6003 | 0.5482 | 0.9191   | 0.4397        | 0.5938     | 0.5053 | 0.3529            | 0.375          | 0.3636     | 0.7574         | 0.85        | 0.8010  | 0.1818            | 0.0625         | 0.0930     | 0.0               | 0.0            | 0.0        | 0.3526       | 0.5328          |
| No log        | 2.0   | 250  | 0.2068          | 0.5546    | 0.6227 | 0.5867 | 0.9253   | 0.4770        | 0.6        | 0.5315 | 0.55              | 0.3438         | 0.4231     | 0.8256         | 0.8944      | 0.8587  | 0.3158            | 0.1875         | 0.2353     | 0.4286            | 0.1034         | 0.1667     | 0.4430       | 0.5782          |
| No log        | 3.0   | 375  | 0.2031          | 0.5633    | 0.6161 | 0.5885 | 0.9281   | 0.5150        | 0.5670     | 0.5397 | 0.5               | 0.4062         | 0.4483     | 0.8093         | 0.8722      | 0.8396  | 0.2               | 0.2188         | 0.2090     | 0.375             | 0.5172         | 0.4348     | 0.4943       | 0.5891          |
| 0.209         | 4.0   | 500  | 0.2331          | 0.5483    | 0.6069 | 0.5761 | 0.9273   | 0.5009        | 0.5897     | 0.5417 | 0.0               | 0.0            | 0.0        | 0.8404         | 0.8778      | 0.8587  | 0.14              | 0.2188         | 0.1707     | 0.5               | 0.3103         | 0.3830     | 0.3908       | 0.5724          |
| 0.209         | 5.0   | 625  | 0.2376          | 0.5878    | 0.6491 | 0.6169 | 0.9324   | 0.5129        | 0.6165     | 0.5599 | 0.5312            | 0.5312         | 0.5312     | 0.8703         | 0.8944      | 0.8822  | 0.1429            | 0.0625         | 0.0870     | 0.5652            | 0.4483         | 0.5000     | 0.5121       | 0.6130          |
| 0.209         | 6.0   | 750  | 0.2523          | 0.5646    | 0.6346 | 0.5975 | 0.9258   | 0.5114        | 0.6021     | 0.5530 | 0.4               | 0.375          | 0.3871     | 0.8649         | 0.8889      | 0.8767  | 0.0857            | 0.0938         | 0.0896     | 0.4516            | 0.4828         | 0.4667     | 0.4746       | 0.6000          |
| 0.209         | 7.0   | 875  | 0.2753          | 0.5748    | 0.6438 | 0.6073 | 0.9249   | 0.5209        | 0.6165     | 0.5647 | 0.4762            | 0.3125         | 0.3774     | 0.8670         | 0.9056      | 0.8859  | 0.1458            | 0.2188         | 0.1750     | 0.5               | 0.3103         | 0.3830     | 0.4772       | 0.6096          |
| 0.0561        | 8.0   | 1000 | 0.2769          | 0.5868    | 0.6557 | 0.6193 | 0.9284   | 0.5288        | 0.6247     | 0.5728 | 0.6               | 0.375          | 0.4615     | 0.8703         | 0.8944      | 0.8822  | 0.2424            | 0.25           | 0.2462     | 0.3611            | 0.4483         | 0.4000     | 0.5125       | 0.6212          |
| 0.0561        | 9.0   | 1125 | 0.3161          | 0.5719    | 0.6240 | 0.5968 | 0.9281   | 0.5091        | 0.5794     | 0.5419 | 0.5263            | 0.3125         | 0.3922     | 0.8757         | 0.9         | 0.8877  | 0.1739            | 0.25           | 0.2051     | 0.48              | 0.4138         | 0.4444     | 0.4943       | 0.5998          |
| 0.0561        | 10.0  | 1250 | 0.3101          | 0.5867    | 0.6385 | 0.6115 | 0.9297   | 0.5343        | 0.5938     | 0.5625 | 0.4839            | 0.4688         | 0.4762     | 0.8791         | 0.8889      | 0.8840  | 0.1818            | 0.25           | 0.2105     | 0.4483            | 0.4483         | 0.4483     | 0.5163       | 0.6160          |
| 0.0561        | 11.0  | 1375 | 0.3321          | 0.5862    | 0.6412 | 0.6125 | 0.9295   | 0.5245        | 0.5959     | 0.5579 | 0.6               | 0.4688         | 0.5263     | 0.8556         | 0.8889      | 0.8719  | 0.2286            | 0.25           | 0.2388     | 0.4516            | 0.4828         | 0.4667     | 0.5323       | 0.6142          |
| 0.0206        | 12.0  | 1500 | 0.3459          | 0.5923    | 0.6517 | 0.6206 | 0.9303   | 0.5323        | 0.6124     | 0.5695 | 0.5517            | 0.5            | 0.5246     | 0.875          | 0.8944      | 0.8846  | 0.2581            | 0.25           | 0.2540     | 0.375             | 0.4138         | 0.3934     | 0.5252       | 0.6224          |
| 0.0206        | 13.0  | 1625 | 0.3489          | 0.5866    | 0.6214 | 0.6035 | 0.9270   | 0.5327        | 0.5876     | 0.5588 | 0.4667            | 0.4375         | 0.4516     | 0.8370         | 0.8556      | 0.8462  | 0.24              | 0.1875         | 0.2105     | 0.4138            | 0.4138         | 0.4138     | 0.4962       | 0.6023          |
| 0.0206        | 14.0  | 1750 | 0.3762          | 0.5709    | 0.6214 | 0.5951 | 0.9270   | 0.5047        | 0.5588     | 0.5303 | 0.5               | 0.4375         | 0.4667     | 0.8811         | 0.9056      | 0.8932  | 0.2143            | 0.2812         | 0.2432     | 0.4242            | 0.4828         | 0.4516     | 0.5170       | 0.5987          |
| 0.0206        | 15.0  | 1875 | 0.3729          | 0.5806    | 0.6412 | 0.6094 | 0.9280   | 0.5149        | 0.6041     | 0.5560 | 0.5652            | 0.4062         | 0.4727     | 0.8503         | 0.8833      | 0.8665  | 0.3               | 0.2812         | 0.2903     | 0.4286            | 0.4138         | 0.4211     | 0.5213       | 0.6098          |
| 0.0093        | 16.0  | 2000 | 0.3980          | 0.5748    | 0.6385 | 0.6050 | 0.9265   | 0.5229        | 0.6124     | 0.5641 | 0.4762            | 0.3125         | 0.3774     | 0.8525         | 0.8667      | 0.8595  | 0.2326            | 0.3125         | 0.2667     | 0.4074            | 0.3793         | 0.3929     | 0.4921       | 0.6073          |
| 0.0093        | 17.0  | 2125 | 0.3885          | 0.5951    | 0.6359 | 0.6148 | 0.9285   | 0.5343        | 0.5938     | 0.5625 | 0.6087            | 0.4375         | 0.5091     | 0.8587         | 0.8778      | 0.8681  | 0.25              | 0.25           | 0.25       | 0.4375            | 0.4828         | 0.4590     | 0.5297       | 0.6157          |
| 0.0093        | 18.0  | 2250 | 0.4024          | 0.6015    | 0.6491 | 0.6244 | 0.9310   | 0.5368        | 0.6021     | 0.5675 | 0.5               | 0.4375         | 0.4667     | 0.8811         | 0.9056      | 0.8932  | 0.2857            | 0.25           | 0.2667     | 0.4545            | 0.5172         | 0.4839     | 0.5356       | 0.6247          |
| 0.0093        | 19.0  | 2375 | 0.4019          | 0.6025    | 0.6478 | 0.6243 | 0.9302   | 0.5399        | 0.6        | 0.5684 | 0.5714            | 0.5            | 0.5333     | 0.8703         | 0.8944      | 0.8822  | 0.2667            | 0.25           | 0.2581     | 0.4545            | 0.5172         | 0.4839     | 0.5452       | 0.6251          |
| 0.0053        | 20.0  | 2500 | 0.4061          | 0.5847    | 0.6332 | 0.6080 | 0.9291   | 0.5268        | 0.5876     | 0.5556 | 0.5652            | 0.4062         | 0.4727     | 0.8595         | 0.8833      | 0.8712  | 0.2286            | 0.25           | 0.2388     | 0.4054            | 0.5172         | 0.4545     | 0.5186       | 0.6098          |
| 0.0053        | 21.0  | 2625 | 0.4219          | 0.5903    | 0.6425 | 0.6153 | 0.9288   | 0.5213        | 0.6062     | 0.5605 | 0.55              | 0.3438         | 0.4231     | 0.8587         | 0.8778      | 0.8681  | 0.3103            | 0.2812         | 0.2951     | 0.5357            | 0.5172         | 0.5263     | 0.5346       | 0.6153          |
| 0.0053        | 22.0  | 2750 | 0.4190          | 0.6024    | 0.6557 | 0.6279 | 0.9309   | 0.5420        | 0.6247     | 0.5805 | 0.5185            | 0.4375         | 0.4746     | 0.8548         | 0.8833      | 0.8689  | 0.32              | 0.25           | 0.2807     | 0.4643            | 0.4483         | 0.4561     | 0.5321       | 0.6271          |
| 0.0053        | 23.0  | 2875 | 0.4272          | 0.5870    | 0.6412 | 0.6129 | 0.9287   | 0.5192        | 0.5856     | 0.5504 | 0.6154            | 0.5            | 0.5517     | 0.8610         | 0.8944      | 0.8774  | 0.2564            | 0.3125         | 0.2817     | 0.5172            | 0.5172         | 0.5172     | 0.5557       | 0.6155          |
| 0.0034        | 24.0  | 3000 | 0.4206          | 0.5887    | 0.6438 | 0.6150 | 0.9308   | 0.5160        | 0.6        | 0.5548 | 0.5769            | 0.4688         | 0.5172     | 0.8602         | 0.8889      | 0.8743  | 0.2963            | 0.25           | 0.2712     | 0.5385            | 0.4828         | 0.5091     | 0.5453       | 0.6154          |
| 0.0034        | 25.0  | 3125 | 0.4260          | 0.6037    | 0.6491 | 0.6256 | 0.9309   | 0.5365        | 0.6062     | 0.5692 | 0.52              | 0.4062         | 0.4561     | 0.8859         | 0.9056      | 0.8956  | 0.2692            | 0.2188         | 0.2414     | 0.4688            | 0.5172         | 0.4918     | 0.5308       | 0.6251          |
| 0.0034        | 26.0  | 3250 | 0.4341          | 0.5995    | 0.6478 | 0.6227 | 0.9310   | 0.5307        | 0.6062     | 0.5659 | 0.5417            | 0.4062         | 0.4643     | 0.8710         | 0.9         | 0.8852  | 0.2857            | 0.25           | 0.2667     | 0.5185            | 0.4828         | 0.5        | 0.5364       | 0.6223          |
| 0.0034        | 27.0  | 3375 | 0.4476          | 0.6010    | 0.6438 | 0.6217 | 0.9300   | 0.5314        | 0.5938     | 0.5609 | 0.56              | 0.4375         | 0.4912     | 0.8710         | 0.9         | 0.8852  | 0.3               | 0.2812         | 0.2903     | 0.5172            | 0.5172         | 0.5172     | 0.5490       | 0.6219          |
| 0.0025        | 28.0  | 3500 | 0.4281          | 0.6010    | 0.6478 | 0.6235 | 0.9299   | 0.5328        | 0.6021     | 0.5653 | 0.56              | 0.4375         | 0.4912     | 0.8663         | 0.9         | 0.8828  | 0.2667            | 0.25           | 0.2581     | 0.5556            | 0.5172         | 0.5357     | 0.5466       | 0.6235          |
| 0.0025        | 29.0  | 3625 | 0.4339          | 0.5988    | 0.6438 | 0.6205 | 0.9299   | 0.5378        | 0.6021     | 0.5681 | 0.52              | 0.4062         | 0.4561     | 0.8595         | 0.8833      | 0.8712  | 0.2903            | 0.2812         | 0.2857     | 0.4839            | 0.5172         | 0.5000     | 0.5362       | 0.6208          |
| 0.0025        | 30.0  | 3750 | 0.4408          | 0.6105    | 0.6596 | 0.6341 | 0.9311   | 0.5404        | 0.6206     | 0.5777 | 0.5909            | 0.4062         | 0.4815     | 0.8663         | 0.9         | 0.8828  | 0.36              | 0.2812         | 0.3158     | 0.5357            | 0.5172         | 0.5263     | 0.5568       | 0.6331          |
| 0.0025        | 31.0  | 3875 | 0.4450          | 0.6079    | 0.6504 | 0.6284 | 0.9309   | 0.5410        | 0.6124     | 0.5745 | 0.5417            | 0.4062         | 0.4643     | 0.8656         | 0.8944      | 0.8798  | 0.2917            | 0.2188         | 0.25       | 0.5357            | 0.5172         | 0.5263     | 0.5390       | 0.6268          |
| 0.0016        | 32.0  | 4000 | 0.4435          | 0.5988    | 0.6359 | 0.6168 | 0.9305   | 0.5345        | 0.5918     | 0.5616 | 0.52              | 0.4062         | 0.4561     | 0.8641         | 0.8833      | 0.8736  | 0.2857            | 0.25           | 0.2667     | 0.4839            | 0.5172         | 0.5000     | 0.5316       | 0.6165          |
| 0.0016        | 33.0  | 4125 | 0.4448          | 0.6017    | 0.6438 | 0.6221 | 0.9308   | 0.5369        | 0.6        | 0.5667 | 0.5417            | 0.4062         | 0.4643     | 0.8696         | 0.8889      | 0.8791  | 0.3103            | 0.2812         | 0.2951     | 0.4688            | 0.5172         | 0.4918     | 0.5394       | 0.6222          |
| 0.0016        | 34.0  | 4250 | 0.4459          | 0.6030    | 0.6451 | 0.6233 | 0.9304   | 0.5436        | 0.6041     | 0.5723 | 0.5               | 0.375          | 0.4286     | 0.8649         | 0.8889      | 0.8767  | 0.2812            | 0.2812         | 0.2812     | 0.4839            | 0.5172         | 0.5000     | 0.5318       | 0.6234          |
| 0.0016        | 35.0  | 4375 | 0.4487          | 0.6037    | 0.6491 | 0.6256 | 0.9313   | 0.5441        | 0.6103     | 0.5753 | 0.5               | 0.375          | 0.4286     | 0.8649         | 0.8889      | 0.8767  | 0.2903            | 0.2812         | 0.2857     | 0.4839            | 0.5172         | 0.5000     | 0.5333       | 0.6256          |

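The Macro Avg F1 column is the unweighted mean of the five per-entity F1 scores, while Weighted Avg F1 presumably weights each entity type by its support. A quick arithmetic check against the epoch-35 values above, as a sketch rather than part of the original evaluation code:

```python
# Per-entity F1 scores from epoch 35 in the table above.
per_entity_f1 = {
    "ADR": 0.5753,
    "Disease": 0.4286,
    "Drug": 0.8767,
    "Finding": 0.2857,
    "Symptom": 0.5000,
}

# Unweighted mean reproduces the reported Macro Avg F1.
macro_f1 = sum(per_entity_f1.values()) / len(per_entity_f1)
print(round(macro_f1, 4))  # 0.5333
```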

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0