---
license: mit
base_model: microsoft/Multilingual-MiniLM-L12-H384
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: ner-coin
    results: []
---

# ner-coin

This model is a fine-tuned version of [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384) on an unspecified dataset.
It achieves the following results on the evaluation set:

- Loss: 0.0015
- Precision: 1.0
- Recall: 1.0
- F1: 1.0
- Accuracy: 1.0
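
For quick inference the model can be loaded through the standard `transformers` token-classification pipeline. A minimal sketch, assuming the model is published under the Hub id `thanhdath/ner-coin` (hypothetical; substitute the actual repository id or a local checkpoint path):

```python
from transformers import pipeline

# "thanhdath/ner-coin" is an assumed Hub id; replace with the real id or a local path.
ner = pipeline(
    "token-classification",
    model="thanhdath/ner-coin",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

print(ner("Bitcoin and Ethereum rallied after the listing announcement."))
```

Each prediction is a dict with `entity_group`, `score`, `word`, `start`, and `end` keys.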

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
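
These settings map one-to-one onto fields of the `transformers` `TrainingArguments`. A minimal sketch of the equivalent configuration (`output_dir` is a placeholder, not taken from the original run):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ner-coin",  # placeholder; the original output path is unknown
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```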

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 27   | 0.1989          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 2.0   | 54   | 0.1611          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 3.0   | 81   | 0.1334          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 4.0   | 108  | 0.1116          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 5.0   | 135  | 0.0943          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 6.0   | 162  | 0.0804          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 7.0   | 189  | 0.0692          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 8.0   | 216  | 0.0602          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 9.0   | 243  | 0.0528          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 10.0  | 270  | 0.0468          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 11.0  | 297  | 0.0418          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 12.0  | 324  | 0.0376          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 13.0  | 351  | 0.0341          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 14.0  | 378  | 0.0312          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 15.0  | 405  | 0.0287          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 16.0  | 432  | 0.0266          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 17.0  | 459  | 0.0247          | 0.0       | 0.0    | 0.0    | 0.9989   |
| No log        | 18.0  | 486  | 0.0236          | 0.0       | 0.0    | 0.0    | 0.9989   |
| 0.0904        | 19.0  | 513  | 0.0218          | 0.0       | 0.0    | 0.0    | 0.9989   |
| 0.0904        | 20.0  | 540  | 0.0203          | 0.0       | 0.0    | 0.0    | 0.9989   |
| 0.0904        | 21.0  | 567  | 0.0156          | 0.8571    | 0.8571 | 0.8571 | 0.9997   |
| 0.0904        | 22.0  | 594  | 0.0142          | 1.0       | 0.8571 | 0.9231 | 0.9998   |
| 0.0904        | 23.0  | 621  | 0.0133          | 1.0       | 0.8571 | 0.9231 | 0.9998   |
| 0.0904        | 24.0  | 648  | 0.0122          | 1.0       | 0.8571 | 0.9231 | 0.9998   |
| 0.0904        | 25.0  | 675  | 0.0107          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0904        | 26.0  | 702  | 0.0099          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0904        | 27.0  | 729  | 0.0092          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0904        | 28.0  | 756  | 0.0086          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0904        | 29.0  | 783  | 0.0081          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0904        | 30.0  | 810  | 0.0076          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0904        | 31.0  | 837  | 0.0074          | 1.0       | 1.0    | 1.0    | 0.9998   |
| 0.0904        | 32.0  | 864  | 0.0073          | 1.0       | 0.8571 | 0.9231 | 0.9998   |
| 0.0904        | 33.0  | 891  | 0.0064          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0904        | 34.0  | 918  | 0.0061          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0904        | 35.0  | 945  | 0.0058          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0904        | 36.0  | 972  | 0.0055          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0904        | 37.0  | 999  | 0.0053          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 38.0  | 1026 | 0.0050          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 39.0  | 1053 | 0.0048          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 40.0  | 1080 | 0.0046          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 41.0  | 1107 | 0.0044          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 42.0  | 1134 | 0.0042          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 43.0  | 1161 | 0.0041          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 44.0  | 1188 | 0.0039          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 45.0  | 1215 | 0.0038          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 46.0  | 1242 | 0.0036          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 47.0  | 1269 | 0.0035          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 48.0  | 1296 | 0.0034          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 49.0  | 1323 | 0.0033          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 50.0  | 1350 | 0.0032          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 51.0  | 1377 | 0.0031          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 52.0  | 1404 | 0.0030          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 53.0  | 1431 | 0.0029          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 54.0  | 1458 | 0.0029          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0122        | 55.0  | 1485 | 0.0028          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 56.0  | 1512 | 0.0027          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 57.0  | 1539 | 0.0026          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 58.0  | 1566 | 0.0026          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 59.0  | 1593 | 0.0025          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 60.0  | 1620 | 0.0024          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 61.0  | 1647 | 0.0024          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 62.0  | 1674 | 0.0023          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 63.0  | 1701 | 0.0023          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 64.0  | 1728 | 0.0022          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 65.0  | 1755 | 0.0022          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 66.0  | 1782 | 0.0021          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 67.0  | 1809 | 0.0021          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 68.0  | 1836 | 0.0021          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 69.0  | 1863 | 0.0020          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 70.0  | 1890 | 0.0020          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 71.0  | 1917 | 0.0020          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 72.0  | 1944 | 0.0019          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 73.0  | 1971 | 0.0019          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0042        | 74.0  | 1998 | 0.0019          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 75.0  | 2025 | 0.0018          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 76.0  | 2052 | 0.0018          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 77.0  | 2079 | 0.0018          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 78.0  | 2106 | 0.0018          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 79.0  | 2133 | 0.0017          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 80.0  | 2160 | 0.0017          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 81.0  | 2187 | 0.0017          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 82.0  | 2214 | 0.0017          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 83.0  | 2241 | 0.0017          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 84.0  | 2268 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 85.0  | 2295 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 86.0  | 2322 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 87.0  | 2349 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 88.0  | 2376 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 89.0  | 2403 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 90.0  | 2430 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 91.0  | 2457 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0025        | 92.0  | 2484 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0019        | 93.0  | 2511 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0019        | 94.0  | 2538 | 0.0016          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0019        | 95.0  | 2565 | 0.0015          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0019        | 96.0  | 2592 | 0.0015          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0019        | 97.0  | 2619 | 0.0015          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0019        | 98.0  | 2646 | 0.0015          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0019        | 99.0  | 2673 | 0.0015          | 1.0       | 1.0    | 1.0    | 1.0      |
| 0.0019        | 100.0 | 2700 | 0.0015          | 1.0       | 1.0    | 1.0    | 1.0      |
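
Precision, recall, and F1 here are entity-level metrics of the kind `seqeval` reports, while accuracy is token-level; that is why accuracy stays near 1.0 in the early epochs even though no entity is predicted correctly. A minimal sketch of how such metrics are computed from tag sequences (the `COIN` label is illustrative, not taken from the original label set):

```python
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

# Illustrative gold and predicted IOB2 tag sequences; "COIN" is an assumed label.
y_true = [["O", "B-COIN", "I-COIN", "O"], ["B-COIN", "O"]]
y_pred = [["O", "B-COIN", "I-COIN", "O"], ["O", "O"]]

print("precision:", precision_score(y_true, y_pred))  # entity-level: 1 of 1 predicted
print("recall:   ", recall_score(y_true, y_pred))     # entity-level: 1 of 2 gold
print("f1:       ", f1_score(y_true, y_pred))
print("accuracy: ", accuracy_score(y_true, y_pred))   # token-level: 5 of 6 tokens
```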

### Framework versions

- Transformers 4.40.2
- Pytorch 2.1.0+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1
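
To sanity-check a local environment against these versions, a quick sketch:

```python
import datasets
import tokenizers
import torch
import transformers

# Compare installed versions with those reported above.
expected = {transformers: "4.40.2", torch: "2.1.0+cu121",
            datasets: "2.14.5", tokenizers: "0.19.1"}
for module, version in expected.items():
    status = "OK" if module.__version__ == version else "MISMATCH"
    print(f"{module.__name__}: installed {module.__version__}, expected {version} [{status}]")
```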