---
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: best_model-sst-2-16-87
  results: []
---

# best_model-sst-2-16-87

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset (the model name suggests the SST-2 sentiment task, though the card does not confirm this). It achieves the following results on the evaluation set:

- Loss: 0.6392
- Accuracy: 0.875
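
For quick use, here is a minimal inference sketch. The hub id (`simonycl/best_model-sst-2-16-87`, inferred from the repo owner and model name) and the text-classification task are assumptions; the card itself does not document them.

```python
from transformers import pipeline

# Sketch only: the model id and task are inferred from the card's name,
# not stated in the card itself.
classifier = pipeline(
    "text-classification",
    model="simonycl/best_model-sst-2-16-87",
)

print(classifier("a gripping, beautifully acted film"))
# e.g. [{'label': 'LABEL_1', 'score': 0.98}] -- label names depend on the
# fine-tuning config (id2label), which this card does not document.
```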

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 150
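
Below is a hedged reconstruction of these settings as `TrainingArguments`. It assumes single-device training (so `train_batch_size` maps directly to `per_device_train_batch_size`) and an invented `output_dir`; per-epoch evaluation is inferred from the results table.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="best_model-sst-2-16-87",  # assumed, not stated in the card
    learning_rate=1e-5,
    per_device_train_batch_size=32,       # assumes a single device
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,                       # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=150,
    evaluation_strategy="epoch",          # matches the per-epoch results table
)
```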

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 0.5816 | 0.8438 |
| No log | 2.0 | 2 | 0.5813 | 0.8438 |
| No log | 3.0 | 3 | 0.5807 | 0.8438 |
| No log | 4.0 | 4 | 0.5798 | 0.8438 |
| No log | 5.0 | 5 | 0.5786 | 0.8438 |
| No log | 6.0 | 6 | 0.5770 | 0.8438 |
| No log | 7.0 | 7 | 0.5750 | 0.8438 |
| No log | 8.0 | 8 | 0.5726 | 0.8438 |
| No log | 9.0 | 9 | 0.5701 | 0.8438 |
| 0.4546 | 10.0 | 10 | 0.5672 | 0.8438 |
| 0.4546 | 11.0 | 11 | 0.5641 | 0.8438 |
| 0.4546 | 12.0 | 12 | 0.5614 | 0.8438 |
| 0.4546 | 13.0 | 13 | 0.5586 | 0.8438 |
| 0.4546 | 14.0 | 14 | 0.5560 | 0.8438 |
| 0.4546 | 15.0 | 15 | 0.5530 | 0.8438 |
| 0.4546 | 16.0 | 16 | 0.5501 | 0.8438 |
| 0.4546 | 17.0 | 17 | 0.5470 | 0.8438 |
| 0.4546 | 18.0 | 18 | 0.5438 | 0.8438 |
| 0.4546 | 19.0 | 19 | 0.5407 | 0.8438 |
| 0.4413 | 20.0 | 20 | 0.5369 | 0.8438 |
| 0.4413 | 21.0 | 21 | 0.5325 | 0.8438 |
| 0.4413 | 22.0 | 22 | 0.5280 | 0.8438 |
| 0.4413 | 23.0 | 23 | 0.5230 | 0.8438 |
| 0.4413 | 24.0 | 24 | 0.5180 | 0.8438 |
| 0.4413 | 25.0 | 25 | 0.5132 | 0.8438 |
| 0.4413 | 26.0 | 26 | 0.5088 | 0.8438 |
| 0.4413 | 27.0 | 27 | 0.5049 | 0.8438 |
| 0.4413 | 28.0 | 28 | 0.5014 | 0.8438 |
| 0.4413 | 29.0 | 29 | 0.4985 | 0.8438 |
| 0.3899 | 30.0 | 30 | 0.4964 | 0.8438 |
| 0.3899 | 31.0 | 31 | 0.4951 | 0.8438 |
| 0.3899 | 32.0 | 32 | 0.4937 | 0.8438 |
| 0.3899 | 33.0 | 33 | 0.4919 | 0.8438 |
| 0.3899 | 34.0 | 34 | 0.4902 | 0.8438 |
| 0.3899 | 35.0 | 35 | 0.4884 | 0.8438 |
| 0.3899 | 36.0 | 36 | 0.4870 | 0.8438 |
| 0.3899 | 37.0 | 37 | 0.4854 | 0.8438 |
| 0.3899 | 38.0 | 38 | 0.4844 | 0.8438 |
| 0.3899 | 39.0 | 39 | 0.4832 | 0.875 |
| 0.3672 | 40.0 | 40 | 0.4821 | 0.875 |
| 0.3672 | 41.0 | 41 | 0.4817 | 0.875 |
| 0.3672 | 42.0 | 42 | 0.4817 | 0.875 |
| 0.3672 | 43.0 | 43 | 0.4820 | 0.875 |
| 0.3672 | 44.0 | 44 | 0.4830 | 0.875 |
| 0.3672 | 45.0 | 45 | 0.4838 | 0.875 |
| 0.3672 | 46.0 | 46 | 0.4848 | 0.875 |
| 0.3672 | 47.0 | 47 | 0.4855 | 0.875 |
| 0.3672 | 48.0 | 48 | 0.4854 | 0.875 |
| 0.3672 | 49.0 | 49 | 0.4860 | 0.875 |
| 0.2765 | 50.0 | 50 | 0.4872 | 0.875 |
| 0.2765 | 51.0 | 51 | 0.4878 | 0.875 |
| 0.2765 | 52.0 | 52 | 0.4892 | 0.875 |
| 0.2765 | 53.0 | 53 | 0.4913 | 0.875 |
| 0.2765 | 54.0 | 54 | 0.4942 | 0.8438 |
| 0.2765 | 55.0 | 55 | 0.4977 | 0.8438 |
| 0.2765 | 56.0 | 56 | 0.5017 | 0.8438 |
| 0.2765 | 57.0 | 57 | 0.5074 | 0.8438 |
| 0.2765 | 58.0 | 58 | 0.5148 | 0.8438 |
| 0.2765 | 59.0 | 59 | 0.5211 | 0.8438 |
| 0.2106 | 60.0 | 60 | 0.5286 | 0.8438 |
| 0.2106 | 61.0 | 61 | 0.5361 | 0.8438 |
| 0.2106 | 62.0 | 62 | 0.5429 | 0.8438 |
| 0.2106 | 63.0 | 63 | 0.5497 | 0.8438 |
| 0.2106 | 64.0 | 64 | 0.5551 | 0.8438 |
| 0.2106 | 65.0 | 65 | 0.5569 | 0.8438 |
| 0.2106 | 66.0 | 66 | 0.5556 | 0.8438 |
| 0.2106 | 67.0 | 67 | 0.5522 | 0.8438 |
| 0.2106 | 68.0 | 68 | 0.5465 | 0.8438 |
| 0.2106 | 69.0 | 69 | 0.5400 | 0.8438 |
| 0.1587 | 70.0 | 70 | 0.5359 | 0.8438 |
| 0.1587 | 71.0 | 71 | 0.5311 | 0.8438 |
| 0.1587 | 72.0 | 72 | 0.5252 | 0.8438 |
| 0.1587 | 73.0 | 73 | 0.5217 | 0.8438 |
| 0.1587 | 74.0 | 74 | 0.5192 | 0.8438 |
| 0.1587 | 75.0 | 75 | 0.5158 | 0.8438 |
| 0.1587 | 76.0 | 76 | 0.5128 | 0.8438 |
| 0.1587 | 77.0 | 77 | 0.5113 | 0.8438 |
| 0.1587 | 78.0 | 78 | 0.5105 | 0.8438 |
| 0.1587 | 79.0 | 79 | 0.5091 | 0.8438 |
| 0.122 | 80.0 | 80 | 0.5090 | 0.8438 |
| 0.122 | 81.0 | 81 | 0.5100 | 0.8438 |
| 0.122 | 82.0 | 82 | 0.5126 | 0.8438 |
| 0.122 | 83.0 | 83 | 0.5167 | 0.8438 |
| 0.122 | 84.0 | 84 | 0.5215 | 0.8438 |
| 0.122 | 85.0 | 85 | 0.5274 | 0.8438 |
| 0.122 | 86.0 | 86 | 0.5351 | 0.8438 |
| 0.122 | 87.0 | 87 | 0.5439 | 0.8438 |
| 0.122 | 88.0 | 88 | 0.5547 | 0.8438 |
| 0.122 | 89.0 | 89 | 0.5658 | 0.8438 |
| 0.0738 | 90.0 | 90 | 0.5778 | 0.8438 |
| 0.0738 | 91.0 | 91 | 0.5872 | 0.8438 |
| 0.0738 | 92.0 | 92 | 0.5963 | 0.8438 |
| 0.0738 | 93.0 | 93 | 0.6027 | 0.8438 |
| 0.0738 | 94.0 | 94 | 0.6059 | 0.8438 |
| 0.0738 | 95.0 | 95 | 0.6070 | 0.8438 |
| 0.0738 | 96.0 | 96 | 0.6052 | 0.8438 |
| 0.0738 | 97.0 | 97 | 0.6020 | 0.8438 |
| 0.0738 | 98.0 | 98 | 0.5950 | 0.8438 |
| 0.0738 | 99.0 | 99 | 0.5870 | 0.8438 |
| 0.0328 | 100.0 | 100 | 0.5788 | 0.8438 |
| 0.0328 | 101.0 | 101 | 0.5706 | 0.8438 |
| 0.0328 | 102.0 | 102 | 0.5638 | 0.8438 |
| 0.0328 | 103.0 | 103 | 0.5578 | 0.8438 |
| 0.0328 | 104.0 | 104 | 0.5530 | 0.8438 |
| 0.0328 | 105.0 | 105 | 0.5491 | 0.875 |
| 0.0328 | 106.0 | 106 | 0.5465 | 0.875 |
| 0.0328 | 107.0 | 107 | 0.5457 | 0.875 |
| 0.0328 | 108.0 | 108 | 0.5456 | 0.875 |
| 0.0328 | 109.0 | 109 | 0.5462 | 0.875 |
| 0.0221 | 110.0 | 110 | 0.5473 | 0.875 |
| 0.0221 | 111.0 | 111 | 0.5486 | 0.875 |
| 0.0221 | 112.0 | 112 | 0.5500 | 0.875 |
| 0.0221 | 113.0 | 113 | 0.5521 | 0.875 |
| 0.0221 | 114.0 | 114 | 0.5543 | 0.875 |
| 0.0221 | 115.0 | 115 | 0.5564 | 0.875 |
| 0.0221 | 116.0 | 116 | 0.5589 | 0.875 |
| 0.0221 | 117.0 | 117 | 0.5613 | 0.875 |
| 0.0221 | 118.0 | 118 | 0.5637 | 0.875 |
| 0.0221 | 119.0 | 119 | 0.5660 | 0.875 |
| 0.017 | 120.0 | 120 | 0.5682 | 0.875 |
| 0.017 | 121.0 | 121 | 0.5704 | 0.875 |
| 0.017 | 122.0 | 122 | 0.5727 | 0.875 |
| 0.017 | 123.0 | 123 | 0.5748 | 0.875 |
| 0.017 | 124.0 | 124 | 0.5772 | 0.875 |
| 0.017 | 125.0 | 125 | 0.5796 | 0.875 |
| 0.017 | 126.0 | 126 | 0.5820 | 0.875 |
| 0.017 | 127.0 | 127 | 0.5847 | 0.875 |
| 0.017 | 128.0 | 128 | 0.5874 | 0.875 |
| 0.017 | 129.0 | 129 | 0.5900 | 0.875 |
| 0.0129 | 130.0 | 130 | 0.5926 | 0.875 |
| 0.0129 | 131.0 | 131 | 0.5951 | 0.875 |
| 0.0129 | 132.0 | 132 | 0.5976 | 0.875 |
| 0.0129 | 133.0 | 133 | 0.6001 | 0.875 |
| 0.0129 | 134.0 | 134 | 0.6027 | 0.875 |
| 0.0129 | 135.0 | 135 | 0.6051 | 0.875 |
| 0.0129 | 136.0 | 136 | 0.6076 | 0.875 |
| 0.0129 | 137.0 | 137 | 0.6099 | 0.875 |
| 0.0129 | 138.0 | 138 | 0.6123 | 0.875 |
| 0.0129 | 139.0 | 139 | 0.6146 | 0.875 |
| 0.0103 | 140.0 | 140 | 0.6169 | 0.875 |
| 0.0103 | 141.0 | 141 | 0.6192 | 0.875 |
| 0.0103 | 142.0 | 142 | 0.6216 | 0.875 |
| 0.0103 | 143.0 | 143 | 0.6239 | 0.875 |
| 0.0103 | 144.0 | 144 | 0.6261 | 0.875 |
| 0.0103 | 145.0 | 145 | 0.6284 | 0.875 |
| 0.0103 | 146.0 | 146 | 0.6306 | 0.875 |
| 0.0103 | 147.0 | 147 | 0.6328 | 0.875 |
| 0.0103 | 148.0 | 148 | 0.6350 | 0.875 |
| 0.0103 | 149.0 | 149 | 0.6371 | 0.875 |
| 0.0084 | 150.0 | 150 | 0.6392 | 0.875 |

### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.4.0
- Tokenizers 0.13.3
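
To check that a local environment matches these versions (note that `4.32.0.dev0` is a development build of Transformers, so reproducing it exactly may require installing from source rather than PyPI):

```python
import datasets
import tokenizers
import torch
import transformers

# Compare against the card: 4.32.0.dev0 / 2.0.1+cu118 / 2.4.0 / 0.13.3
print("transformers:", transformers.__version__)
print("torch:", torch.__version__)
print("datasets:", datasets.__version__)
print("tokenizers:", tokenizers.__version__)
```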