---
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: best_model-sst-2-16-100
  results: []
---

# best_model-sst-2-16-100

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset (the checkpoint name suggests the SST-2 sentiment task, though the card does not confirm this). It achieves the following results on the evaluation set:

- Loss: 0.4760
- Accuracy: 0.9062
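
As a quick usage sketch (the repo id `simonycl/best_model-sst-2-16-100` and the binary sentiment labels are assumptions inferred from the model name, not confirmed by this card):

```python
# Minimal inference sketch. The repo id and the label semantics are
# assumptions inferred from the checkpoint name, not confirmed here.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="simonycl/best_model-sst-2-16-100",  # assumed repo id
)

print(classifier("a gripping, beautifully shot film"))
# e.g. [{'label': 'LABEL_1', 'score': 0.99}] -- label names depend on the
# fine-tuning config and may not be mapped to "positive"/"negative".
```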

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of an equivalent `TrainingArguments` follows the list):

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 150
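
The training script itself is not part of this card, but the list above maps directly onto `transformers.TrainingArguments`; a hedged sketch (the output directory and evaluation cadence are assumptions):

```python
# A hedged sketch of TrainingArguments matching the hyperparameters above.
# output_dir and evaluation_strategy are assumptions; the rest mirrors the
# list. Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults
# (adam_beta1 / adam_beta2 / adam_epsilon), so they need no explicit override.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="best_model-sst-2-16-100",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=150,
    evaluation_strategy="epoch",  # the results table logs one eval per epoch
)
```

Note that the results table below advances exactly one optimizer step per epoch, which implies the training set fits in a single batch of 32 examples; this is consistent with the few-shot "16" in the checkpoint name, though the card does not state it.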

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 0.3957 | 0.875 |
| No log | 2.0 | 2 | 0.3958 | 0.875 |
| No log | 3.0 | 3 | 0.3961 | 0.875 |
| No log | 4.0 | 4 | 0.3964 | 0.875 |
| No log | 5.0 | 5 | 0.3968 | 0.875 |
| No log | 6.0 | 6 | 0.3971 | 0.875 |
| No log | 7.0 | 7 | 0.3974 | 0.875 |
| No log | 8.0 | 8 | 0.3976 | 0.875 |
| No log | 9.0 | 9 | 0.3978 | 0.875 |
| 0.2951 | 10.0 | 10 | 0.3979 | 0.875 |
| 0.2951 | 11.0 | 11 | 0.3977 | 0.875 |
| 0.2951 | 12.0 | 12 | 0.3971 | 0.875 |
| 0.2951 | 13.0 | 13 | 0.3963 | 0.875 |
| 0.2951 | 14.0 | 14 | 0.3954 | 0.875 |
| 0.2951 | 15.0 | 15 | 0.3943 | 0.875 |
| 0.2951 | 16.0 | 16 | 0.3929 | 0.875 |
| 0.2951 | 17.0 | 17 | 0.3912 | 0.875 |
| 0.2951 | 18.0 | 18 | 0.3895 | 0.875 |
| 0.2951 | 19.0 | 19 | 0.3876 | 0.875 |
| 0.2889 | 20.0 | 20 | 0.3854 | 0.875 |
| 0.2889 | 21.0 | 21 | 0.3830 | 0.875 |
| 0.2889 | 22.0 | 22 | 0.3806 | 0.875 |
| 0.2889 | 23.0 | 23 | 0.3789 | 0.875 |
| 0.2889 | 24.0 | 24 | 0.3770 | 0.875 |
| 0.2889 | 25.0 | 25 | 0.3755 | 0.9062 |
| 0.2889 | 26.0 | 26 | 0.3739 | 0.9062 |
| 0.2889 | 27.0 | 27 | 0.3728 | 0.9062 |
| 0.2889 | 28.0 | 28 | 0.3716 | 0.9062 |
| 0.2889 | 29.0 | 29 | 0.3704 | 0.9062 |
| 0.2147 | 30.0 | 30 | 0.3697 | 0.9062 |
| 0.2147 | 31.0 | 31 | 0.3692 | 0.9062 |
| 0.2147 | 32.0 | 32 | 0.3688 | 0.9062 |
| 0.2147 | 33.0 | 33 | 0.3686 | 0.9062 |
| 0.2147 | 34.0 | 34 | 0.3684 | 0.9062 |
| 0.2147 | 35.0 | 35 | 0.3683 | 0.9062 |
| 0.2147 | 36.0 | 36 | 0.3682 | 0.9062 |
| 0.2147 | 37.0 | 37 | 0.3684 | 0.9062 |
| 0.2147 | 38.0 | 38 | 0.3684 | 0.9062 |
| 0.2147 | 39.0 | 39 | 0.3685 | 0.9062 |
| 0.1272 | 40.0 | 40 | 0.3689 | 0.9062 |
| 0.1272 | 41.0 | 41 | 0.3693 | 0.9062 |
| 0.1272 | 42.0 | 42 | 0.3701 | 0.9062 |
| 0.1272 | 43.0 | 43 | 0.3709 | 0.875 |
| 0.1272 | 44.0 | 44 | 0.3719 | 0.875 |
| 0.1272 | 45.0 | 45 | 0.3728 | 0.875 |
| 0.1272 | 46.0 | 46 | 0.3731 | 0.875 |
| 0.1272 | 47.0 | 47 | 0.3728 | 0.875 |
| 0.1272 | 48.0 | 48 | 0.3729 | 0.875 |
| 0.1272 | 49.0 | 49 | 0.3726 | 0.875 |
| 0.0531 | 50.0 | 50 | 0.3726 | 0.875 |
| 0.0531 | 51.0 | 51 | 0.3721 | 0.875 |
| 0.0531 | 52.0 | 52 | 0.3716 | 0.875 |
| 0.0531 | 53.0 | 53 | 0.3715 | 0.875 |
| 0.0531 | 54.0 | 54 | 0.3707 | 0.875 |
| 0.0531 | 55.0 | 55 | 0.3706 | 0.875 |
| 0.0531 | 56.0 | 56 | 0.3702 | 0.875 |
| 0.0531 | 57.0 | 57 | 0.3707 | 0.875 |
| 0.0531 | 58.0 | 58 | 0.3716 | 0.875 |
| 0.0531 | 59.0 | 59 | 0.3735 | 0.875 |
| 0.0221 | 60.0 | 60 | 0.3754 | 0.875 |
| 0.0221 | 61.0 | 61 | 0.3775 | 0.875 |
| 0.0221 | 62.0 | 62 | 0.3801 | 0.875 |
| 0.0221 | 63.0 | 63 | 0.3824 | 0.875 |
| 0.0221 | 64.0 | 64 | 0.3847 | 0.875 |
| 0.0221 | 65.0 | 65 | 0.3871 | 0.875 |
| 0.0221 | 66.0 | 66 | 0.3883 | 0.875 |
| 0.0221 | 67.0 | 67 | 0.3885 | 0.875 |
| 0.0221 | 68.0 | 68 | 0.3886 | 0.875 |
| 0.0221 | 69.0 | 69 | 0.3876 | 0.875 |
| 0.0151 | 70.0 | 70 | 0.3869 | 0.875 |
| 0.0151 | 71.0 | 71 | 0.3869 | 0.875 |
| 0.0151 | 72.0 | 72 | 0.3871 | 0.875 |
| 0.0151 | 73.0 | 73 | 0.3875 | 0.875 |
| 0.0151 | 74.0 | 74 | 0.3872 | 0.875 |
| 0.0151 | 75.0 | 75 | 0.3873 | 0.875 |
| 0.0151 | 76.0 | 76 | 0.3869 | 0.875 |
| 0.0151 | 77.0 | 77 | 0.3868 | 0.875 |
| 0.0151 | 78.0 | 78 | 0.3876 | 0.9062 |
| 0.0151 | 79.0 | 79 | 0.3885 | 0.9062 |
| 0.0099 | 80.0 | 80 | 0.3896 | 0.9062 |
| 0.0099 | 81.0 | 81 | 0.3908 | 0.9062 |
| 0.0099 | 82.0 | 82 | 0.3921 | 0.9062 |
| 0.0099 | 83.0 | 83 | 0.3935 | 0.9062 |
| 0.0099 | 84.0 | 84 | 0.3952 | 0.9062 |
| 0.0099 | 85.0 | 85 | 0.3972 | 0.9062 |
| 0.0099 | 86.0 | 86 | 0.3992 | 0.9062 |
| 0.0099 | 87.0 | 87 | 0.4017 | 0.9062 |
| 0.0099 | 88.0 | 88 | 0.4042 | 0.9062 |
| 0.0099 | 89.0 | 89 | 0.4062 | 0.9062 |
| 0.0074 | 90.0 | 90 | 0.4082 | 0.9062 |
| 0.0074 | 91.0 | 91 | 0.4100 | 0.9062 |
| 0.0074 | 92.0 | 92 | 0.4118 | 0.9062 |
| 0.0074 | 93.0 | 93 | 0.4135 | 0.9062 |
| 0.0074 | 94.0 | 94 | 0.4152 | 0.9062 |
| 0.0074 | 95.0 | 95 | 0.4169 | 0.9062 |
| 0.0074 | 96.0 | 96 | 0.4185 | 0.9062 |
| 0.0074 | 97.0 | 97 | 0.4198 | 0.9062 |
| 0.0074 | 98.0 | 98 | 0.4211 | 0.9062 |
| 0.0074 | 99.0 | 99 | 0.4224 | 0.9062 |
| 0.006 | 100.0 | 100 | 0.4236 | 0.9062 |
| 0.006 | 101.0 | 101 | 0.4248 | 0.9062 |
| 0.006 | 102.0 | 102 | 0.4259 | 0.9062 |
| 0.006 | 103.0 | 103 | 0.4271 | 0.9062 |
| 0.006 | 104.0 | 104 | 0.4284 | 0.9062 |
| 0.006 | 105.0 | 105 | 0.4296 | 0.9062 |
| 0.006 | 106.0 | 106 | 0.4298 | 0.9062 |
| 0.006 | 107.0 | 107 | 0.4283 | 0.9062 |
| 0.006 | 108.0 | 108 | 0.4276 | 0.9062 |
| 0.006 | 109.0 | 109 | 0.4275 | 0.9062 |
| 0.0065 | 110.0 | 110 | 0.4280 | 0.9062 |
| 0.0065 | 111.0 | 111 | 0.4287 | 0.9062 |
| 0.0065 | 112.0 | 112 | 0.4297 | 0.9062 |
| 0.0065 | 113.0 | 113 | 0.4309 | 0.9062 |
| 0.0065 | 114.0 | 114 | 0.4322 | 0.9062 |
| 0.0065 | 115.0 | 115 | 0.4337 | 0.9062 |
| 0.0065 | 116.0 | 116 | 0.4352 | 0.9062 |
| 0.0065 | 117.0 | 117 | 0.4367 | 0.9062 |
| 0.0065 | 118.0 | 118 | 0.4383 | 0.9062 |
| 0.0065 | 119.0 | 119 | 0.4399 | 0.9062 |
| 0.0046 | 120.0 | 120 | 0.4413 | 0.9062 |
| 0.0046 | 121.0 | 121 | 0.4428 | 0.9062 |
| 0.0046 | 122.0 | 122 | 0.4443 | 0.9062 |
| 0.0046 | 123.0 | 123 | 0.4457 | 0.9062 |
| 0.0046 | 124.0 | 124 | 0.4470 | 0.9062 |
| 0.0046 | 125.0 | 125 | 0.4483 | 0.9062 |
| 0.0046 | 126.0 | 126 | 0.4495 | 0.9062 |
| 0.0046 | 127.0 | 127 | 0.4508 | 0.9062 |
| 0.0046 | 128.0 | 128 | 0.4520 | 0.9062 |
| 0.0046 | 129.0 | 129 | 0.4531 | 0.9062 |
| 0.0037 | 130.0 | 130 | 0.4543 | 0.9062 |
| 0.0037 | 131.0 | 131 | 0.4555 | 0.9062 |
| 0.0037 | 132.0 | 132 | 0.4566 | 0.9062 |
| 0.0037 | 133.0 | 133 | 0.4577 | 0.9062 |
| 0.0037 | 134.0 | 134 | 0.4588 | 0.9062 |
| 0.0037 | 135.0 | 135 | 0.4599 | 0.9062 |
| 0.0037 | 136.0 | 136 | 0.4610 | 0.9062 |
| 0.0037 | 137.0 | 137 | 0.4622 | 0.9062 |
| 0.0037 | 138.0 | 138 | 0.4633 | 0.9062 |
| 0.0037 | 139.0 | 139 | 0.4644 | 0.9062 |
| 0.0033 | 140.0 | 140 | 0.4655 | 0.9062 |
| 0.0033 | 141.0 | 141 | 0.4666 | 0.9062 |
| 0.0033 | 142.0 | 142 | 0.4677 | 0.9062 |
| 0.0033 | 143.0 | 143 | 0.4688 | 0.9062 |
| 0.0033 | 144.0 | 144 | 0.4700 | 0.9062 |
| 0.0033 | 145.0 | 145 | 0.4712 | 0.9062 |
| 0.0033 | 146.0 | 146 | 0.4725 | 0.9062 |
| 0.0033 | 147.0 | 147 | 0.4733 | 0.9062 |
| 0.0033 | 148.0 | 148 | 0.4742 | 0.9062 |
| 0.0033 | 149.0 | 149 | 0.4751 | 0.9062 |
| 0.0029 | 150.0 | 150 | 0.4760 | 0.9062 |

## Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.4.0
- Tokenizers 0.13.3
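
To compare a local environment against the versions pinned above, a convenience sketch (not part of the original card; note that `4.32.0.dev0` is a development build installed from source rather than a PyPI release):

```python
# Print installed versions to compare against the ones pinned above.
import datasets
import tokenizers
import torch
import transformers

for name, module in [
    ("Transformers", transformers),
    ("Pytorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```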