# bert-large-uncased_winobias_finetuned

This model is a fine-tuned version of [bert-large-uncased](https://huggingface.co/bert-large-uncased). The dataset field was not recorded in the card metadata, but the model name suggests fine-tuning on the WinoBias dataset. It achieves the following results on the evaluation set:

- Loss: 0.4783
- Accuracy: 0.7986

## Model description

More information needed

## Intended uses & limitations

More information needed
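
The intended task is not documented. Given the accuracy metric and the model name, the checkpoint most plausibly carries a sequence-classification head fine-tuned on WinoBias-style coreference examples. Below is a minimal inference sketch under that assumption; the repo id, example sentence, and label meanings are placeholders, not documented facts:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder repo id: substitute the actual Hub path of this checkpoint.
model_id = "bert-large-uncased_winobias_finetuned"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# WinoBias-style example; the label mapping (e.g. which class index means
# "pro-stereotypical") is not documented on this card.
text = "The developer argued with the designer because she did not like the design."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())
```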

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
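
The training script itself was not published. As a reference, here is a minimal sketch of how the hyperparameters above map onto `transformers.TrainingArguments`; `output_dir` is a placeholder, and the evaluation interval is inferred from the results table below:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-large-uncased_winobias_finetuned",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=64,
    seed=42,
    # Adam settings below are the transformers defaults, matching the card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    # Inferred from the results table: evaluation ran every 5 steps.
    evaluation_strategy="steps",
    eval_steps=5,
)
```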

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.38  | 5    | 0.7011          | 0.4994   |
| No log        | 0.77  | 10   | 0.6942          | 0.4987   |
| No log        | 1.15  | 15   | 0.6941          | 0.5063   |
| No log        | 1.54  | 20   | 0.6936          | 0.4924   |
| No log        | 1.92  | 25   | 0.6928          | 0.5114   |
| No log        | 2.31  | 30   | 0.6925          | 0.5196   |
| No log        | 2.69  | 35   | 0.6925          | 0.5215   |
| No log        | 3.08  | 40   | 0.6923          | 0.5227   |
| No log        | 3.46  | 45   | 0.6922          | 0.5259   |
| No log        | 3.85  | 50   | 0.6922          | 0.5202   |
| No log        | 4.23  | 55   | 0.6918          | 0.5316   |
| No log        | 4.62  | 60   | 0.6912          | 0.5499   |
| No log        | 5.0   | 65   | 0.6904          | 0.5574   |
| No log        | 5.38  | 70   | 0.6899          | 0.5492   |
| No log        | 5.77  | 75   | 0.6894          | 0.5417   |
| No log        | 6.15  | 80   | 0.6890          | 0.5290   |
| No log        | 6.54  | 85   | 0.6883          | 0.5366   |
| No log        | 6.92  | 90   | 0.6863          | 0.5726   |
| No log        | 7.31  | 95   | 0.6837          | 0.5909   |
| No log        | 7.69  | 100  | 0.6812          | 0.5890   |
| No log        | 8.08  | 105  | 0.6788          | 0.5915   |
| No log        | 8.46  | 110  | 0.6738          | 0.6225   |
| No log        | 8.85  | 115  | 0.6685          | 0.6503   |
| No log        | 9.23  | 120  | 0.6616          | 0.6698   |
| No log        | 9.62  | 125  | 0.6533          | 0.6799   |
| No log        | 10.0  | 130  | 0.6403          | 0.7027   |
| No log        | 10.38 | 135  | 0.6282          | 0.7077   |
| No log        | 10.77 | 140  | 0.6142          | 0.7235   |
| No log        | 11.15 | 145  | 0.5967          | 0.7355   |
| No log        | 11.54 | 150  | 0.5814          | 0.7437   |
| No log        | 11.92 | 155  | 0.5662          | 0.7513   |
| No log        | 12.31 | 160  | 0.5454          | 0.7607   |
| No log        | 12.69 | 165  | 0.5251          | 0.7771   |
| No log        | 13.08 | 170  | 0.5091          | 0.7872   |
| No log        | 13.46 | 175  | 0.4975          | 0.7942   |
| No log        | 13.85 | 180  | 0.4892          | 0.7967   |
| No log        | 14.23 | 185  | 0.4832          | 0.7992   |
| No log        | 14.62 | 190  | 0.4797          | 0.8005   |
| No log        | 15.0  | 195  | 0.4783          | 0.7986   |
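
The "No log" entries most likely mean the run finished (195 steps) before the Trainer's default logging interval was reached, so no training loss was recorded. The metric code was likewise not published; a standard accuracy computation with the `evaluate` library, as commonly paired with `Trainer`, would look like this sketch:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # Trainer passes (logits, labels); accuracy is computed over argmax predictions.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```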

### Framework versions

- Transformers 4.26.1
- PyTorch 1.13.1
- Datasets 2.10.1
- Tokenizers 0.13.2