---
library_name: peft
base_model: peiyi9979/math-shepherd-mistral-7b-prm
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - precision
  - recall
  - f1
model-index:
  - name: v3b_mistral_lora
    results: []
---

# v3b_mistral_lora

This model is a fine-tuned version of peiyi9979/math-shepherd-mistral-7b-prm on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.2975
- Accuracy: 0.8647
- Precision: 0.8701
- Recall: 0.6087
- F1: 0.7163

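Because this repository contains a PEFT (LoRA) adapter rather than full model weights, it is loaded on top of the base PRM. Below is a minimal loading sketch, assuming the base model loads as a causal LM; the adapter path is a placeholder, not a confirmed Hub id.

```python
# Minimal sketch: attach this LoRA adapter to the base math-shepherd PRM.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "peiyi9979/math-shepherd-mistral-7b-prm"
adapter_path = "path/to/v3b_mistral_lora"  # placeholder: local dir or Hub id of this adapter

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype="auto",
    device_map="auto",  # requires accelerate; remove to load on the default device
)
model = PeftModel.from_pretrained(base_model, adapter_path)
model.eval()
```

How step scores or labels are read out of the model (for example, which token logits are thresholded to produce the accuracy/precision/recall figures above) is not documented in this card.
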
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a rough `TrainingArguments` reconstruction follows the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 765837
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 32
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1

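The effective train batch size follows from the values above: 8 per device × 4 devices × 2 accumulation steps = 64. Here is a hedged `transformers.TrainingArguments` sketch reconstructing the listed values; the output directory is illustrative and the original training script is not published here.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="v3b_mistral_lora",      # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,      # x 4 GPUs -> effective train batch size 64
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    seed=765837,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```
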
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| No log | 0 | 0 | 0.6026 | 0.7339 | 0.6 | 0.1542 | 0.2453 |
| 0.5256 | 0.0095 | 20 | 0.6015 | 0.7350 | 0.6061 | 0.1581 | 0.2508 |
| 0.6118 | 0.0189 | 40 | 0.5988 | 0.7361 | 0.6087 | 0.1660 | 0.2609 |
| 0.5575 | 0.0284 | 60 | 0.5849 | 0.7450 | 0.6456 | 0.2016 | 0.3072 |
| 0.6385 | 0.0378 | 80 | 0.5648 | 0.7461 | 0.5938 | 0.3004 | 0.3990 |
| 0.4791 | 0.0473 | 100 | 0.5396 | 0.7661 | 0.6694 | 0.3281 | 0.4403 |
| 0.3593 | 0.0567 | 120 | 0.5030 | 0.7794 | 0.7109 | 0.3597 | 0.4777 |
| 0.4435 | 0.0662 | 140 | 0.4716 | 0.7794 | 0.6467 | 0.4704 | 0.5446 |
| 0.3899 | 0.0757 | 160 | 0.4403 | 0.7938 | 0.7519 | 0.3953 | 0.5181 |
| 0.3429 | 0.0851 | 180 | 0.4055 | 0.8160 | 0.7771 | 0.4822 | 0.5951 |
| 0.3529 | 0.0946 | 200 | 0.3847 | 0.8182 | 0.7405 | 0.5415 | 0.6256 |
| 0.36 | 0.1040 | 220 | 0.3824 | 0.8182 | 0.7697 | 0.5020 | 0.6077 |
| 0.2875 | 0.1135 | 240 | 0.3578 | 0.8226 | 0.7385 | 0.5692 | 0.6429 |
| 0.3237 | 0.1229 | 260 | 0.3457 | 0.8426 | 0.7342 | 0.6877 | 0.7102 |
| 0.2309 | 0.1324 | 280 | 0.3626 | 0.8204 | 0.8527 | 0.4348 | 0.5759 |
| 0.2843 | 0.1418 | 300 | 0.3511 | 0.8326 | 0.8493 | 0.4901 | 0.6216 |
| 0.2694 | 0.1513 | 320 | 0.3487 | 0.8337 | 0.8411 | 0.5020 | 0.6287 |
| 0.3854 | 0.1608 | 340 | 0.3573 | 0.8193 | 0.8358 | 0.4427 | 0.5788 |
| 0.3062 | 0.1702 | 360 | 0.3262 | 0.8470 | 0.7778 | 0.6364 | 0.7 |
| 0.2861 | 0.1797 | 380 | 0.3308 | 0.8459 | 0.8202 | 0.5771 | 0.6775 |
| 0.2808 | 0.1891 | 400 | 0.3584 | 0.8337 | 0.8931 | 0.4625 | 0.6094 |
| 0.2716 | 0.1986 | 420 | 0.3312 | 0.8525 | 0.8614 | 0.5652 | 0.6826 |
| 0.3696 | 0.2080 | 440 | 0.3196 | 0.8548 | 0.8020 | 0.6403 | 0.7121 |
| 0.1911 | 0.2175 | 460 | 0.3436 | 0.8426 | 0.8725 | 0.5138 | 0.6468 |
| 0.2548 | 0.2270 | 480 | 0.3311 | 0.8525 | 0.8704 | 0.5573 | 0.6795 |
| 0.2501 | 0.2364 | 500 | 0.3237 | 0.8481 | 0.8671 | 0.5415 | 0.6667 |
| 0.2936 | 0.2459 | 520 | 0.3496 | 0.8359 | 0.8832 | 0.4783 | 0.6205 |
| 0.2012 | 0.2553 | 540 | 0.3362 | 0.8404 | 0.8516 | 0.5217 | 0.6471 |
| 0.3295 | 0.2648 | 560 | 0.3415 | 0.8492 | 0.8462 | 0.5652 | 0.6777 |
| 0.2859 | 0.2742 | 580 | 0.3370 | 0.8437 | 0.8733 | 0.5178 | 0.6501 |
| 0.2655 | 0.2837 | 600 | 0.3248 | 0.8492 | 0.8343 | 0.5771 | 0.6822 |
| 0.2646 | 0.2931 | 620 | 0.3290 | 0.8481 | 0.8625 | 0.5455 | 0.6683 |
| 0.2706 | 0.3026 | 640 | 0.3193 | 0.8481 | 0.8222 | 0.5850 | 0.6836 |
| 0.2074 | 0.3121 | 660 | 0.3506 | 0.8470 | 0.8912 | 0.5178 | 0.655 |
| 0.2825 | 0.3215 | 680 | 0.3523 | 0.8282 | 0.8828 | 0.4466 | 0.5932 |
| 0.2718 | 0.3310 | 700 | 0.3708 | 0.8271 | 0.9008 | 0.4308 | 0.5829 |
| 0.2172 | 0.3404 | 720 | 0.3735 | 0.8237 | 0.9123 | 0.4111 | 0.5668 |
| 0.1876 | 0.3499 | 740 | 0.3519 | 0.8392 | 0.9154 | 0.4704 | 0.6214 |
| 0.2788 | 0.3593 | 760 | 0.3574 | 0.8348 | 0.8611 | 0.4901 | 0.6247 |
| 0.305 | 0.3688 | 780 | 0.3154 | 0.8581 | 0.8492 | 0.6008 | 0.7037 |
| 0.2726 | 0.3783 | 800 | 0.3149 | 0.8459 | 0.875 | 0.5257 | 0.6568 |
| 0.2819 | 0.3877 | 820 | 0.3015 | 0.8581 | 0.7880 | 0.6759 | 0.7277 |
| 0.2596 | 0.3972 | 840 | 0.3099 | 0.8548 | 0.7629 | 0.6996 | 0.7299 |
| 0.185 | 0.4066 | 860 | 0.3079 | 0.8614 | 0.8299 | 0.6364 | 0.7204 |
| 0.189 | 0.4161 | 880 | 0.3248 | 0.8503 | 0.8882 | 0.5336 | 0.6667 |
| 0.299 | 0.4255 | 900 | 0.3174 | 0.8525 | 0.8614 | 0.5652 | 0.6826 |
| 0.199 | 0.4350 | 920 | 0.3387 | 0.8392 | 0.9030 | 0.4783 | 0.6253 |
| 0.2886 | 0.4444 | 940 | 0.3313 | 0.8381 | 0.8794 | 0.4901 | 0.6294 |
| 0.2641 | 0.4539 | 960 | 0.3095 | 0.8636 | 0.8611 | 0.6126 | 0.7159 |
| 0.2316 | 0.4634 | 980 | 0.3030 | 0.8603 | 0.8256 | 0.6364 | 0.7188 |
| 0.2116 | 0.4728 | 1000 | 0.3230 | 0.8581 | 0.8571 | 0.5929 | 0.7009 |
| 0.2134 | 0.4823 | 1020 | 0.3040 | 0.8625 | 0.8057 | 0.6719 | 0.7328 |
| 0.2139 | 0.4917 | 1040 | 0.3280 | 0.8448 | 0.9007 | 0.5020 | 0.6447 |
| 0.1949 | 0.5012 | 1060 | 0.3116 | 0.8625 | 0.8728 | 0.5968 | 0.7089 |
| 0.2255 | 0.5106 | 1080 | 0.3195 | 0.8592 | 0.8663 | 0.5889 | 0.7012 |
| 0.2452 | 0.5201 | 1100 | 0.3464 | 0.8426 | 0.8936 | 0.4980 | 0.6396 |
| 0.2038 | 0.5296 | 1120 | 0.3167 | 0.8570 | 0.8735 | 0.5731 | 0.6921 |
| 0.2496 | 0.5390 | 1140 | 0.3181 | 0.8592 | 0.8795 | 0.5771 | 0.6969 |
| 0.2864 | 0.5485 | 1160 | 0.3201 | 0.8514 | 0.8790 | 0.5455 | 0.6732 |
| 0.2342 | 0.5579 | 1180 | 0.3140 | 0.8647 | 0.8429 | 0.6364 | 0.7252 |
| 0.1366 | 0.5674 | 1200 | 0.3010 | 0.8681 | 0.8384 | 0.6561 | 0.7361 |
| 0.2301 | 0.5768 | 1220 | 0.3011 | 0.8625 | 0.8564 | 0.6126 | 0.7143 |
| 0.2873 | 0.5863 | 1240 | 0.3049 | 0.8625 | 0.8564 | 0.6126 | 0.7143 |
| 0.2467 | 0.5957 | 1260 | 0.3107 | 0.8625 | 0.8686 | 0.6008 | 0.7103 |
| 0.3175 | 0.6052 | 1280 | 0.3120 | 0.8581 | 0.8788 | 0.5731 | 0.6938 |
| 0.1988 | 0.6147 | 1300 | 0.3020 | 0.8636 | 0.8652 | 0.6087 | 0.7146 |
| 0.2081 | 0.6241 | 1320 | 0.3175 | 0.8559 | 0.8820 | 0.5613 | 0.6860 |
| 0.1784 | 0.6336 | 1340 | 0.2959 | 0.8647 | 0.8227 | 0.6601 | 0.7325 |
| 0.2712 | 0.6430 | 1360 | 0.3133 | 0.8592 | 0.85 | 0.6047 | 0.7067 |
| 0.2463 | 0.6525 | 1380 | 0.3180 | 0.8548 | 0.8427 | 0.5929 | 0.6961 |
| 0.3991 | 0.6619 | 1400 | 0.3167 | 0.8625 | 0.8817 | 0.5889 | 0.7062 |
| 0.154 | 0.6714 | 1420 | 0.3027 | 0.8636 | 0.8652 | 0.6087 | 0.7146 |
| 0.1944 | 0.6809 | 1440 | 0.3172 | 0.8625 | 0.8772 | 0.5929 | 0.7075 |
| 0.2434 | 0.6903 | 1460 | 0.3035 | 0.8692 | 0.8325 | 0.6680 | 0.7412 |
| 0.2346 | 0.6998 | 1480 | 0.3163 | 0.8625 | 0.8728 | 0.5968 | 0.7089 |
| 0.2532 | 0.7092 | 1500 | 0.2938 | 0.8659 | 0.83 | 0.6561 | 0.7329 |
| 0.1815 | 0.7187 | 1520 | 0.3156 | 0.8570 | 0.8924 | 0.5573 | 0.6861 |
| 0.1989 | 0.7281 | 1540 | 0.3187 | 0.8614 | 0.8855 | 0.5810 | 0.7017 |
| 0.1749 | 0.7376 | 1560 | 0.3169 | 0.8647 | 0.8786 | 0.6008 | 0.7136 |
| 0.2141 | 0.7470 | 1580 | 0.3046 | 0.8670 | 0.8556 | 0.6324 | 0.7273 |
| 0.2638 | 0.7565 | 1600 | 0.2976 | 0.8670 | 0.8446 | 0.6443 | 0.7309 |
| 0.2215 | 0.7660 | 1620 | 0.2927 | 0.8647 | 0.8359 | 0.6443 | 0.7277 |
| 0.2587 | 0.7754 | 1640 | 0.3179 | 0.8647 | 0.9018 | 0.5810 | 0.7067 |
| 0.2216 | 0.7849 | 1660 | 0.3046 | 0.8714 | 0.8914 | 0.6166 | 0.7290 |
| 0.2357 | 0.7943 | 1680 | 0.2967 | 0.8670 | 0.8556 | 0.6324 | 0.7273 |
| 0.1972 | 0.8038 | 1700 | 0.3002 | 0.8659 | 0.8626 | 0.6206 | 0.7218 |
| 0.2602 | 0.8132 | 1720 | 0.3001 | 0.8670 | 0.8757 | 0.6126 | 0.7209 |
| 0.1873 | 0.8227 | 1740 | 0.3046 | 0.8647 | 0.8786 | 0.6008 | 0.7136 |
| 0.1663 | 0.8322 | 1760 | 0.2978 | 0.8659 | 0.875 | 0.6087 | 0.7179 |
| 0.363 | 0.8416 | 1780 | 0.2977 | 0.8647 | 0.8743 | 0.6047 | 0.7150 |
| 0.1727 | 0.8511 | 1800 | 0.2989 | 0.8659 | 0.875 | 0.6087 | 0.7179 |
| 0.1995 | 0.8605 | 1820 | 0.3006 | 0.8625 | 0.8728 | 0.5968 | 0.7089 |
| 0.154 | 0.8700 | 1840 | 0.2966 | 0.8681 | 0.8681 | 0.6245 | 0.7264 |
| 0.1821 | 0.8794 | 1860 | 0.2968 | 0.8670 | 0.8674 | 0.6206 | 0.7235 |
| 0.2354 | 0.8889 | 1880 | 0.2952 | 0.8670 | 0.8595 | 0.6285 | 0.7260 |
| 0.3563 | 0.8983 | 1900 | 0.2933 | 0.8670 | 0.8556 | 0.6324 | 0.7273 |
| 0.2716 | 0.9078 | 1920 | 0.2968 | 0.8647 | 0.8619 | 0.6166 | 0.7189 |
| 0.1428 | 0.9173 | 1940 | 0.2970 | 0.8636 | 0.8652 | 0.6087 | 0.7146 |
| 0.2108 | 0.9267 | 1960 | 0.2979 | 0.8659 | 0.8667 | 0.6166 | 0.7206 |
| 0.1501 | 0.9362 | 1980 | 0.2986 | 0.8670 | 0.8715 | 0.6166 | 0.7222 |
| 0.2162 | 0.9456 | 2000 | 0.2984 | 0.8625 | 0.8644 | 0.6047 | 0.7116 |
| 0.3241 | 0.9551 | 2020 | 0.2993 | 0.8647 | 0.8701 | 0.6087 | 0.7163 |
| 0.2289 | 0.9645 | 2040 | 0.2976 | 0.8659 | 0.8708 | 0.6126 | 0.7193 |
| 0.1593 | 0.9740 | 2060 | 0.2984 | 0.8636 | 0.8693 | 0.6047 | 0.7133 |
| 0.2018 | 0.9835 | 2080 | 0.2984 | 0.8647 | 0.8659 | 0.6126 | 0.7176 |
| 0.2018 | 0.9929 | 2100 | 0.2975 | 0.8647 | 0.8701 | 0.6087 | 0.7163 |
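
As a quick consistency check, the final-row F1 is the harmonic mean of the reported precision and recall:

```python
# F1 = 2 * P * R / (P + R), using the last row of the table above.
precision, recall = 0.8701, 0.6087
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.7163
```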

### Framework versions

- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.3
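
Matching these versions is the safest way to reproduce the evaluation numbers. A small version-check sketch; the expected strings are simply the versions listed above:

```python
import datasets, peft, tokenizers, torch, transformers

# Versions reported in this card; warn on any mismatch in the local environment.
expected = {
    peft: "0.13.2",
    transformers: "4.46.0",
    torch: "2.5.1+cu124",
    datasets: "3.1.0",
    tokenizers: "0.20.3",
}
for module, version in expected.items():
    if module.__version__ != version:
        print(f"warning: {module.__name__} is {module.__version__}, card lists {version}")
```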