segformer-b5-finetuned-segments-instryde-foot-test

This model is a fine-tuned version of nvidia/mit-b5 on the inStryde/inStrydeSegmentationFoot dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0496
  • Mean Iou: 0.4672
  • Mean Accuracy: 0.9344
  • Overall Accuracy: 0.9344
  • Per Category Iou: [0.0, 0.9343870058298716]
  • Per Category Accuracy: [nan, 0.9343870058298716]

The first entry in each per-category list corresponds to label 0, presumably a background/unlabeled class that is absent from the evaluation masks (hence the 0.0 IoU and nan accuracy). The reported Mean IoU is therefore the average of the two per-category values, (0.0 + 0.9344) / 2 ≈ 0.4672, while Mean and Overall Accuracy reflect only the foot class.
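For a quick sanity check, a minimal inference sketch is shown below. It assumes the checkpoint is published under the inStryde namespace (the exact repository id is not stated in this card) and that the model exposes the standard SegFormer semantic-segmentation interface from Transformers:

```python
# Minimal inference sketch. The repository id below is an assumption;
# point it at wherever this checkpoint is actually hosted or saved locally.
import torch
from PIL import Image
from transformers import SegformerFeatureExtractor, SegformerForSemanticSegmentation

checkpoint = "inStryde/segformer-b5-finetuned-segments-instryde-foot-test"  # assumed repo id
feature_extractor = SegformerFeatureExtractor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example_foot_image.jpg")  # any RGB photo
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the logits to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of predicted class ids
```

Under the two-label setup implied by the per-category metrics above, the non-zero entries of pred_mask would correspond to the foot class.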

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
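Although this section is still a placeholder, the dataset referenced at the top of the card can presumably be loaded with the datasets library. The sketch below assumes the repository is accessible; its splits and columns are not documented here:

```python
# Sketch only: the split and column layout of this dataset is not documented
# in the card, and the repository may be private or gated.
from datasets import load_dataset

dataset = load_dataset("inStryde/inStrydeSegmentationFoot")
print(dataset)  # inspect the available splits and features
```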

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
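As a rough guide, these values map onto transformers TrainingArguments as sketched below; the output directory and any settings not listed above are placeholders rather than values recovered from the original run:

```python
# The listed hyperparameters expressed as TrainingArguments. Only the values
# from the list above come from the card; everything else is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b5-finetuned-segments-instryde-foot-test",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the
    # TrainingArguments defaults, so no optimizer arguments are needed here.
)
```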

Training results

Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy
0.1392 0.23 20 0.2371 0.4064 0.8128 0.8128 [0.0, 0.8127920708469037] [nan, 0.8127920708469037]
0.2273 0.45 40 0.0993 0.4449 0.8898 0.8898 [0.0, 0.889800913515142] [nan, 0.889800913515142]
0.0287 0.68 60 0.0607 0.4190 0.8379 0.8379 [0.0, 0.8379005425233161] [nan, 0.8379005425233161]
0.03 0.91 80 0.0572 0.4072 0.8144 0.8144 [0.0, 0.8144304164916533] [nan, 0.8144304164916533]
0.0239 1.14 100 0.0577 0.3973 0.7946 0.7946 [0.0, 0.7946284254068925] [nan, 0.7946284254068925]
0.0196 1.36 120 0.0425 0.4227 0.8455 0.8455 [0.0, 0.8454754171184029] [nan, 0.8454754171184029]
0.0295 1.59 140 0.0368 0.4479 0.8958 0.8958 [0.0, 0.895802316554768] [nan, 0.895802316554768]
0.0297 1.82 160 0.0441 0.4561 0.9121 0.9121 [0.0, 0.9121241975954804] [nan, 0.9121241975954804]
0.0276 2.05 180 0.0332 0.4629 0.9258 0.9258 [0.0, 0.925774145806165] [nan, 0.925774145806165]
0.0148 2.27 200 0.0395 0.4310 0.8621 0.8621 [0.0, 0.8620666905637888] [nan, 0.8620666905637888]
0.012 2.5 220 0.0372 0.4381 0.8761 0.8761 [0.0, 0.8761025846276997] [nan, 0.8761025846276997]
0.0117 2.73 240 0.0339 0.4471 0.8941 0.8941 [0.0, 0.8941320836457919] [nan, 0.8941320836457919]
0.0198 2.95 260 0.0297 0.4485 0.8969 0.8969 [0.0, 0.8969491585060927] [nan, 0.8969491585060927]
0.0247 3.18 280 0.0303 0.4565 0.9130 0.9130 [0.0, 0.9130423308930413] [nan, 0.9130423308930413]
0.0115 3.41 300 0.0307 0.4533 0.9066 0.9066 [0.0, 0.9065626188900153] [nan, 0.9065626188900153]
0.0164 3.64 320 0.0330 0.4549 0.9097 0.9097 [0.0, 0.9097436483868343] [nan, 0.9097436483868343]
0.0114 3.86 340 0.0362 0.4425 0.8850 0.8850 [0.0, 0.8849727418868903] [nan, 0.8849727418868903]
0.012 4.09 360 0.0321 0.4582 0.9164 0.9164 [0.0, 0.9164498699219532] [nan, 0.9164498699219532]
0.0153 4.32 380 0.0321 0.4572 0.9144 0.9144 [0.0, 0.9144310762281544] [nan, 0.9144310762281544]
0.0115 4.55 400 0.0307 0.4573 0.9145 0.9145 [0.0, 0.9145300367033407] [nan, 0.9145300367033407]
0.0139 4.77 420 0.0330 0.4678 0.9357 0.9357 [0.0, 0.935664695520609] [nan, 0.935664695520609]
0.014 5.0 440 0.0317 0.4635 0.9271 0.9271 [0.0, 0.9270562337402442] [nan, 0.9270562337402442]
0.0197 5.23 460 0.0320 0.4678 0.9356 0.9356 [0.0, 0.9355745315321061] [nan, 0.9355745315321061]
0.0086 5.45 480 0.0337 0.4607 0.9214 0.9214 [0.0, 0.9213528116870122] [nan, 0.9213528116870122]
0.3103 5.68 500 0.0338 0.4548 0.9096 0.9096 [0.0, 0.9095853116265363] [nan, 0.9095853116265363]
0.0088 5.91 520 0.0305 0.4635 0.9270 0.9270 [0.0, 0.9270243464760175] [nan, 0.9270243464760175]
0.0119 6.14 540 0.0299 0.4680 0.9359 0.9359 [0.0, 0.9359494817769782] [nan, 0.9359494817769782]
0.0114 6.36 560 0.0314 0.4574 0.9148 0.9148 [0.0, 0.914796130425508] [nan, 0.914796130425508]
0.0122 6.59 580 0.0289 0.4613 0.9227 0.9227 [0.0, 0.9226920767845322] [nan, 0.9226920767845322]
0.0164 6.82 600 0.0312 0.4620 0.9240 0.9240 [0.0, 0.9239807620836238] [nan, 0.9239807620836238]
0.0062 7.05 620 0.0335 0.4605 0.9210 0.9210 [0.0, 0.9209954544155065] [nan, 0.9209954544155065]
0.0089 7.27 640 0.0309 0.4659 0.9317 0.9317 [0.0, 0.9317029778306545] [nan, 0.9317029778306545]
0.0251 7.5 660 0.0291 0.4734 0.9468 0.9468 [0.0, 0.9467878529315391] [nan, 0.9467878529315391]
0.0065 7.73 680 0.0326 0.4598 0.9195 0.9195 [0.0, 0.9195297398219151] [nan, 0.9195297398219151]
0.0056 7.95 700 0.0310 0.4606 0.9213 0.9213 [0.0, 0.9212714441851925] [nan, 0.9212714441851925]
0.0099 8.18 720 0.0345 0.4503 0.9006 0.9006 [0.0, 0.9006183930138303] [nan, 0.9006183930138303]
0.0103 8.41 740 0.0335 0.4539 0.9078 0.9078 [0.0, 0.9077512441530853] [nan, 0.9077512441530853]
0.0065 8.64 760 0.0334 0.4544 0.9088 0.9088 [0.0, 0.9087936278250467] [nan, 0.9087936278250467]
0.0047 8.86 780 0.0341 0.4557 0.9114 0.9114 [0.0, 0.9114215782216583] [nan, 0.9114215782216583]
0.0105 9.09 800 0.0315 0.4597 0.9195 0.9195 [0.0, 0.9194703635368034] [nan, 0.9194703635368034]
0.0087 9.32 820 0.0329 0.4583 0.9166 0.9166 [0.0, 0.9165708216138474] [nan, 0.9165708216138474]
0.0122 9.55 840 0.0357 0.4537 0.9073 0.9073 [0.0, 0.9073004242105703] [nan, 0.9073004242105703]
0.0057 9.77 860 0.0319 0.4621 0.9241 0.9241 [0.0, 0.9241050124580242] [nan, 0.9241050124580242]
0.0068 10.0 880 0.0342 0.4539 0.9078 0.9078 [0.0, 0.907799624829843] [nan, 0.907799624829843]
0.0095 10.23 900 0.0340 0.4578 0.9156 0.9156 [0.0, 0.9155933120311748] [nan, 0.9155933120311748]
0.0043 10.45 920 0.0319 0.4636 0.9272 0.9272 [0.0, 0.9271771854321385] [nan, 0.9271771854321385]
0.0049 10.68 940 0.0308 0.4659 0.9319 0.9319 [0.0, 0.9318525181042692] [nan, 0.9318525181042692]
0.005 10.91 960 0.0319 0.4640 0.9281 0.9281 [0.0, 0.9280612323438019] [nan, 0.9280612323438019]
0.0043 11.14 980 0.0313 0.4653 0.9306 0.9306 [0.0, 0.930638602941985] [nan, 0.930638602941985]
0.0084 11.36 1000 0.0321 0.4632 0.9264 0.9264 [0.0, 0.9264294840640648] [nan, 0.9264294840640648]
0.0044 11.59 1020 0.0320 0.4643 0.9285 0.9285 [0.0, 0.9285241474555063] [nan, 0.9285241474555063]
0.0044 11.82 1040 0.0321 0.4661 0.9321 0.9321 [0.0, 0.9321098153397533] [nan, 0.9321098153397533]
0.0057 12.05 1060 0.0338 0.4626 0.9253 0.9253 [0.0, 0.9252518544093489] [nan, 0.9252518544093489]
0.0064 12.27 1080 0.0348 0.4616 0.9231 0.9231 [0.0, 0.9231450958487181] [nan, 0.9231450958487181]
0.0075 12.5 1100 0.0331 0.4618 0.9237 0.9237 [0.0, 0.9236706859280404] [nan, 0.9236706859280404]
0.0103 12.73 1120 0.0317 0.4704 0.9408 0.9408 [0.0, 0.9408425274945187] [nan, 0.9408425274945187]
0.0053 12.95 1140 0.0320 0.4704 0.9407 0.9407 [0.0, 0.9407292727284723] [nan, 0.9407292727284723]
0.0073 13.18 1160 0.0331 0.4652 0.9305 0.9305 [0.0, 0.9304681710124976] [nan, 0.9304681710124976]
0.0052 13.41 1180 0.0342 0.4664 0.9328 0.9328 [0.0, 0.9328047377877275] [nan, 0.9328047377877275]
0.0089 13.64 1200 0.0322 0.4676 0.9353 0.9353 [0.0, 0.9352996413232555] [nan, 0.9352996413232555]
0.0054 13.86 1220 0.0332 0.4655 0.9311 0.9311 [0.0, 0.9310509382552609] [nan, 0.9310509382552609]
0.0057 14.09 1240 0.0333 0.4661 0.9321 0.9321 [0.0, 0.9321439017256508] [nan, 0.9321439017256508]
0.0047 14.32 1260 0.0346 0.4639 0.9278 0.9278 [0.0, 0.9277522557490538] [nan, 0.9277522557490538]
0.0092 14.55 1280 0.0380 0.4583 0.9166 0.9166 [0.0, 0.9166290983381238] [nan, 0.9166290983381238]
0.0066 14.77 1300 0.0338 0.4638 0.9277 0.9277 [0.0, 0.927687381659765] [nan, 0.927687381659765]
0.0076 15.0 1320 0.0347 0.4640 0.9280 0.9280 [0.0, 0.9279897608895007] [nan, 0.9279897608895007]
0.0054 15.23 1340 0.0345 0.4647 0.9295 0.9295 [0.0, 0.9294664710914461] [nan, 0.9294664710914461]
0.0036 15.45 1360 0.0349 0.4666 0.9332 0.9332 [0.0, 0.9331950818842955] [nan, 0.9331950818842955]
0.004 15.68 1380 0.0352 0.4617 0.9234 0.9234 [0.0, 0.9234408777134413] [nan, 0.9234408777134413]
0.0042 15.91 1400 0.0357 0.4622 0.9244 0.9244 [0.0, 0.9244282833436326] [nan, 0.9244282833436326]
0.0048 16.14 1420 0.0370 0.4586 0.9172 0.9172 [0.0, 0.9171546884174461] [nan, 0.9171546884174461]
0.0043 16.36 1440 0.0345 0.4647 0.9294 0.9294 [0.0, 0.9294411811922318] [nan, 0.9294411811922318]
0.0027 16.59 1460 0.0354 0.4667 0.9334 0.9334 [0.0, 0.9333754098613014] [nan, 0.9333754098613014]
0.0057 16.82 1480 0.0364 0.4689 0.9379 0.9379 [0.0, 0.9378913062122988] [nan, 0.9378913062122988]
0.0035 17.05 1500 0.0363 0.4662 0.9325 0.9325 [0.0, 0.9324682721720945] [nan, 0.9324682721720945]
0.0029 17.27 1520 0.0348 0.4674 0.9347 0.9347 [0.0, 0.9347212723238338] [nan, 0.9347212723238338]
0.0043 17.5 1540 0.0362 0.4648 0.9295 0.9295 [0.0, 0.9295390421065827] [nan, 0.9295390421065827]
0.0041 17.73 1560 0.0347 0.4664 0.9328 0.9328 [0.0, 0.9328487202211436] [nan, 0.9328487202211436]
0.003 17.95 1580 0.0364 0.4649 0.9297 0.9297 [0.0, 0.9297237683269303] [nan, 0.9297237683269303]
0.0121 18.18 1600 0.0364 0.4650 0.9300 0.9300 [0.0, 0.9299920611707684] [nan, 0.9299920611707684]
0.004 18.41 1620 0.0369 0.4667 0.9334 0.9334 [0.0, 0.9334259896597299] [nan, 0.9334259896597299]
0.0035 18.64 1640 0.0368 0.4636 0.9272 0.9272 [0.0, 0.9272475573256042] [nan, 0.9272475573256042]
0.0031 18.86 1660 0.0358 0.4665 0.9330 0.9330 [0.0, 0.9329784683997212] [nan, 0.9329784683997212]
0.0032 19.09 1680 0.0357 0.4661 0.9322 0.9322 [0.0, 0.9321515986514985] [nan, 0.9321515986514985]
0.0047 19.32 1700 0.0371 0.4621 0.9243 0.9243 [0.0, 0.9242886391175364] [nan, 0.9242886391175364]
0.0056 19.55 1720 0.0359 0.4663 0.9326 0.9326 [0.0, 0.9326277084932278] [nan, 0.9326277084932278]
0.0033 19.77 1740 0.0348 0.4694 0.9389 0.9389 [0.0, 0.9388523223824404] [nan, 0.9388523223824404]
0.0049 20.0 1760 0.0394 0.4612 0.9224 0.9224 [0.0, 0.9223918966764674] [nan, 0.9223918966764674]
0.0058 20.23 1780 0.0368 0.4660 0.9321 0.9321 [0.0, 0.9320724302713497] [nan, 0.9320724302713497]
0.003 20.45 1800 0.0370 0.4686 0.9372 0.9372 [0.0, 0.9371787907909581] [nan, 0.9371787907909581]
0.0058 20.68 1820 0.0363 0.4665 0.9330 0.9330 [0.0, 0.9329949618122522] [nan, 0.9329949618122522]
0.0083 20.91 1840 0.0351 0.4661 0.9322 0.9322 [0.0, 0.9321834859157253] [nan, 0.9321834859157253]
0.0036 21.14 1860 0.0353 0.4667 0.9333 0.9333 [0.0, 0.9333149340153543] [nan, 0.9333149340153543]
0.0032 21.36 1880 0.0373 0.4657 0.9314 0.9314 [0.0, 0.93137640826254] [nan, 0.93137640826254]
0.005 21.59 1900 0.0391 0.4647 0.9294 0.9294 [0.0, 0.929370809298766] [nan, 0.929370809298766]
0.0049 21.82 1920 0.0364 0.4701 0.9403 0.9403 [0.0, 0.9402795523467927] [nan, 0.9402795523467927]
0.0044 22.05 1940 0.0368 0.4672 0.9343 0.9343 [0.0, 0.9343111361322288] [nan, 0.9343111361322288]
0.0038 22.27 1960 0.0367 0.4663 0.9325 0.9325 [0.0, 0.932513354166346] [nan, 0.932513354166346]
0.0032 22.5 1980 0.0378 0.4679 0.9358 0.9358 [0.0, 0.9358483221801213] [nan, 0.9358483221801213]
0.0039 22.73 2000 0.0381 0.4653 0.9306 0.9306 [0.0, 0.9305517376359882] [nan, 0.9305517376359882]
0.0032 22.95 2020 0.0385 0.4651 0.9301 0.9301 [0.0, 0.9301262075926875] [nan, 0.9301262075926875]
0.0058 23.18 2040 0.0381 0.4654 0.9309 0.9309 [0.0, 0.9308673115957486] [nan, 0.9308673115957486]
0.0049 23.41 2060 0.0377 0.4658 0.9316 0.9316 [0.0, 0.9316194112071639] [nan, 0.9316194112071639]
0.0032 23.64 2080 0.0373 0.4692 0.9384 0.9384 [0.0, 0.9384256927783043] [nan, 0.9384256927783043]
0.0056 23.86 2100 0.0390 0.4646 0.9292 0.9292 [0.0, 0.9292465589243656] [nan, 0.9292465589243656]
0.003 24.09 2120 0.0383 0.4658 0.9317 0.9317 [0.0, 0.9316765883706047] [nan, 0.9316765883706047]
0.0037 24.32 2140 0.0376 0.4668 0.9337 0.9337 [0.0, 0.9336755899693663] [nan, 0.9336755899693663]
0.0025 24.55 2160 0.0390 0.4663 0.9326 0.9326 [0.0, 0.9326145137632029] [nan, 0.9326145137632029]
0.0039 24.77 2180 0.0381 0.4688 0.9376 0.9376 [0.0, 0.937613117320942] [nan, 0.937613117320942]
0.0031 25.0 2200 0.0395 0.4645 0.9291 0.9291 [0.0, 0.9290629322648534] [nan, 0.9290629322648534]
0.0026 25.23 2220 0.0389 0.4668 0.9336 0.9336 [0.0, 0.9335678330074968] [nan, 0.9335678330074968]
0.0028 25.45 2240 0.0375 0.4680 0.9359 0.9359 [0.0, 0.9359329883644473] [nan, 0.9359329883644473]
0.0039 25.68 2260 0.0404 0.4656 0.9312 0.9312 [0.0, 0.9312004785288756] [nan, 0.9312004785288756]
0.004 25.91 2280 0.0371 0.4716 0.9431 0.9431 [0.0, 0.9431021250112706] [nan, 0.9431021250112706]
0.0048 26.14 2300 0.0373 0.4700 0.9400 0.9400 [0.0, 0.9399639783870323] [nan, 0.9399639783870323]
0.0033 26.36 2320 0.0385 0.4688 0.9377 0.9377 [0.0, 0.9376560001935227] [nan, 0.9376560001935227]
0.0042 26.59 2340 0.0374 0.4686 0.9372 0.9372 [0.0, 0.9371743925476165] [nan, 0.9371743925476165]
0.0048 26.82 2360 0.0393 0.4660 0.9320 0.9320 [0.0, 0.9319789676003404] [nan, 0.9319789676003404]
0.0047 27.05 2380 0.0393 0.4650 0.9300 0.9300 [0.0, 0.9300162515091472] [nan, 0.9300162515091472]
0.0048 27.27 2400 0.0389 0.4670 0.9340 0.9340 [0.0, 0.9339867656857851] [nan, 0.9339867656857851]
0.004 27.5 2420 0.0388 0.4673 0.9346 0.9346 [0.0, 0.9345750307327253] [nan, 0.9345750307327253]
0.0051 27.73 2440 0.0386 0.4655 0.9309 0.9309 [0.0, 0.9309002984208107] [nan, 0.9309002984208107]
0.0045 27.95 2460 0.0395 0.4664 0.9328 0.9328 [0.0, 0.932816832956917] [nan, 0.932816832956917]
0.0042 28.18 2480 0.0393 0.4642 0.9285 0.9285 [0.0, 0.9284856628262672] [nan, 0.9284856628262672]
0.0035 28.41 2500 0.0396 0.4667 0.9333 0.9333 [0.0, 0.9333083366503419] [nan, 0.9333083366503419]
0.0036 28.64 2520 0.0395 0.4664 0.9327 0.9327 [0.0, 0.9327288680900848] [nan, 0.9327288680900848]
0.0035 28.86 2540 0.0377 0.4675 0.9349 0.9349 [0.0, 0.9349378858084081] [nan, 0.9349378858084081]
0.0029 29.09 2560 0.0402 0.4658 0.9315 0.9315 [0.0, 0.9315479397528627] [nan, 0.9315479397528627]
0.0042 29.32 2580 0.0398 0.4691 0.9383 0.9383 [0.0, 0.9382893472347145] [nan, 0.9382893472347145]
0.0029 29.55 2600 0.0405 0.4668 0.9336 0.9336 [0.0, 0.9336129150017483] [nan, 0.9336129150017483]
0.0023 29.77 2620 0.0402 0.4666 0.9332 0.9332 [0.0, 0.9332071770534849] [nan, 0.9332071770534849]
0.0036 30.0 2640 0.0417 0.4648 0.9296 0.9296 [0.0, 0.9296435003859459] [nan, 0.9296435003859459]
0.0045 30.23 2660 0.0395 0.4674 0.9348 0.9348 [0.0, 0.9347960424606412] [nan, 0.9347960424606412]
0.0025 30.45 2680 0.0400 0.4695 0.9390 0.9390 [0.0, 0.9390392477244589] [nan, 0.9390392477244589]
0.0032 30.68 2700 0.0404 0.4673 0.9347 0.9347 [0.0, 0.9346926837421135] [nan, 0.9346926837421135]
0.0047 30.91 2720 0.0416 0.4651 0.9303 0.9303 [0.0, 0.9302790465488084] [nan, 0.9302790465488084]
0.0024 31.14 2740 0.0403 0.4677 0.9355 0.9355 [0.0, 0.9354997613952987] [nan, 0.9354997613952987]
0.0037 31.36 2760 0.0406 0.4677 0.9354 0.9354 [0.0, 0.9354469824751994] [nan, 0.9354469824751994]
0.0031 31.59 2780 0.0414 0.4671 0.9343 0.9343 [0.0, 0.9342858462330146] [nan, 0.9342858462330146]
0.0036 31.82 2800 0.0404 0.4670 0.9339 0.9339 [0.0, 0.9339152942314839] [nan, 0.9339152942314839]
0.003 32.05 2820 0.0411 0.4678 0.9355 0.9355 [0.0, 0.9355151552469944] [nan, 0.9355151552469944]
0.0038 32.27 2840 0.0423 0.4672 0.9344 0.9344 [0.0, 0.9344221917766045] [nan, 0.9344221917766045]
0.0023 32.5 2860 0.0433 0.4657 0.9313 0.9313 [0.0, 0.9313401227549717] [nan, 0.9313401227549717]
0.003 32.73 2880 0.0421 0.4682 0.9363 0.9363 [0.0, 0.9363365271910399] [nan, 0.9363365271910399]
0.0031 32.95 2900 0.0428 0.4679 0.9357 0.9357 [0.0, 0.9357086779540251] [nan, 0.9357086779540251]
0.0026 33.18 2920 0.0448 0.4656 0.9311 0.9311 [0.0, 0.9311081154187018] [nan, 0.9311081154187018]
0.0031 33.41 2940 0.0456 0.4639 0.9279 0.9279 [0.0, 0.9278929995359854] [nan, 0.9278929995359854]
0.0022 33.64 2960 0.0424 0.4674 0.9349 0.9349 [0.0, 0.9348851068883088] [nan, 0.9348851068883088]
0.0025 33.86 2980 0.0434 0.4654 0.9308 0.9308 [0.0, 0.9307782471680811] [nan, 0.9307782471680811]
0.0025 34.09 3000 0.0418 0.4675 0.9351 0.9351 [0.0, 0.9350610366219732] [nan, 0.9350610366219732]
0.003 34.32 3020 0.0424 0.4674 0.9349 0.9349 [0.0, 0.9348653147932716] [nan, 0.9348653147932716]
0.0021 34.55 3040 0.0412 0.4687 0.9374 0.9374 [0.0, 0.9374437849522901] [nan, 0.9374437849522901]
0.0043 34.77 3060 0.0412 0.4676 0.9352 0.9352 [0.0, 0.9352446632814854] [nan, 0.9352446632814854]
0.005 35.0 3080 0.0428 0.4675 0.9350 0.9350 [0.0, 0.9349807686809888] [nan, 0.9349807686809888]
0.003 35.23 3100 0.0430 0.4672 0.9344 0.9344 [0.0, 0.934393603194884] [nan, 0.934393603194884]
0.0027 35.45 3120 0.0452 0.4652 0.9303 0.9303 [0.0, 0.9303428210772617] [nan, 0.9303428210772617]
0.0022 35.68 3140 0.0441 0.4653 0.9306 0.9306 [0.0, 0.9305847244610502] [nan, 0.9305847244610502]
0.0029 35.91 3160 0.0425 0.4671 0.9342 0.9342 [0.0, 0.9341692927844619] [nan, 0.9341692927844619]
0.0022 36.14 3180 0.0438 0.4679 0.9358 0.9358 [0.0, 0.9358153353550592] [nan, 0.9358153353550592]
0.0028 36.36 3200 0.0443 0.4680 0.9359 0.9359 [0.0, 0.935929689681941] [nan, 0.935929689681941]
0.0025 36.59 3220 0.0433 0.4682 0.9365 0.9365 [0.0, 0.9364948639513379] [nan, 0.9364948639513379]
0.003 36.82 3240 0.0439 0.4680 0.9359 0.9359 [0.0, 0.9359340879252827] [nan, 0.9359340879252827]
0.0027 37.05 3260 0.0462 0.4665 0.9331 0.9331 [0.0, 0.9330587363407056] [nan, 0.9330587363407056]
0.004 37.27 3280 0.0447 0.4675 0.9350 0.9350 [0.0, 0.9349917642893428] [nan, 0.9349917642893428]
0.0032 37.5 3300 0.0442 0.4683 0.9367 0.9367 [0.0, 0.9366916853408749] [nan, 0.9366916853408749]
0.0019 37.73 3320 0.0454 0.4674 0.9347 0.9347 [0.0, 0.9347102767154798] [nan, 0.9347102767154798]
0.0028 37.95 3340 0.0451 0.4674 0.9349 0.9349 [0.0, 0.9348543191849176] [nan, 0.9348543191849176]
0.0023 38.18 3360 0.0457 0.4669 0.9337 0.9337 [0.0, 0.9337228710852885] [nan, 0.9337228710852885]
0.0028 38.41 3380 0.0454 0.4675 0.9351 0.9351 [0.0, 0.9350764304736688] [nan, 0.9350764304736688]
0.0024 38.64 3400 0.0467 0.4677 0.9354 0.9354 [0.0, 0.9353568184866964] [nan, 0.9353568184866964]
0.0023 38.86 3420 0.0463 0.4669 0.9337 0.9337 [0.0, 0.9337096763552637] [nan, 0.9337096763552637]
0.0029 39.09 3440 0.0456 0.4664 0.9328 0.9328 [0.0, 0.9328289281261064] [nan, 0.9328289281261064]
0.0026 39.32 3460 0.0453 0.4686 0.9372 0.9372 [0.0, 0.9371578991350854] [nan, 0.9371578991350854]
0.0037 39.55 3480 0.0458 0.4678 0.9356 0.9356 [0.0, 0.9356097174788389] [nan, 0.9356097174788389]
0.0025 39.77 3500 0.0468 0.4671 0.9342 0.9342 [0.0, 0.9342275695087382] [nan, 0.9342275695087382]
0.0048 40.0 3520 0.0459 0.4668 0.9335 0.9335 [0.0, 0.933527149256587] [nan, 0.933527149256587]
0.0027 40.23 3540 0.0468 0.4658 0.9315 0.9315 [0.0, 0.9315490393136981] [nan, 0.9315490393136981]
0.0019 40.45 3560 0.0465 0.4662 0.9324 0.9324 [0.0, 0.9323792077444268] [nan, 0.9323792077444268]
0.0033 40.68 3580 0.0459 0.4674 0.9348 0.9348 [0.0, 0.9348015402648182] [nan, 0.9348015402648182]
0.004 40.91 3600 0.0467 0.4667 0.9333 0.9333 [0.0, 0.9333358256712269] [nan, 0.9333358256712269]
0.0022 41.14 3620 0.0469 0.4665 0.9331 0.9331 [0.0, 0.9330521389756931] [nan, 0.9330521389756931]
0.0036 41.36 3640 0.0458 0.4676 0.9352 0.9352 [0.0, 0.9352479619639916] [nan, 0.9352479619639916]
0.0024 41.59 3660 0.0468 0.4671 0.9342 0.9342 [0.0, 0.9341769897103097] [nan, 0.9341769897103097]
0.0021 41.82 3680 0.0466 0.4658 0.9317 0.9317 [0.0, 0.9316776879314402] [nan, 0.9316776879314402]
0.0032 42.05 3700 0.0472 0.4666 0.9332 0.9332 [0.0, 0.9331807875934351] [nan, 0.9331807875934351]
0.0023 42.27 3720 0.0470 0.4673 0.9347 0.9347 [0.0, 0.9346827876945948] [nan, 0.9346827876945948]
0.003 42.5 3740 0.0474 0.4661 0.9321 0.9321 [0.0, 0.9321482999689924] [nan, 0.9321482999689924]
0.0025 42.73 3760 0.0483 0.4656 0.9313 0.9313 [0.0, 0.9312851447132016] [nan, 0.9312851447132016]
0.0019 42.95 3780 0.0471 0.4669 0.9338 0.9338 [0.0, 0.9338130350737915] [nan, 0.9338130350737915]
0.0032 43.18 3800 0.0463 0.4682 0.9365 0.9365 [0.0, 0.9364508815179218] [nan, 0.9364508815179218]
0.0026 43.41 3820 0.0484 0.4657 0.9315 0.9315 [0.0, 0.9314698709335492] [nan, 0.9314698709335492]
0.0019 43.64 3840 0.0477 0.4673 0.9345 0.9345 [0.0, 0.9345486412726757] [nan, 0.9345486412726757]
0.003 43.86 3860 0.0472 0.4688 0.9375 0.9375 [0.0, 0.9375218537716036] [nan, 0.9375218537716036]
0.0025 44.09 3880 0.0473 0.4670 0.9340 0.9340 [0.0, 0.9339999604158099] [nan, 0.9339999604158099]
0.0019 44.32 3900 0.0481 0.4670 0.9340 0.9340 [0.0, 0.9340263498758595] [nan, 0.9340263498758595]
0.0024 44.55 3920 0.0478 0.4671 0.9343 0.9343 [0.0, 0.9342561580904587] [nan, 0.9342561580904587]
0.0021 44.77 3940 0.0479 0.4677 0.9355 0.9355 [0.0, 0.9354579780835535] [nan, 0.9354579780835535]
0.0019 45.0 3960 0.0479 0.4682 0.9363 0.9363 [0.0, 0.9363112372918256] [nan, 0.9363112372918256]
0.0024 45.23 3980 0.0481 0.4681 0.9362 0.9362 [0.0, 0.9362133763774748] [nan, 0.9362133763774748]
0.0023 45.45 4000 0.0497 0.4670 0.9340 0.9340 [0.0, 0.933970272273254] [nan, 0.933970272273254]
0.0027 45.68 4020 0.0487 0.4671 0.9343 0.9343 [0.0, 0.9342781493071667] [nan, 0.9342781493071667]
0.0023 45.91 4040 0.0477 0.4672 0.9344 0.9344 [0.0, 0.9344309882632876] [nan, 0.9344309882632876]
0.003 46.14 4060 0.0485 0.4678 0.9356 0.9356 [0.0, 0.9355877262621309] [nan, 0.9355877262621309]
0.0017 46.36 4080 0.0488 0.4677 0.9354 0.9354 [0.0, 0.9353678140950504] [nan, 0.9353678140950504]
0.0022 46.59 4100 0.0481 0.4668 0.9337 0.9337 [0.0, 0.9336634948001769] [nan, 0.9336634948001769]
0.0032 46.82 4120 0.0487 0.4676 0.9352 0.9352 [0.0, 0.935249061524827] [nan, 0.935249061524827]
0.0021 47.05 4140 0.0483 0.4675 0.9351 0.9351 [0.0, 0.9350885256428583] [nan, 0.9350885256428583]
0.002 47.27 4160 0.0486 0.4673 0.9347 0.9347 [0.0, 0.9346530995520389] [nan, 0.9346530995520389]
0.0028 47.5 4180 0.0487 0.4675 0.9349 0.9349 [0.0, 0.9349224919567125] [nan, 0.9349224919567125]
0.0026 47.73 4200 0.0482 0.4667 0.9335 0.9335 [0.0, 0.9334589764847919] [nan, 0.9334589764847919]
0.0022 47.95 4220 0.0490 0.4670 0.9341 0.9341 [0.0, 0.9340769296742881] [nan, 0.9340769296742881]
0.0027 48.18 4240 0.0489 0.4679 0.9358 0.9358 [0.0, 0.9358153353550592] [nan, 0.9358153353550592]
0.0021 48.41 4260 0.0491 0.4676 0.9353 0.9353 [0.0, 0.9352864465932307] [nan, 0.9352864465932307]
0.0024 48.64 4280 0.0491 0.4672 0.9344 0.9344 [0.0, 0.9343804084648591] [nan, 0.9343804084648591]
0.0025 48.86 4300 0.0493 0.4675 0.9349 0.9349 [0.0, 0.9349466822950914] [nan, 0.9349466822950914]
0.0022 49.09 4320 0.0484 0.4677 0.9354 0.9354 [0.0, 0.9353623162908734] [nan, 0.9353623162908734]
0.0027 49.32 4340 0.0480 0.4677 0.9354 0.9354 [0.0, 0.9354117965284665] [nan, 0.9354117965284665]
0.0018 49.55 4360 0.0498 0.4675 0.9350 0.9350 [0.0, 0.9349983616543552] [nan, 0.9349983616543552]
0.0021 49.77 4380 0.0493 0.4672 0.9345 0.9345 [0.0, 0.9344738711358683] [nan, 0.9344738711358683]
0.0017 50.0 4400 0.0496 0.4672 0.9344 0.9344 [0.0, 0.9343870058298716] [nan, 0.9343870058298716]
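The metric columns above match the output of the evaluate library's mean_iou metric as it is typically wired into a Trainer compute_metrics callback. The sketch below shows one plausible wiring; num_labels=2 and ignore_index=255 are assumptions inferred from the two-entry per-category lists, not values stated in this card:

```python
# Plausible compute_metrics callback for the columns above.
# num_labels=2 and ignore_index=255 are assumptions, not stated in the card.
import torch
import evaluate

metric = evaluate.load("mean_iou")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    logits = torch.from_numpy(logits)
    # Upsample logits to the label resolution before taking the argmax.
    upsampled = torch.nn.functional.interpolate(
        logits, size=labels.shape[-2:], mode="bilinear", align_corners=False
    )
    preds = upsampled.argmax(dim=1).numpy()
    results = metric.compute(
        predictions=preds,
        references=labels,
        num_labels=2,        # assumed: background + foot
        ignore_index=255,    # assumed ignore label
        reduce_labels=False,
    )
    # The per-category entries come back as numpy arrays; convert for logging.
    results["per_category_iou"] = results["per_category_iou"].tolist()
    results["per_category_accuracy"] = results["per_category_accuracy"].tolist()
    return results
```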

Framework versions

  • Transformers 4.21.1
  • PyTorch 1.12.0+cu113
  • Datasets 2.4.0
  • Tokenizers 0.12.1