
segformer-b2-finetuned

This model is a fine-tuned version of nvidia/mit-b2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5204
  • Mean Iou: 0.1825
  • Mean Accuracy: 0.2798
  • Overall Accuracy: 0.6420
  • Per Category Iou: [0.9142691288036651, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.3142004341534009, 0.0, 0.003682913518478972, nan, nan, 0.0, 0.09172351933896877, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.09730988810230647, nan, nan, nan, 0.0, 0.0, 0.9051505354829877, 0.0, nan, nan, nan, nan, nan, 0.3923942271862065, nan, nan, nan, 0.0, 0.0, nan, 0.35265901085368995, nan, 0.19254235491492325, 0.049966760972743995, 0.11681963220492537, 0.10314154531218374, nan, nan, 0.3096201109834452, nan, 0.00769115639193742, nan, 0.0, nan, 0.28766783984147737, 0.8922548763781069, nan, nan, 0.0, nan, nan, 0.0, 0.45913823493730865, 0.00039839497899743395, nan, nan, 0.3294171958059549, 0.0, 0.0, 0.2671846076829642, nan, nan, 0.0, nan, 0.0, nan, 0.3159265837773831, nan, 0.331420855139111, nan, 0.6764366245603519, 0.45869407404577234, nan, 0.6076370770804199, 0.0, 0.28964539522710336, nan, nan, nan, 0.01132884262094305, 0.07350650765125524, nan, 0.45421384725808006, nan, nan, nan, nan, 0.0, 0.0, 0.0]
  • Per Category Accuracy: [0.9647457074649063, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.5664851113792766, 0.0, 0.005903479602703702, nan, nan, 0.0, 0.24703884276950558, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.10665478609973669, nan, nan, nan, 0.0, 0.0, 0.9731339228095599, 0.0, nan, nan, nan, nan, nan, 0.47161212620060644, nan, nan, nan, 0.0, 0.0, nan, 0.40322527832351013, nan, 0.2442109998758946, 0.07079225115228281, 0.4617999839239304, 0.10811595497587993, nan, nan, 0.4783842629632726, nan, 0.00769115639193742, nan, 0.0, nan, 0.7675853747883402, 0.9796493929735344, nan, nan, 0.0, nan, nan, nan, 0.8162476926297361, 0.0008705784659068079, nan, nan, 0.4390048922813946, 0.0, nan, 0.5876177795985252, nan, nan, 0.0, nan, 0.0, nan, 0.5925394425509742, nan, 0.947028699455884, nan, 0.930032723306266, 0.7094970746409787, nan, 0.9363039912520503, 0.0, 0.32225188061150206, nan, nan, nan, 0.015030467163168585, 0.08358722845606995, nan, 0.47285248278361725, nan, nan, nan, nan, 0.0, 0.0, 0.0]
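
The metric names above match those reported by the Hugging Face `evaluate` library's `mean_iou` metric; the many `nan` entries in the per-category lists typically correspond to classes that do not occur in the evaluation labels. Below is a minimal sketch of how such numbers can be computed; the label count and ignore index are placeholders, since the dataset used for this model is unknown.

```python
import numpy as np
import evaluate

# Semantic-segmentation metric reporting mean_iou, mean_accuracy, overall_accuracy,
# per_category_iou and per_category_accuracy.
metric = evaluate.load("mean_iou")

# Hypothetical predicted and ground-truth label maps (H x W integer class ids).
predictions = [np.random.randint(0, 10, size=(512, 512))]
references = [np.random.randint(0, 10, size=(512, 512))]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=10,      # placeholder: the real label count depends on the (unknown) dataset
    ignore_index=255,   # placeholder: a commonly used "void" label id
)
print(results["mean_iou"], results["overall_accuracy"])
```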

Model description

More information needed

Intended uses & limitations

More information needed
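
A minimal inference sketch for semantic segmentation with this checkpoint is shown below. It assumes the standard `transformers` SegFormer classes; the input image and post-processing choices are illustrative only.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "MF21377197/segformer-b2-finetuned"  # this repository
# If this repo lacks a preprocessor config, the processor can instead be loaded
# from the base model, nvidia/mit-b2.
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the most likely class per pixel.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```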

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
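
A hedged sketch of a `Trainer` configuration reproducing the hyperparameters listed above follows; the output directory, datasets, and metric wiring are placeholders, since the actual training script is not provided.

```python
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="segformer-b2-finetuned",  # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",           # default AdamW betas/epsilon match the values above
    evaluation_strategy="epoch",          # assumption: the results table reports one eval per epoch
    logging_strategy="epoch",
)

# trainer = Trainer(
#     model=model,                        # e.g. SegformerForSemanticSegmentation
#     args=training_args,
#     train_dataset=train_ds,             # placeholder datasets
#     eval_dataset=eval_ds,
#     compute_metrics=compute_metrics,    # e.g. wrapping evaluate.load("mean_iou")
# )
# trainer.train()
```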

Training results

Each row below lists: Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy
2.1424 1.0 80 2.0996 0.0546 0.1090 0.5538 [0.8635263872732099, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.04096287598229609, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0006413344762643675, 0.11187094662246744, 0.0, 0.0, nan, 0.009208358967234912, nan, 0.0, nan, 0.0, nan, 0.1952057456966726, 0.5391626011901571, nan, nan, 0.0, nan, nan, nan, 0.30213360552577073, 0.0013042568629283906, nan, nan, 0.10099189548808515, 0.0, 0.0, 0.20232205115780788, nan, nan, 0.0, nan, 0.0, nan, 0.00014188659088253458, nan, 0.02997904308599439, nan, 0.3178189486368429, 0.0, nan, 0.06737796193189176, 0.0, 0.00022243791959880287, nan, nan, nan, 0.0, 0.0010399643440796315, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0] [0.9668328903987039, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0410746430876176, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0006414609037401865, 0.6592706823100717, 0.0, nan, nan, 0.009234602693160423, nan, 0.0, nan, 0.0, nan, 0.8747716892052864, 0.5402535544175053, nan, nan, 0.0, nan, nan, nan, 0.3867882040254493, 0.001308099951336896, nan, nan, 0.15378636055524447, 0.0, nan, 0.45614843643315583, nan, nan, 0.0, nan, 0.0, nan, 0.0001452135065645047, nan, 0.03001529215121448, nan, 0.38358986789480065, 0.0, nan, 0.8367799826327469, 0.0, 0.00022243791959880287, nan, nan, nan, 0.0, 0.0010403507468232146, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0]
2.2828 2.0 160 1.7615 0.0945 0.1779 0.5867 [0.8958283628425443, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.2650072796505768, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.34558464610605727, 0.0, nan, nan, nan, nan, nan, 0.001342492924699451, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.03134658554836191, 0.07434650697545013, 0.14816743444538535, 0.0, nan, nan, 0.1642281017568831, nan, 0.0, nan, 0.0, nan, 0.21751635761310004, 0.8482858476607933, nan, nan, 0.0, nan, nan, 0.0, 0.3355451231346219, 0.0007206612479542298, nan, nan, 0.17902599374530498, 0.0, 0.0, 0.1821567724347404, nan, nan, 0.0, nan, 0.0, nan, 0.08738143266136364, nan, 0.3523381734895822, nan, 0.4578501392083283, 0.026466256863995993, nan, 0.19600262650085765, 0.0, 0.0006358605478336033, nan, nan, nan, 0.0, 0.01185518249609946, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0] [0.9620532316859247, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.30869834643358013, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.35267811654420506, 0.0, nan, nan, nan, nan, nan, 0.0013435898056636548, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.03487620485665825, 0.0877956587177666, 0.9495806825184652, 0.0, nan, nan, 0.18737884559505472, nan, 0.0, nan, 0.0, nan, 0.6075359853676155, 0.9599076030512481, nan, nan, 0.0, nan, nan, nan, 0.730404003882091, 0.0011697002977824804, nan, nan, 0.24758333087736878, 0.0, nan, 0.6244537757749556, nan, nan, 0.0, nan, 0.0, nan, 0.17650274624366827, nan, 0.5634979906824567, nan, 0.9168100836262272, 0.05246734826546894, nan, 0.9408226932106906, 0.0, 0.000647092129741972, nan, nan, nan, 0.0, 0.011951648460528597, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0]
1.6882 3.0 240 1.6107 0.1184 0.2217 0.6185 [0.9078840245470673, nan, nan, 0.008295603914122974, 0.0, 0.0, nan, nan, 0.2708327282233001, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.01068541636772373, nan, nan, nan, 0.0, 0.0, 0.46267216714352255, 0.0, nan, nan, nan, nan, nan, 0.024547604967474868, nan, nan, nan, 0.0, 0.0, nan, 0.0008676150408433873, nan, 0.1433677278791662, 0.07036698250826051, 0.19145609711560646, 0.005432652527979793, nan, nan, 0.3511881601309627, nan, 0.0, nan, 0.0, nan, 0.3110435634268891, 0.7803689251507627, nan, nan, 0.0, nan, nan, 0.0, 0.37167048021355864, 0.00796633894165082, nan, nan, 0.20235549099892491, 0.0, 0.0, 0.19702921038237098, nan, nan, 0.0, nan, 0.0, nan, 0.196409676360054, nan, 0.3126647596612261, nan, 0.45378078640357195, 0.1961058567559167, nan, 0.4348653304102725, 0.0, 0.09971110666170876, nan, nan, nan, 0.0, 0.027258874654325863, nan, 0.00018401315972899881, nan, nan, nan, nan, 0.0, 0.0, 0.0] [0.962685361139455, nan, nan, 0.008580406058652916, 0.0, 0.0, nan, nan, 0.6082319559048954, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.011183084208523974, nan, nan, nan, 0.0, 0.0, 0.4723470739411505, 0.0, nan, nan, nan, nan, nan, 0.025122708483377616, nan, nan, nan, 0.0, 0.0, nan, 0.0008677144728225279, nan, 0.18170582881727548, 0.08256384100531004, 0.8053247514907578, 0.0055721815258174025, nan, nan, 0.693375366697316, nan, 0.0, nan, 0.0, nan, 0.8249390957739768, 0.9847885255882247, nan, nan, 0.0, nan, nan, nan, 0.6429491205034031, 0.025510181303546157, nan, nan, 0.2607166013380095, 0.0, nan, 0.6117796668032227, nan, nan, 0.0, nan, 0.0, nan, 0.45981429755101694, nan, 0.7929869483267542, nan, 0.9546115622348806, 0.4176112522900538, nan, 0.8931431511915865, 0.0, 0.10573889832564912, nan, nan, nan, 0.0, 0.02991008397116742, nan, 0.00018401315972899881, nan, nan, nan, nan, 0.0, 0.0, 0.0]
1.2604 4.0 320 1.6147 0.1360 0.2333 0.6191 [0.9088378924504633, nan, nan, 0.006172723690876001, 0.0, 0.0, nan, nan, 0.2700373922939359, 0.0, 0.0008555624667843959, nan, nan, 0.0, 4.0354892454211606e-05, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.02426232674940841, nan, nan, nan, 0.0, 0.0, 0.7289752537055415, 0.0, nan, nan, nan, nan, nan, 0.28889040191898774, nan, nan, nan, 0.0, 0.0, nan, 0.16201117318435754, nan, 0.135735470771455, 0.03517806587907951, 0.15354830524143917, 0.006966164344611885, nan, nan, 0.32221034023700923, nan, 0.0, nan, 0.0, nan, 0.22444128114761158, 0.8783578551511888, nan, nan, 0.0, nan, nan, 0.0, 0.40173180636428096, 0.0, nan, nan, 0.23734584573654993, 0.0, 0.0, 0.22104203752331683, nan, nan, 0.0, nan, 0.0, nan, 0.23749390906705647, nan, 0.3526015471706624, nan, 0.5776847601044198, 0.23668970134621362, nan, 0.45186616741152313, 0.0, 0.03700022196272956, nan, nan, nan, 0.0, 0.03473255140621984, nan, 0.0006468341372292079, nan, nan, nan, nan, 0.0, 0.0, 0.0] [0.9619262964873231, nan, nan, 0.0066266516274572996, 0.0, 0.0, nan, nan, 0.48755748344802846, 0.0, 0.0009753102518962857, nan, nan, 0.0, 0.00010248751848435602, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.025149424827047646, nan, nan, nan, 0.0, 0.0, 0.7734101690309871, 0.0, nan, nan, nan, nan, nan, 0.33383364905676366, nan, nan, nan, 0.0, 0.0, nan, 0.16665029469548134, nan, 0.19807998593472056, 0.038453234371039764, 0.5908759373985942, 0.007113185635161693, nan, nan, 0.41136970652273425, nan, 0.0, nan, 0.0, nan, 0.927765410498326, 0.9786645417755971, nan, nan, 0.0, nan, nan, nan, 0.7116025043292926, 0.0, nan, nan, 0.3043050602693702, 0.0, nan, 0.5754642905912877, nan, nan, 0.0, nan, 0.0, nan, 0.5412278229078578, nan, 0.8339912514669796, nan, 0.933341413162041, 0.5849890668400213, nan, 0.9625317595600296, 0.0, 0.038764862897355014, nan, nan, nan, 0.0, 0.037886106363478736, nan, 0.0006468341372292079, nan, nan, nan, nan, 0.0, 0.0, 0.0]
1.0751 5.0 400 1.4502 0.1533 0.2610 0.6364 [0.9124349689545805, nan, nan, 0.02800772200772201, 0.0, 0.0, nan, nan, 0.2607173969525149, 0.0, 0.016277630093587268, nan, nan, 0.0, 0.2270782252376314, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.06811442675213822, nan, nan, nan, 0.0, 0.0, 0.8687202091128081, 0.0, nan, nan, nan, nan, nan, 0.2974419443865597, nan, nan, nan, 0.0, 0.0, nan, 0.2241877764060571, nan, 0.1515213091365017, 0.11467393720272052, 0.15593087165983874, 0.06270734404475378, nan, nan, 0.3909308803524001, nan, 0.00036651405428684, nan, 0.0, nan, 0.2868722957605033, 0.8932905474743855, nan, nan, 0.0, nan, nan, 0.0, 0.4606306663381574, 0.0, nan, nan, 0.2746428721042269, 0.0, 0.0, 0.2902813189084895, nan, nan, 0.0, nan, 0.0, nan, 0.23567975453556203, nan, 0.2633975673362132, nan, 0.6213946929880785, 0.19223303523308344, nan, 0.3910326336956379, 0.0, 0.0782475879668719, nan, nan, nan, 0.0, 0.04951054808145317, nan, 0.0039163982487469195, nan, nan, nan, nan, 0.0, 0.0, 0.0] [0.9723338074467475, nan, nan, 0.036527151144054144, 0.0, 0.0, nan, nan, 0.5563745474707282, 0.0, 0.024287670339701375, nan, nan, 0.0, 0.4949268678350244, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.07957310052761732, nan, nan, nan, 0.0, 0.0, 0.9594801136685271, 0.0, nan, nan, nan, nan, nan, 0.35418721895066846, nan, nan, nan, 0.0, 0.0, nan, 0.23487229862475442, nan, 0.18122750589500683, 0.1695115196991079, 0.6156271304513505, 0.06458817223512596, nan, nan, 0.6005278053091004, nan, 0.00036653644441322095, nan, 0.0, nan, 0.704683029047025, 0.981986176270458, nan, nan, 0.0, nan, nan, nan, 0.8182902307053099, 0.0, nan, nan, 0.41058324246264477, 0.0, nan, 0.5752509217533798, nan, nan, 0.0, nan, 0.0, nan, 0.4514089981122244, nan, 0.9410896546818877, nan, 0.9400072718458369, 0.5185509130666036, nan, 0.9568874023092014, 0.0, 0.08339399822049665, nan, nan, nan, 0.0, 0.05856927002055931, nan, 0.003925614074218641, nan, nan, nan, nan, 0.0, 0.0, 0.0]
0.8905 6.0 480 1.4667 0.1800 0.2730 0.6443 [0.905905633929436, nan, nan, 0.024166791343606712, 0.0, 0.0, nan, nan, 0.24485564712590058, 0.0, 0.015787326198173315, nan, nan, 0.0, 0.06429772737073612, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.03667124940377655, nan, nan, nan, 0.0, 0.0, 0.8804276721466182, 0.0, nan, nan, nan, nan, nan, 0.3332586108924466, nan, nan, nan, 0.0, 0.0, nan, 0.4213651736237913, nan, 0.26751989465534015, 0.055711846291814596, 0.13013760175315084, 0.24910351422424098, nan, nan, 0.3617097863839749, nan, 0.0014722187436048589, nan, 0.0, nan, 0.2951801534066646, 0.8831103091857646, nan, nan, 0.0, nan, nan, 0.0, 0.4552010963960052, 0.0005173340957204218, nan, nan, 0.33631148566193747, 0.0, 0.0, 0.2316926931106472, nan, nan, 0.0, nan, 0.0, nan, 0.28006018253711673, nan, 0.31920078586451395, nan, 0.6357359156101294, 0.4996092528915286, nan, 0.7065472845246002, 0.0, 0.25958851917703835, nan, nan, nan, 0.0, 0.03625532263347527, nan, 0.24853846153846154, nan, nan, nan, nan, 0.0, 0.0, 0.0] [0.9731492877427024, nan, nan, 0.02765468901063487, 0.0, 0.0, nan, nan, 0.46221584423208634, 0.0, 0.028696942035600182, nan, nan, 0.0, 0.130979048623007, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.03925592943744181, nan, nan, nan, 0.0, 0.0, 0.9294892841373532, 0.0, nan, nan, nan, nan, nan, 0.3868933419678144, nan, nan, nan, 0.0, 0.0, nan, 0.4344793713163065, nan, 0.376093678897944, 0.07095183410882307, 0.44409844508682567, 0.29089244237984635, nan, nan, 0.6039668987258636, nan, 0.0014722547183931042, nan, 0.0, nan, 0.695893863979281, 0.968431042509759, nan, nan, 0.0, nan, nan, nan, 0.8511801683508091, 0.0013572095058239468, nan, nan, 0.3665085025492912, 0.0, nan, 0.5919961081523966, nan, nan, 0.0, nan, 0.0, nan, 0.5771724367680598, nan, 0.9013656246665955, nan, 0.9261665252696643, 0.8311920099284912, nan, 0.889541054256585, 0.0, 0.2789978160640621, nan, nan, nan, 0.0, 0.04386812315771222, nan, 0.2522318565812585, nan, nan, nan, nan, 0.0, 0.0, 0.0]
0.9353 7.0 560 1.4553 0.1798 0.2827 0.6483 [0.9107786868458373, nan, nan, 0.0014616526623505823, 0.0, 0.0, nan, nan, 0.273034108979976, 0.0, 0.055445239451347134, nan, nan, 0.0, 0.14374652321603995, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.05923873957861652, nan, nan, nan, 0.0, 0.0, 0.8647824910018748, 0.0, nan, nan, nan, nan, nan, 0.34473373360840903, nan, nan, nan, 0.0, 0.0, nan, 0.48846614301982655, nan, 0.21764580015718807, 0.036326081530570146, 0.15283853725792182, 0.08106948659924282, nan, nan, 0.3613395456882474, nan, 0.006905334913324456, nan, 0.0, nan, 0.2762568515044489, 0.9008219583837287, nan, nan, 0.0, nan, nan, 0.0, 0.46352166564119973, 0.0, nan, nan, 0.33000840465580095, 0.0, 0.0, 0.2757454007189681, nan, nan, 0.0, nan, 0.0, nan, 0.304404558768512, nan, 0.35536825831649466, nan, 0.6770693433236729, 0.3916188194209899, nan, 0.5843015698430157, 0.0, 0.2347843086146227, nan, nan, nan, 0.0, 0.05726667371313814, nan, 0.3212099543883598, nan, nan, nan, nan, 0.0, 0.0, 0.0] [0.966932793101307, nan, nan, 0.0019235417338059942, 0.0, 0.0, nan, nan, 0.5487753171781742, 0.0, 0.08700908478407338, nan, nan, 0.0, 0.35563168914071536, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0667981538399928, nan, nan, nan, 0.0, 0.0, 0.9765530358780441, 0.0, nan, nan, nan, nan, nan, 0.3832862269940507, nan, nan, nan, 0.0, 0.0, nan, 0.5477570399476097, nan, 0.2735102180118314, 0.044630033512420876, 0.69072619178751, 0.0844090584241558, nan, nan, 0.5399058229742586, nan, 0.006906157506819105, nan, 0.0, nan, 0.6752517502760224, 0.9802403036922966, nan, nan, 0.0, nan, nan, nan, 0.8314969520511015, 0.0, nan, nan, 0.4498614835990687, 0.0, nan, 0.5898538850198006, nan, nan, 0.0, nan, 0.0, nan, 0.569920303410809, nan, 0.9319143639531989, nan, 0.9287116713125682, 0.7263636900892382, nan, 0.9396970379185026, 0.0, 0.25500485319097305, nan, nan, nan, 0.0, 0.06710262317009735, nan, 0.33064376725123373, nan, nan, nan, nan, 0.0, 0.0, 0.0]
0.5169 8.0 640 1.5020 0.1745 0.2703 0.6444 [0.9081863782011966, nan, nan, 0.0032025739757349163, 0.0, 0.0, nan, nan, 0.2805276118120908, 0.0, 0.0194719730356517, nan, nan, 0.0, 0.028064115049397008, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.031044880309479255, nan, nan, nan, 0.0, 0.0, 0.8884815641270853, 0.0, nan, nan, nan, nan, nan, 0.3379070768347919, nan, nan, nan, 0.0, 0.0, nan, 0.34515052234654137, nan, 0.19105767061023177, 0.05652715709865933, 0.1547500666185713, 0.03326140899773438, nan, nan, 0.31290392129023437, nan, 0.0016066514146779519, nan, 0.0, nan, 0.2962907055537498, 0.901996445252759, nan, nan, 0.0, nan, nan, 0.0, 0.47591977144189845, 0.0, nan, nan, 0.28169751673196775, 0.0, 0.0, 0.25067266318048587, nan, nan, 0.0, nan, 0.0, nan, 0.3265994118153242, nan, 0.3129466942416583, nan, 0.6767515781788395, 0.4129130725781734, nan, 0.604027616749705, 0.0, 0.2700754463285277, nan, nan, nan, 0.0008216176739090743, 0.04560087450984325, nan, 0.451933678002205, nan, nan, nan, nan, 0.0, 0.0, 0.0] [0.9717208749831292, nan, nan, 0.0043304866258459555, 0.0, 0.0, nan, nan, 0.47930595870976156, 0.0, 0.028376366521049315, nan, nan, 0.0, 0.06708540138504561, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.03205750728352172, nan, nan, nan, 0.0, 0.0, 0.9466527788784856, 0.0, nan, nan, nan, nan, nan, 0.39914300758341453, nan, nan, nan, 0.0, 0.0, nan, 0.3656516044531762, nan, 0.22985872667852564, 0.07409342799836037, 0.7088355864639494, 0.034426925138467034, nan, nan, 0.4737032747015671, nan, 0.0016066514146779519, nan, 0.0, nan, 0.6888620195866966, 0.9768828564266018, nan, nan, 0.0, nan, nan, nan, 0.8284331449377407, 0.0, nan, nan, 0.3784923815979488, 0.0, nan, 0.6050969547999454, nan, nan, 0.0, nan, 0.0, nan, 0.5511450512091159, nan, 0.944432590063658, nan, 0.921197430614471, 0.7802375746114296, nan, 0.9299359984562442, 0.0, 0.2924451993852625, nan, nan, nan, 0.0012186865267433988, 0.05631517673577568, nan, 0.4708617949647308, nan, nan, nan, nan, 0.0, 0.0, 0.0]
0.6238 9.0 720 1.4849 0.1839 0.2801 0.6487 [0.9122554015886276, nan, nan, 0.002470252841867913, 0.0, 0.0, nan, nan, 0.3127948297001309, 0.0, 0.06578942909523439, nan, nan, 0.0, 0.07320344647282151, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.13295915079782272, nan, nan, nan, 0.0, 0.0, 0.9095009116013264, 0.0, nan, nan, nan, nan, nan, 0.3742842162617841, nan, nan, nan, 0.0, 0.0, nan, 0.4058788598574822, nan, 0.28061300308301534, 0.060087719298245613, 0.13627408546141104, 0.05961007500185125, nan, nan, 0.32234386065996856, nan, 0.012001014084162877, nan, 0.0, nan, 0.2904958016703202, 0.8916012123977447, nan, nan, 0.0, nan, nan, 0.0, 0.4608265384852228, 5.152005632859492e-05, nan, nan, 0.30818675778624244, 0.0, 0.0, 0.274219176132995, nan, nan, 0.0, nan, 0.0, nan, 0.3185477315272881, nan, 0.3267800277869991, nan, 0.6388361638361638, 0.4157944612599361, nan, 0.5833592894390467, 0.0, 0.27790195064282547, nan, nan, nan, 0.0, 0.07508638303790156, nan, 0.45796774641715915, nan, nan, nan, nan, 0.0, 0.0, 0.0] [0.9681510184101092, nan, nan, 0.0034643893006767643, 0.0, 0.0, nan, nan, 0.4909167998434493, 0.0, 0.10548564473712808, nan, nan, 0.0, 0.18857703401121506, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.15700369432235717, nan, nan, nan, 0.0, 0.0, 0.9657635830493507, 0.0, nan, nan, nan, nan, nan, 0.4406853518450151, nan, nan, nan, 0.0, 0.0, nan, 0.4699901768172888, nan, 0.3466445000620527, 0.08273594027216717, 0.5980029949122216, 0.0629243344648919, nan, nan, 0.42755414382893697, nan, 0.012001014084162877, nan, 0.0, nan, 0.6606168041868293, 0.9797299717079111, nan, nan, 0.0, nan, nan, nan, 0.8069864951442146, 0.0001071481188808379, nan, nan, 0.405363089799888, 0.0, nan, 0.5921241294551414, nan, nan, 0.0, nan, 0.0, nan, 0.5857742015392632, nan, 0.945197197624382, nan, 0.930032723306266, 0.6949352875125584, nan, 0.9336506609204644, 0.0, 0.3042242983094718, nan, nan, nan, 0.0, 0.08693121299943028, nan, 0.4779044804416316, nan, nan, nan, nan, 0.0, 0.0, 0.0]
0.84 10.0 800 1.5204 0.1825 0.2798 0.6420 [0.9142691288036651, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.3142004341534009, 0.0, 0.003682913518478972, nan, nan, 0.0, 0.09172351933896877, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.09730988810230647, nan, nan, nan, 0.0, 0.0, 0.9051505354829877, 0.0, nan, nan, nan, nan, nan, 0.3923942271862065, nan, nan, nan, 0.0, 0.0, nan, 0.35265901085368995, nan, 0.19254235491492325, 0.049966760972743995, 0.11681963220492537, 0.10314154531218374, nan, nan, 0.3096201109834452, nan, 0.00769115639193742, nan, 0.0, nan, 0.28766783984147737, 0.8922548763781069, nan, nan, 0.0, nan, nan, 0.0, 0.45913823493730865, 0.00039839497899743395, nan, nan, 0.3294171958059549, 0.0, 0.0, 0.2671846076829642, nan, nan, 0.0, nan, 0.0, nan, 0.3159265837773831, nan, 0.331420855139111, nan, 0.6764366245603519, 0.45869407404577234, nan, 0.6076370770804199, 0.0, 0.28964539522710336, nan, nan, nan, 0.01132884262094305, 0.07350650765125524, nan, 0.45421384725808006, nan, nan, nan, nan, 0.0, 0.0, 0.0] [0.9647457074649063, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.5664851113792766, 0.0, 0.005903479602703702, nan, nan, 0.0, 0.24703884276950558, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.10665478609973669, nan, nan, nan, 0.0, 0.0, 0.9731339228095599, 0.0, nan, nan, nan, nan, nan, 0.47161212620060644, nan, nan, nan, 0.0, 0.0, nan, 0.40322527832351013, nan, 0.2442109998758946, 0.07079225115228281, 0.4617999839239304, 0.10811595497587993, nan, nan, 0.4783842629632726, nan, 0.00769115639193742, nan, 0.0, nan, 0.7675853747883402, 0.9796493929735344, nan, nan, 0.0, nan, nan, nan, 0.8162476926297361, 0.0008705784659068079, nan, nan, 0.4390048922813946, 0.0, nan, 0.5876177795985252, nan, nan, 0.0, nan, 0.0, nan, 0.5925394425509742, nan, 0.947028699455884, nan, 0.930032723306266, 0.7094970746409787, nan, 0.9363039912520503, 0.0, 0.32225188061150206, nan, nan, nan, 0.015030467163168585, 0.08358722845606995, nan, 0.47285248278361725, nan, nan, nan, nan, 0.0, 0.0, 0.0]

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2