---
license: other
tags:
- generated_from_trainer
datasets:
- scene_parse_150
model-index:
- name: segformer-b0-scene-parse-150
  results: []
---

# segformer-b0-scene-parse-150

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
It achieves the following results on the evaluation set:
- Loss: 3.1029
- Mean Iou: 0.0672
- Mean Accuracy: 0.1367
- Overall Accuracy: 0.4128
- Per Category Iou: [0.3858791407535325, 0.25702625531588363, 0.5611418284685611, 0.47280742656097663, 0.2527102269172677, 0.5351371004393274, 0.003992863817857446, 0.2662973691283185, 0.1488603351955307, nan, 0.05782791238808235, 0.0, 0.4315975450994979, 0.0, 0.0, 0.020814223052419152, 0.09535044044427422, 0.0, nan, 0.20524703796244056, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan]
- Per Category Accuracy: [0.7049932849661406, 0.9955495416575415, 0.7995236568497229, 0.6812617145300255, 0.8867424539396315, 0.6189054517166706, 0.009323084552442351, 0.35294816104805915, 0.3788818109430099, nan, 0.05818522364054643, 0.0, 0.5735363171292291, nan, 0.0, 0.0463854302174317, 0.09575089997230854, 0.0, nan, 0.2234163888401474, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan]

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 40

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 3.0732 | 10.0 | 200 | 3.5232 | 0.0595 | 0.1213 | 0.3652 | [0.3285887558585248, 0.3448592327928497, 0.516508621114098, 0.4103236441665363, 0.226389572537605, 0.2614952568674889, 0.6633900837494594, 0.0110548371339692, 0.003205363478431085, nan, 0.0, 0.0, 0.3000085193388993, nan, 0.0, 0.03497180074096798, 0.041711537080910334, 0.0, nan, 0.009500875656742557, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan] | [0.7366498731604716, 0.9904655382786347, 0.8008213865860547, 0.6708364985627632, 0.9158918071344571, 0.27799620152919985, 0.8366972477064221, 0.012764315328062693, 0.003725400978273234, nan, 0.0, 0.0, 0.3481205051528557, nan, 0.0, 0.05508269838320015, 0.041711537080910334, 0.0, nan, 0.009519213897174942, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan] |
| 2.7576 | 20.0 | 400 | 3.2542 | 0.0686 | 0.1383 | 0.4049 | [0.39161384162635726, 0.23707729603521277, 0.6137145047289452, 0.4244319935121405, 0.24615918455866406, 0.40915804138937223, 0.07740678719114179, 0.4436475255290586, 0.13389461918820084, nan, 0.0019058127288229098, 0.0, 0.5009867063741231, 0.0, 0.0, 0.03883114336144342, 0.0688017946958134, 0.0, nan, 0.05011565150346955, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan] | [0.6855540317442271, 0.995909332665895, 0.8311882624162201, 0.6353121266142671, 0.8898784790278322, 0.47956044264098846, 0.2246466650136375, 0.5820626737777426, 0.313417131156865, nan, 0.0019340784625094158, 0.0, 0.6901490250352174, nan, 0.0, 0.051514588366474635, 0.06888505994687343, 0.0, nan, 0.0513247938234778, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan] |
| 2.0409 | 30.0 | 600 | 3.2176 | 0.0653 | 0.1305 | 0.3920 | [0.35913219689367404, 0.22562662278554396, 0.5966961279461279, 0.422749988380882, 0.24772297382894237, 0.5453257879529904, 0.05702906813411994, 0.1344788909327471, 0.1551746031746032, nan, 0.02179329421455666, 0.0, 0.4881223087715903, 0.0, 0.0, 0.039735575195803695, 0.11182700027101518, 0.0, nan, 0.0567978151403943, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan] | [0.6569505929836779, 0.9984435128116885, 0.7792332707369578, 0.6597317921369975, 0.8891885535084281, 0.6071315938860038, 0.17113811058765188, 0.14829363862145395, 0.3475145034694574, nan, 0.021814369185040414, 0.0, 0.6233472555173862, nan, 0.0, 0.06166140122653782, 0.11214526732510795, 0.0, nan, 0.058387436392349536, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan] |
| 2.0448 | 40.0 | 800 | 3.1029 | 0.0672 | 0.1367 | 0.4128 | [0.3858791407535325, 0.25702625531588363, 0.5611418284685611, 0.47280742656097663, 0.2527102269172677, 0.5351371004393274, 0.003992863817857446, 0.2662973691283185, 0.1488603351955307, nan, 0.05782791238808235, 0.0, 0.4315975450994979, 0.0, 0.0, 0.020814223052419152, 0.09535044044427422, 0.0, nan, 0.20524703796244056, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan] | [0.7049932849661406, 0.9955495416575415, 0.7995236568497229, 0.6812617145300255, 0.8867424539396315, 0.6189054517166706, 0.009323084552442351, 0.35294816104805915, 0.3788818109430099, nan, 0.05818522364054643, 0.0, 0.5735363171292291, nan, 0.0, 0.0463854302174317, 0.09575089997230854, 0.0, nan, 0.2234163888401474, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan] |

### Framework versions

- Transformers 4.27.4
- Pytorch 1.13.1+cu116
- Datasets 2.11.0
- Tokenizers 0.13.2
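The Mean Iou and Mean Accuracy figures above are averages of the per-category arrays with `nan` entries ignored (a `nan` marks a category that never appears in the evaluation labels; a plain `0.0` marks a category that appears but is never predicted correctly). This matches the semantic-segmentation metric in the `evaluate` library. A minimal pure-Python sketch of that reduction; the short array here is illustrative, not the full 150-category list:

```python
import math

def nanmean(values):
    """Average the values, skipping nan entries (categories absent from the eval set)."""
    present = [v for v in values if not math.isnan(v)]
    return sum(present) / len(present) if present else float("nan")

nan = float("nan")
# Illustrative per-category IoU fragment: two scored categories,
# one present-but-never-predicted category, one absent category.
per_category_iou = [0.6, 0.2, 0.0, nan]
print(nanmean(per_category_iou))  # averages over the 3 non-nan entries only
```

This also explains why the Mean Iou (0.0672) is so much lower than the Overall Accuracy (0.4128): the many present-but-unpredicted categories contribute zeros to the mean even though they cover few pixels.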
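With `lr_scheduler_type: linear` and no warmup listed, the learning rate decays linearly from 6e-05 toward 0 over training; per the results table, 40 epochs correspond to 800 optimizer steps. A rough sketch of that schedule, mirroring the shape of `transformers`' `get_linear_schedule_with_warmup` (the zero-warmup assumption is ours, since the card does not report a warmup setting):

```python
BASE_LR = 6e-05
TOTAL_STEPS = 800  # 40 epochs at 20 steps/epoch, per the training-results table

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

print(linear_lr(0))    # 6e-05 at the start (no warmup)
print(linear_lr(400))  # 3e-05 halfway through
print(linear_lr(800))  # 0.0 at the end
```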