
law-game-replace-finetune

This model is a fine-tuned version of PekingU/rtdetr_r50vd_coco_o365 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 5.6852
  • Map: 0.8547
  • Map 50: 0.9219
  • Map 75: 0.8855
  • Map Small: 0.6003
  • Map Medium: 0.6465
  • Map Large: 0.9074
  • Mar 1: 0.652
  • Mar 10: 0.9364
  • Mar 100: 0.9452
  • Mar Small: 0.779
  • Mar Medium: 0.924
  • Mar Large: 0.9818
  • Map Evidence: -1.0 (COCO evaluation reports -1.0 for a class with no ground-truth instances in the evaluation set)
  • Mar 100 Evidence: -1.0
  • Map Ambulance: 1.0
  • Mar 100 Ambulance: 1.0
  • Map Artificial Target: 0.8084
  • Mar 100 Artificial Target: 0.8727
  • Map Cartridge: 0.9022
  • Mar 100 Cartridge: 0.9681
  • Map Gun: 0.7068
  • Mar 100 Gun: 0.9667
  • Map Knife: 0.7732
  • Mar 100 Knife: 0.9111
  • Map Police: 0.9573
  • Mar 100 Police: 0.9818
  • Map Traffic: 0.835
  • Mar 100 Traffic: 0.9162
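The Map 50 and Map 75 figures above are COCO-style average precision at fixed IoU matching thresholds of 0.50 and 0.75. A minimal sketch of the IoU criterion behind those thresholds (the `box_iou` helper is illustrative, not part of this repository):

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes in (x0, y0, x1, y1) format."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# A prediction matching this ground-truth box counts as a true positive
# for Map 50 (IoU >= 0.5) but not for the stricter Map 75 (IoU >= 0.75):
print(box_iou((0, 0, 10, 10), (5, 0, 15, 10)))  # overlap/union = 50/150 ≈ 0.333
```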

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 300
  • num_epochs: 50
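With 35 steps per epoch (see the results table below), 50 epochs amount to 1750 training steps, so the linear schedule ramps the learning rate up over the first 300 warmup steps and then decays it linearly to zero. A self-contained sketch of that schedule shape (`linear_warmup_lr` is an illustrative helper, not the actual Transformers implementation):

```python
def linear_warmup_lr(step, base_lr=5e-05, warmup_steps=300, total_steps=1750):
    """Linear warmup to base_lr, then linear decay to 0 (shape of the HF 'linear' scheduler)."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(150))   # halfway through warmup -> 2.5e-05
print(linear_warmup_lr(300))   # peak learning rate -> 5e-05
print(linear_warmup_lr(1750))  # end of training -> 0.0
```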

Training results

Training Loss Epoch Step Validation Loss Map Map 50 Map 75 Map Small Map Medium Map Large Mar 1 Mar 10 Mar 100 Mar Small Mar Medium Mar Large Map Evidence Mar 100 Evidence Map Ambulance Mar 100 Ambulance Map Artificial Target Mar 100 Artificial Target Map Cartridge Mar 100 Cartridge Map Gun Mar 100 Gun Map Knife Mar 100 Knife Map Police Mar 100 Police Map Traffic Mar 100 Traffic
No log 1.0 35 134.6829 0.0007 0.0015 0.0001 0.0 0.0001 0.0025 0.0014 0.0168 0.0462 0.0 0.0686 0.0457 -1.0 -1.0 0.0 0.0 0.0 0.0023 0.0033 0.0203 0.0 0.0407 0.0001 0.1667 0.0009 0.0909 0.0004 0.0027
No log 2.0 70 65.3809 0.0021 0.0028 0.0022 0.0001 0.0003 0.0074 0.0229 0.0958 0.167 0.095 0.3103 0.1895 -1.0 -1.0 0.0 0.0 0.0 0.0114 0.0 0.0203 0.0002 0.2444 0.0007 0.4185 0.007 0.3364 0.0064 0.1378
No log 3.0 105 35.8052 0.0174 0.0218 0.0181 0.0018 0.0208 0.0403 0.1026 0.3488 0.4766 0.2068 0.3434 0.5788 -1.0 -1.0 0.002 0.6 0.0299 0.5614 0.0465 0.4087 0.0025 0.4889 0.0087 0.6963 0.0043 0.3727 0.0275 0.2081
No log 4.0 140 25.0458 0.1381 0.1523 0.1431 0.0303 0.1563 0.182 0.3056 0.7303 0.8222 0.3066 0.7579 0.9116 -1.0 -1.0 0.0486 0.99 0.1396 0.8114 0.5183 0.9478 0.0069 0.6519 0.1362 0.8556 0.0466 0.9909 0.0704 0.5081
No log 5.0 175 17.7820 0.3449 0.3761 0.3616 0.1626 0.3378 0.4356 0.4732 0.8209 0.8887 0.5946 0.834 0.9595 -1.0 -1.0 0.049 0.99 0.5783 0.8307 0.7758 0.9507 0.0242 0.8519 0.3644 0.8889 0.4966 0.9909 0.1259 0.7176
No log 6.0 210 12.9410 0.5642 0.6087 0.5938 0.2533 0.4341 0.6233 0.5662 0.8824 0.9137 0.6323 0.8828 0.9676 -1.0 -1.0 0.522 0.99 0.5584 0.8205 0.8377 0.9754 0.2365 0.8889 0.6276 0.9 0.907 0.9818 0.2599 0.8392
No log 7.0 245 10.9532 0.621 0.6648 0.6501 0.2326 0.5144 0.7328 0.557 0.8817 0.9294 0.7018 0.9104 0.9762 -1.0 -1.0 0.8543 0.99 0.7325 0.8534 0.8209 0.9783 0.1195 0.9148 0.4129 0.9222 0.8795 0.9727 0.527 0.8743
No log 8.0 280 10.0540 0.5983 0.6382 0.6263 0.2675 0.4634 0.7109 0.5338 0.8619 0.8908 0.7047 0.8926 0.9389 -1.0 -1.0 0.6275 0.99 0.7514 0.8477 0.8526 0.9739 0.3547 0.937 0.5777 0.9185 0.5453 0.7091 0.4788 0.8595
No log 9.0 315 9.1369 0.7135 0.7584 0.7387 0.3085 0.5685 0.8728 0.602 0.9018 0.9292 0.708 0.7784 0.9823 -1.0 -1.0 0.9209 0.99 0.7616 0.833 0.8367 0.9681 0.258 0.9037 0.7433 0.9296 0.9341 0.9909 0.54 0.8892
No log 10.0 350 8.3330 0.7455 0.7925 0.7791 0.3892 0.5906 0.8393 0.6105 0.9109 0.9418 0.7521 0.8131 0.9867 -1.0 -1.0 0.8772 0.99 0.7901 0.8602 0.8873 0.9797 0.4403 0.9407 0.6274 0.9259 0.884 0.9909 0.7124 0.9054
No log 11.0 385 7.8064 0.7386 0.7829 0.7659 0.386 0.5855 0.8299 0.598 0.9136 0.9447 0.7716 0.8992 0.9875 -1.0 -1.0 0.9646 0.99 0.797 0.8773 0.8289 0.9739 0.2469 0.9593 0.6911 0.9148 0.9411 0.9909 0.7008 0.9068
No log 12.0 420 7.3854 0.7433 0.7883 0.7675 0.4603 0.5849 0.8471 0.6029 0.9147 0.9389 0.7745 0.8155 0.9831 -1.0 -1.0 0.9646 0.99 0.8199 0.8602 0.7835 0.9754 0.2672 0.9222 0.6356 0.9296 0.9685 0.9909 0.7639 0.9041
No log 13.0 455 7.1834 0.7809 0.8401 0.7988 0.4502 0.6114 0.8633 0.6232 0.9163 0.9319 0.7806 0.814 0.9751 -1.0 -1.0 0.9495 0.99 0.7535 0.8625 0.8276 0.9725 0.5165 0.8963 0.7709 0.9259 0.8929 0.9818 0.7556 0.8946
No log 14.0 490 6.9868 0.7827 0.8428 0.8038 0.4806 0.6204 0.8808 0.6146 0.918 0.9335 0.7759 0.7932 0.9763 -1.0 -1.0 0.8188 0.99 0.8202 0.8727 0.8257 0.9551 0.5485 0.9074 0.7324 0.9222 0.9589 0.9909 0.7743 0.8959
32.4583 15.0 525 6.9190 0.7805 0.8338 0.8066 0.4936 0.595 0.8772 0.6148 0.9272 0.9467 0.7756 0.9095 0.9864 -1.0 -1.0 0.912 0.99 0.7913 0.8557 0.8613 0.9855 0.5283 0.963 0.6932 0.9296 0.8962 0.9909 0.7809 0.9122
32.4583 16.0 560 7.0608 0.7653 0.8279 0.7837 0.5397 0.5975 0.838 0.6004 0.9049 0.9226 0.7743 0.8751 0.9682 -1.0 -1.0 0.9569 0.99 0.7921 0.8455 0.7774 0.9174 0.4303 0.9 0.6478 0.9185 0.9515 0.9909 0.8008 0.8959
32.4583 17.0 595 6.5305 0.7648 0.8275 0.7858 0.4759 0.6282 0.866 0.6017 0.919 0.9327 0.7467 0.8837 0.977 -1.0 -1.0 0.982 0.99 0.8007 0.8648 0.8692 0.971 0.2549 0.9296 0.7184 0.9 0.9133 0.9818 0.8148 0.8919
32.4583 18.0 630 6.3597 0.7757 0.837 0.8076 0.5064 0.6482 0.863 0.5974 0.9229 0.9362 0.7693 0.8368 0.9779 -1.0 -1.0 0.8771 1.0 0.8099 0.867 0.9032 0.9696 0.2876 0.9222 0.7655 0.9074 0.9522 0.9818 0.8346 0.9054
32.4583 19.0 665 6.2977 0.8016 0.8679 0.8317 0.5793 0.6589 0.8646 0.6006 0.9256 0.9403 0.7879 0.9265 0.9749 -1.0 -1.0 0.9534 0.99 0.8131 0.8739 0.9071 0.9725 0.4297 0.9407 0.7494 0.9185 0.9567 0.9909 0.8015 0.8959
32.4583 20.0 700 6.2690 0.7843 0.8443 0.8135 0.5068 0.6505 0.887 0.6166 0.9212 0.9362 0.7759 0.8952 0.9767 -1.0 -1.0 0.9725 0.99 0.8173 0.8739 0.8936 0.9812 0.3364 0.9296 0.7669 0.9037 0.9347 0.9818 0.7686 0.8932
32.4583 21.0 735 6.3902 0.8137 0.8806 0.8417 0.7055 0.6578 0.8776 0.6087 0.9276 0.9445 0.7868 0.9097 0.9828 -1.0 -1.0 0.9807 0.99 0.8096 0.8739 0.9099 0.9696 0.5614 0.9741 0.6762 0.9074 0.9398 0.9909 0.8186 0.9054
32.4583 22.0 770 6.5328 0.8196 0.8927 0.8492 0.5932 0.626 0.876 0.6229 0.9181 0.9309 0.785 0.8935 0.9673 -1.0 -1.0 0.9881 0.99 0.8162 0.8705 0.8924 0.9638 0.6492 0.9333 0.7485 0.8704 0.8867 0.9818 0.7563 0.9068
32.4583 23.0 805 6.2433 0.8291 0.8944 0.8509 0.5805 0.6343 0.9025 0.64 0.9191 0.9392 0.8098 0.8845 0.981 -1.0 -1.0 1.0 1.0 0.8052 0.8705 0.9029 0.9623 0.5827 0.9333 0.7635 0.9222 0.9357 0.9818 0.8135 0.9041
32.4583 24.0 840 6.1554 0.8171 0.885 0.8449 0.6088 0.6328 0.8923 0.6174 0.9203 0.9316 0.7746 0.782 0.976 -1.0 -1.0 1.0 1.0 0.8049 0.8727 0.878 0.9667 0.5197 0.9148 0.75 0.8889 0.9508 0.9818 0.8159 0.8959
32.4583 25.0 875 5.9349 0.8258 0.8844 0.8498 0.6153 0.6481 0.8945 0.6112 0.9268 0.9361 0.7883 0.7972 0.9813 -1.0 -1.0 1.0 1.0 0.8164 0.8795 0.9086 0.9565 0.4921 0.9185 0.7741 0.9185 0.9465 0.9727 0.8429 0.9068
32.4583 26.0 910 6.3160 0.7895 0.8497 0.8183 0.5915 0.6343 0.8686 0.6264 0.9223 0.9318 0.7793 0.8842 0.9716 -1.0 -1.0 0.9497 0.99 0.7948 0.867 0.859 0.9493 0.3445 0.9333 0.7677 0.9 0.968 0.9818 0.8428 0.9014
32.4583 27.0 945 6.4135 0.8095 0.8772 0.838 0.5667 0.6294 0.876 0.6148 0.9118 0.9303 0.7603 0.876 0.9693 -1.0 -1.0 0.9317 0.98 0.8026 0.8648 0.8146 0.9246 0.5762 0.9481 0.7626 0.9111 0.9675 0.9818 0.8115 0.9014
32.4583 28.0 980 5.8669 0.8351 0.8972 0.8582 0.6038 0.6358 0.8918 0.6464 0.928 0.9365 0.795 0.8915 0.9783 -1.0 -1.0 0.9752 0.99 0.821 0.8739 0.8749 0.9754 0.6426 0.9444 0.7537 0.8741 0.9634 0.9909 0.8152 0.9068
5.1898 29.0 1015 5.9247 0.86 0.9246 0.882 0.6164 0.6296 0.9322 0.6408 0.9206 0.9326 0.7759 0.8898 0.9781 -1.0 -1.0 0.991 1.0 0.8174 0.8636 0.8344 0.9174 0.7097 0.9333 0.8459 0.9259 0.9812 0.9909 0.8404 0.8973
5.1898 30.0 1050 5.9808 0.8348 0.9006 0.8552 0.5928 0.6299 0.8899 0.6409 0.9203 0.928 0.7853 0.7907 0.9711 -1.0 -1.0 0.9901 0.99 0.8064 0.8693 0.8471 0.9449 0.7043 0.9222 0.7611 0.9 0.9621 0.9818 0.7727 0.8878
5.1898 31.0 1085 5.9738 0.8511 0.9076 0.8774 0.5968 0.6397 0.9095 0.644 0.9194 0.9335 0.7835 0.8017 0.9779 -1.0 -1.0 0.9835 1.0 0.8134 0.8739 0.8661 0.9362 0.744 0.9222 0.7586 0.9296 0.9647 0.9727 0.8277 0.9
5.1898 32.0 1120 6.0519 0.8567 0.9245 0.8833 0.6084 0.6568 0.9083 0.6565 0.9215 0.9332 0.7877 0.8841 0.9733 -1.0 -1.0 0.982 0.99 0.8148 0.8682 0.8808 0.9681 0.7649 0.9333 0.7794 0.8926 0.9557 0.9818 0.8196 0.8986
5.1898 33.0 1155 5.8607 0.8556 0.9226 0.8835 0.6277 0.6905 0.9001 0.6521 0.926 0.9366 0.7708 0.9035 0.9755 -1.0 -1.0 1.0 1.0 0.7983 0.8693 0.8764 0.9609 0.7527 0.9444 0.7659 0.8889 0.9675 0.9818 0.8285 0.9108
5.1898 34.0 1190 5.7650 0.851 0.9163 0.8812 0.5894 0.6515 0.9045 0.6564 0.9253 0.9367 0.7932 0.8781 0.9762 -1.0 -1.0 1.0 1.0 0.7983 0.875 0.8831 0.9609 0.667 0.9444 0.7993 0.8852 0.9675 0.9818 0.842 0.9095
5.1898 35.0 1225 5.6566 0.8525 0.9147 0.8784 0.5964 0.6244 0.905 0.6486 0.9316 0.9414 0.7726 0.9147 0.9757 -1.0 -1.0 0.9772 1.0 0.8076 0.8682 0.9067 0.9667 0.6843 0.9407 0.7884 0.9074 0.9812 0.9909 0.8225 0.9162
5.1898 36.0 1260 5.7578 0.8626 0.9252 0.89 0.6101 0.6392 0.9241 0.6555 0.9323 0.9426 0.785 0.9145 0.9808 -1.0 -1.0 0.9835 1.0 0.8056 0.8636 0.8808 0.9696 0.7216 0.963 0.8373 0.9074 0.9812 0.9909 0.8284 0.9041
5.1898 37.0 1295 5.7752 0.8529 0.9132 0.8854 0.6004 0.6418 0.9072 0.6458 0.9251 0.9412 0.7632 0.9092 0.9834 -1.0 -1.0 0.9772 1.0 0.8062 0.8693 0.8795 0.9594 0.7218 0.9407 0.7752 0.9222 0.9652 0.9818 0.8453 0.9149
5.1898 38.0 1330 5.8527 0.854 0.9169 0.8842 0.6171 0.6414 0.9136 0.6539 0.9288 0.9456 0.7921 0.9069 0.9876 -1.0 -1.0 0.991 1.0 0.8115 0.8773 0.8773 0.9652 0.6804 0.9667 0.8167 0.9185 0.9651 0.9818 0.8358 0.9095
5.1898 39.0 1365 5.6796 0.864 0.9314 0.8903 0.6112 0.6373 0.9218 0.6476 0.9308 0.9413 0.7787 0.9145 0.9782 -1.0 -1.0 1.0 1.0 0.8099 0.8705 0.9009 0.9652 0.706 0.9556 0.8127 0.8963 0.9812 0.9909 0.8375 0.9108
5.1898 40.0 1400 5.6425 0.8536 0.9205 0.8835 0.6029 0.6507 0.9146 0.6521 0.9329 0.9418 0.7882 0.9185 0.9792 -1.0 -1.0 0.9772 1.0 0.8058 0.8716 0.909 0.9696 0.6723 0.9593 0.7884 0.9074 0.9647 0.9727 0.8578 0.9122
5.1898 41.0 1435 5.6851 0.8526 0.9164 0.8808 0.6038 0.6471 0.916 0.6569 0.9325 0.9399 0.7945 0.9122 0.9793 -1.0 -1.0 1.0 1.0 0.8085 0.875 0.923 0.9754 0.6399 0.9407 0.7996 0.9037 0.9738 0.9818 0.8236 0.9027
5.1898 42.0 1470 5.6092 0.864 0.9291 0.893 0.6087 0.6473 0.9309 0.652 0.9334 0.9418 0.7832 0.9139 0.9805 -1.0 -1.0 1.0 1.0 0.8058 0.8705 0.9203 0.9681 0.6728 0.9667 0.8363 0.9 0.9647 0.9727 0.8484 0.9149
4.0823 43.0 1505 5.4824 0.8709 0.9372 0.8972 0.6011 0.6472 0.937 0.6562 0.9342 0.9434 0.7777 0.9143 0.9847 -1.0 -1.0 1.0 1.0 0.8119 0.8705 0.9112 0.9652 0.721 0.963 0.8411 0.9222 0.9652 0.9818 0.846 0.9014
4.0823 44.0 1540 5.5914 0.8642 0.9316 0.8912 0.6061 0.6487 0.9302 0.6586 0.9353 0.9424 0.7779 0.9139 0.9826 -1.0 -1.0 0.991 1.0 0.8078 0.8693 0.9284 0.971 0.6971 0.9667 0.8299 0.9037 0.9654 0.9818 0.8299 0.9041
4.0823 45.0 1575 5.6331 0.86 0.9296 0.8902 0.6114 0.6418 0.9152 0.6456 0.9336 0.9419 0.7856 0.9124 0.9809 -1.0 -1.0 1.0 1.0 0.8105 0.8773 0.9199 0.9696 0.7093 0.9593 0.7953 0.9 0.9653 0.9818 0.82 0.9054
4.0823 46.0 1610 5.6049 0.8607 0.9295 0.889 0.6164 0.6419 0.9181 0.6484 0.9316 0.9404 0.7568 0.9149 0.9799 -1.0 -1.0 0.991 1.0 0.8076 0.8716 0.9103 0.971 0.7084 0.9556 0.8046 0.9 0.9647 0.9727 0.838 0.9122
4.0823 47.0 1645 5.6549 0.8492 0.9148 0.8762 0.6114 0.6449 0.9051 0.6431 0.9325 0.9427 0.7934 0.9172 0.9801 -1.0 -1.0 0.9835 1.0 0.8064 0.8773 0.9207 0.9667 0.6572 0.963 0.7851 0.9074 0.9647 0.9727 0.8268 0.9122
4.0823 48.0 1680 5.5983 0.858 0.926 0.8891 0.6136 0.6419 0.9136 0.6499 0.9324 0.9438 0.7845 0.9192 0.9812 -1.0 -1.0 0.9835 1.0 0.8059 0.8739 0.9212 0.971 0.7215 0.963 0.773 0.9037 0.9589 0.9818 0.8419 0.9135
4.0823 49.0 1715 5.6368 0.8556 0.9223 0.8844 0.6091 0.6486 0.9069 0.6488 0.9349 0.9438 0.7895 0.9236 0.979 -1.0 -1.0 0.991 1.0 0.81 0.8739 0.9161 0.9696 0.7073 0.963 0.7632 0.9074 0.9577 0.9727 0.8439 0.9203
4.0823 50.0 1750 5.6852 0.8547 0.9219 0.8855 0.6003 0.6465 0.9074 0.652 0.9364 0.9452 0.779 0.924 0.9818 -1.0 -1.0 1.0 1.0 0.8084 0.8727 0.9022 0.9681 0.7068 0.9667 0.7732 0.9111 0.9573 0.9818 0.835 0.9162
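At inference time, RT-DETR predicts boxes as (cx, cy, w, h) normalized to the image size; converting them to absolute corner coordinates and filtering by score is roughly what Transformers' `post_process_object_detection` performs. A dependency-free sketch of that step (`to_xyxy` is a hypothetical helper, and the 0.5 score threshold is an assumption):

```python
def to_xyxy(boxes_cxcywh, img_w, img_h, scores, threshold=0.5):
    """Convert normalized (cx, cy, w, h) detections to absolute (x0, y0, x1, y1),
    keeping only detections whose score clears the threshold."""
    kept = []
    for (cx, cy, w, h), score in zip(boxes_cxcywh, scores):
        if score < threshold:
            continue
        kept.append((
            (cx - w / 2) * img_w, (cy - h / 2) * img_h,
            (cx + w / 2) * img_w, (cy + h / 2) * img_h,
        ))
    return kept

# A box centred in a 640x480 image -> roughly (256, 144, 384, 336)
print(to_xyxy([(0.5, 0.5, 0.2, 0.4)], 640, 480, [0.9]))
```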

Framework versions

  • Transformers 4.44.0.dev0
  • Pytorch 2.3.1+cu121
  • Tokenizers 0.19.1
Model: anastasispk/law-game-replace-finetune (42.9M parameters, F32, Safetensors), fine-tuned from PekingU/rtdetr_r50vd_coco_o365.