DinoVd'eau is a fine-tuned version of facebook/dinov2-small. It achieves the following results on the test set:
- Loss: 0.1320
- F1 Micro: 0.8009
- F1 Macro: 0.6614
- ROC AUC: 0.8649
- Accuracy: 0.2903
## Model description
DinoVd'eau is built on top of the DINOv2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers; a minimal sketch of this architecture is given at the end of this section.
The source code for training the model can be found in this Git repository.
- Developed by: lombardata, credits to César Leblanc and Victor Illien
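The exact architecture lives in the linked repository; as a rough orientation, here is a minimal PyTorch sketch of such a head on top of the DINOv2 backbone. The hidden width, dropout rate, and use of the CLS token are assumptions, and `num_labels=31` simply matches the class table below.

```python
import torch
import torch.nn as nn
from transformers import Dinov2Model

class DinoVdeauSketch(nn.Module):
    """DINOv2 backbone + multilabel head of linear, ReLU, batch norm, and dropout layers."""

    def __init__(self, num_labels: int = 31, dropout: float = 0.2):
        super().__init__()
        self.backbone = Dinov2Model.from_pretrained("facebook/dinov2-small")
        hidden = self.backbone.config.hidden_size  # 384 for dinov2-small
        self.classifier = nn.Sequential(
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.BatchNorm1d(hidden),
            nn.Dropout(dropout),
            nn.Linear(hidden, num_labels),
        )

    def forward(self, pixel_values: torch.Tensor) -> torch.Tensor:
        # Use the CLS token of the last hidden state as the image embedding.
        cls_token = self.backbone(pixel_values=pixel_values).last_hidden_state[:, 0]
        return self.classifier(cls_token)  # raw logits; apply a sigmoid for multilabel probabilities
```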
## Intended uses & limitations
You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
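A hedged inference sketch follows. The checkpoint id comes from this card's model name; whether it loads directly through `AutoModelForImageClassification` depends on how the checkpoint was exported, and the 0.5 decision threshold is an assumption.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "lombardata/DinoVdeau-small-2024_08_31-batch-size32_epochs150_freeze"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("reef_photo.jpg")  # hypothetical underwater image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel: score each class with an independent sigmoid, not a softmax.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```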
## Training and evaluation data
Details on the number of images for each class are given in the following table:
Class | train | val | test | Total |
---|---|---|---|---|
Acropore_branched | 1469 | 464 | 475 | 2408 |
Acropore_digitised | 568 | 160 | 160 | 888 |
Acropore_sub_massive | 150 | 50 | 43 | 243 |
Acropore_tabular | 999 | 297 | 293 | 1589 |
Algae_assembly | 2546 | 847 | 845 | 4238 |
Algae_drawn_up | 367 | 126 | 127 | 620 |
Algae_limestone | 1652 | 557 | 563 | 2772 |
Algae_sodding | 3148 | 984 | 985 | 5117 |
Atra/Leucospilota | 1084 | 348 | 360 | 1792 |
Bleached_coral | 219 | 71 | 70 | 360 |
Blurred | 191 | 67 | 62 | 320 |
Dead_coral | 1979 | 642 | 643 | 3264 |
Fish | 2018 | 656 | 647 | 3321 |
Homo_sapiens | 161 | 62 | 59 | 282 |
Human_object | 157 | 58 | 55 | 270 |
Living_coral | 406 | 154 | 141 | 701 |
Millepore | 385 | 127 | 125 | 637 |
No_acropore_encrusting | 441 | 130 | 154 | 725 |
No_acropore_foliaceous | 204 | 36 | 46 | 286 |
No_acropore_massive | 1031 | 336 | 338 | 1705 |
No_acropore_solitary | 202 | 53 | 48 | 303 |
No_acropore_sub_massive | 1401 | 433 | 422 | 2256 |
Rock | 4489 | 1495 | 1473 | 7457 |
Rubble | 3092 | 1030 | 1001 | 5123 |
Sand | 5842 | 1939 | 1938 | 9719 |
Sea_cucumber | 1408 | 439 | 447 | 2294 |
Sea_urchins | 327 | 107 | 111 | 545 |
Sponge | 269 | 96 | 105 | 470 |
Syringodium_isoetifolium | 1212 | 392 | 391 | 1995 |
Thalassodendron_ciliatum | 782 | 261 | 260 | 1303 |
Useless | 579 | 193 | 193 | 965 |
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a minimal setup sketch follows the list):
- Number of Epochs: 150
- Learning Rate: 0.001
- Train Batch Size: 32
- Eval Batch Size: 32
- Optimizer: Adam
- LR Scheduler Type: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- Freeze Encoder: Yes
- Data Augmentation: Yes
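As a sketch of what this configuration means in PyTorch, assuming a `model` with a `backbone` attribute as in the head sketch above; `train_one_epoch` and `evaluate` are hypothetical helpers:

```python
import torch

# "Freeze Encoder: Yes": only the classification head is trained.
for p in model.backbone.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
# The learning rate is divided by 10 when validation loss stalls for 5 epochs.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

for epoch in range(150):
    train_one_epoch(model, optimizer)  # hypothetical helper
    val_loss = evaluate(model)         # hypothetical helper
    scheduler.step(val_loss)
```

This schedule matches the learning-rate column of the results table below, where the rate drops by a factor of ten around epochs 40, 116, and 141.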
### Data Augmentation
Data were augmented using the following transformations (an equivalent pipeline is sketched after the lists):
#### Train Transforms
- PreProcess: No additional parameters
- Resize: probability=1.00
- RandomHorizontalFlip: probability=0.25
- RandomVerticalFlip: probability=0.25
- ColorJiggle: probability=0.25
- RandomPerspective: probability=0.25
- Normalize: probability=1.00
#### Val Transforms
- PreProcess: No additional parameters
- Resize: probability=1.00
- Normalize: probability=1.00
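The transform names (notably `ColorJiggle`) match Kornia's augmentation API, so a roughly equivalent pipeline might look as follows. Only the operations and probabilities come from this card; the image size, jitter magnitudes, and ImageNet normalization statistics are assumptions.

```python
import torch
import torch.nn as nn
import kornia.augmentation as K

mean = torch.tensor([0.485, 0.456, 0.406])  # assumed ImageNet statistics
std = torch.tensor([0.229, 0.224, 0.225])

train_transforms = nn.Sequential(
    K.Resize((224, 224)),
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(brightness=0.2, contrast=0.2, saturation=0.2, hue=0.1, p=0.25),
    K.RandomPerspective(distortion_scale=0.5, p=0.25),
    K.Normalize(mean=mean, std=std),
)

val_transforms = nn.Sequential(
    K.Resize((224, 224)),
    K.Normalize(mean=mean, std=std),
)
```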
### Training results
Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
---|---|---|---|---|---|
1 | 0.19568666815757751 | 0.19057519057519057 | 0.7088941673264713 | 0.4058921954514261 | 0.001 |
2 | 0.17198018729686737 | 0.21933471933471935 | 0.738139514768845 | 0.4867943512801917 | 0.001 |
3 | 0.16209888458251953 | 0.23215523215523215 | 0.7578947368421052 | 0.5587016500092944 | 0.001 |
4 | 0.15948981046676636 | 0.22487872487872487 | 0.7463059684835497 | 0.5561953540051209 | 0.001 |
5 | 0.15691693127155304 | 0.23146223146223147 | 0.7510718113612004 | 0.5723046956548954 | 0.001 |
6 | 0.15302371978759766 | 0.2363132363132363 | 0.7634727923836142 | 0.5786669115862841 | 0.001 |
7 | 0.1523299366235733 | 0.23354123354123354 | 0.7651630269613162 | 0.5981729145672101 | 0.001 |
8 | 0.15311872959136963 | 0.24185724185724186 | 0.7655172413793103 | 0.587992292024695 | 0.001 |
9 | 0.14992575347423553 | 0.24012474012474014 | 0.7699542669773061 | 0.606908576330327 | 0.001 |
10 | 0.1509619951248169 | 0.24393624393624394 | 0.7606115107913669 | 0.5829080312220596 | 0.001 |
11 | 0.1520717293024063 | 0.2505197505197505 | 0.7689559002963221 | 0.5976223089766404 | 0.001 |
12 | 0.15027731657028198 | 0.2442827442827443 | 0.7759986516096409 | 0.607405900640871 | 0.001 |
13 | 0.1504218876361847 | 0.24393624393624394 | 0.7623558852444365 | 0.6003271512523337 | 0.001 |
14 | 0.1496724784374237 | 0.24462924462924462 | 0.7644358114073813 | 0.602811285040826 | 0.001 |
15 | 0.14749661087989807 | 0.2512127512127512 | 0.7751615281210703 | 0.6066013767027806 | 0.001 |
16 | 0.14998775720596313 | 0.24636174636174638 | 0.7645565108923241 | 0.5838354990739413 | 0.001 |
17 | 0.15297245979309082 | 0.24566874566874566 | 0.7719883641341547 | 0.6073459016890155 | 0.001 |
18 | 0.14907290041446686 | 0.24393624393624394 | 0.7751951282271207 | 0.614324753279198 | 0.001 |
19 | 0.14951026439666748 | 0.23458073458073458 | 0.7739734788726388 | 0.6075499214740471 | 0.001 |
20 | 0.14873762428760529 | 0.24532224532224534 | 0.7636993911381718 | 0.595638442008225 | 0.001 |
21 | 0.14705629646778107 | 0.24740124740124741 | 0.780452718426063 | 0.6164990545073296 | 0.001 |
22 | 0.1508719027042389 | 0.24532224532224534 | 0.7753641707130079 | 0.6073576225776433 | 0.001 |
23 | 0.15015815198421478 | 0.2428967428967429 | 0.771920553133395 | 0.6127152502703448 | 0.001 |
24 | 0.14965225756168365 | 0.24012474012474014 | 0.7698941591532732 | 0.5849380548549015 | 0.001 |
25 | 0.14702074229717255 | 0.24255024255024255 | 0.7761348897535668 | 0.6035289549510865 | 0.001 |
26 | 0.14808295667171478 | 0.24220374220374222 | 0.7751430907604253 | 0.6064603919289959 | 0.001 |
27 | 0.14581289887428284 | 0.24740124740124741 | 0.7689308343302761 | 0.6135774018658996 | 0.001 |
28 | 0.1453842669725418 | 0.24462924462924462 | 0.7751325049960902 | 0.6077297645661711 | 0.001 |
29 | 0.14941243827342987 | 0.24566874566874566 | 0.7735191637630662 | 0.6107922701154117 | 0.001 |
30 | 0.14549985527992249 | 0.24982674982674982 | 0.7705324709843182 | 0.5982833860845571 | 0.001 |
31 | 0.14541107416152954 | 0.2532917532917533 | 0.7784728768532008 | 0.6068619458731248 | 0.001 |
32 | 0.14657220244407654 | 0.24532224532224534 | 0.7746102833519939 | 0.6145316287096297 | 0.001 |
33 | 0.14459234476089478 | 0.253984753984754 | 0.777031154551008 | 0.6124691593400795 | 0.001 |
34 | 0.1468168944120407 | 0.24462924462924462 | 0.7781283769180896 | 0.6168054796129936 | 0.001 |
35 | 0.14858707785606384 | 0.2494802494802495 | 0.7766880749869814 | 0.6193343400891848 | 0.001 |
36 | 0.14637114107608795 | 0.24878724878724878 | 0.7718835224773468 | 0.6092667253949349 | 0.001 |
37 | 0.1448281705379486 | 0.24982674982674982 | 0.7733602776435442 | 0.6127183895875491 | 0.001 |
38 | 0.1450735628604889 | 0.25225225225225223 | 0.7814896880859042 | 0.6109962510638844 | 0.001 |
39 | 0.14469724893569946 | 0.24982674982674982 | 0.7824146207942057 | 0.6272196317832909 | 0.001 |
40 | 0.14824891090393066 | 0.25363825363825365 | 0.7836651178652115 | 0.6265963634718456 | 0.0001 |
41 | 0.14141727983951569 | 0.2616077616077616 | 0.7833456473553827 | 0.6323784470247855 | 0.0001 |
42 | 0.13979895412921906 | 0.26195426195426197 | 0.7884351407000686 | 0.6371841233046203 | 0.0001 |
43 | 0.14107641577720642 | 0.26403326403326405 | 0.7871061893724783 | 0.6366820358518588 | 0.0001 |
44 | 0.13898694515228271 | 0.26126126126126126 | 0.787878787878788 | 0.6256922069455233 | 0.0001 |
45 | 0.13859130442142487 | 0.2664587664587665 | 0.7894011202068074 | 0.6421056073559387 | 0.0001 |
46 | 0.139601469039917 | 0.2664587664587665 | 0.7873893327575039 | 0.6283048537279357 | 0.0001 |
47 | 0.13869330286979675 | 0.2636867636867637 | 0.7863567238757333 | 0.6286555138094179 | 0.0001 |
48 | 0.13777127861976624 | 0.26784476784476785 | 0.7913177234660741 | 0.6334934953582803 | 0.0001 |
49 | 0.1377096027135849 | 0.26403326403326405 | 0.7933989479042932 | 0.6381777921693204 | 0.0001 |
50 | 0.13755330443382263 | 0.2674982674982675 | 0.7918342891380639 | 0.6362718007605523 | 0.0001 |
51 | 0.13754987716674805 | 0.2661122661122661 | 0.7928808087673094 | 0.6426825970872383 | 0.0001 |
52 | 0.13771678507328033 | 0.26576576576576577 | 0.7871186146434616 | 0.6367912909960436 | 0.0001 |
53 | 0.13740690052509308 | 0.2692307692307692 | 0.7928592630284527 | 0.640555047060403 | 0.0001 |
54 | 0.1368684023618698 | 0.27165627165627165 | 0.7920979171140219 | 0.6412320555565514 | 0.0001 |
55 | 0.13703426718711853 | 0.2702702702702703 | 0.7914089347079037 | 0.6377616721633446 | 0.0001 |
56 | 0.1364637017250061 | 0.2643797643797644 | 0.7931107623128156 | 0.6425003998141597 | 0.0001 |
57 | 0.13675515353679657 | 0.2674982674982675 | 0.7926408585665006 | 0.6381793578718891 | 0.0001 |
58 | 0.1364695280790329 | 0.2674982674982675 | 0.791562634524322 | 0.637380953089336 | 0.0001 |
59 | 0.13641765713691711 | 0.2674982674982675 | 0.7922245108135942 | 0.6428884521567982 | 0.0001 |
60 | 0.13687649369239807 | 0.26507276507276506 | 0.7882888744307093 | 0.6357999016219877 | 0.0001 |
61 | 0.13638463616371155 | 0.2713097713097713 | 0.7945638702508654 | 0.6503848519713329 | 0.0001 |
62 | 0.13563227653503418 | 0.2751212751212751 | 0.7931640039405492 | 0.6441767594174573 | 0.0001 |
63 | 0.1355270892381668 | 0.27373527373527373 | 0.7966116124638174 | 0.6515952055035917 | 0.0001 |
64 | 0.13592010736465454 | 0.26784476784476785 | 0.7934075342465754 | 0.6450040026439422 | 0.0001 |
65 | 0.13569533824920654 | 0.27061677061677064 | 0.7936467053015668 | 0.64551501310817 | 0.0001 |
66 | 0.13565082848072052 | 0.2713097713097713 | 0.794643237940888 | 0.6477176853690674 | 0.0001 |
67 | 0.13533934950828552 | 0.27546777546777546 | 0.7965922095536813 | 0.6544361257862924 | 0.0001 |
68 | 0.1353396475315094 | 0.2733887733887734 | 0.7955772910907932 | 0.6519486064773884 | 0.0001 |
69 | 0.13474246859550476 | 0.26992376992376993 | 0.7966188524590164 | 0.6515714856354324 | 0.0001 |
70 | 0.13504748046398163 | 0.272002772002772 | 0.7944687795241776 | 0.6441608871918139 | 0.0001 |
71 | 0.13502468168735504 | 0.27234927234927236 | 0.7933057280883367 | 0.6441889860402124 | 0.0001 |
72 | 0.1344645917415619 | 0.2758142758142758 | 0.7969950486597234 | 0.6484748365424647 | 0.0001 |
73 | 0.1341526359319687 | 0.27616077616077617 | 0.7977006599957419 | 0.6518769914193778 | 0.0001 |
74 | 0.13499116897583008 | 0.2751212751212751 | 0.7914797229603171 | 0.641334935505441 | 0.0001 |
75 | 0.13461369276046753 | 0.2751212751212751 | 0.7946678133734681 | 0.6485229770180625 | 0.0001 |
76 | 0.13438266515731812 | 0.2758142758142758 | 0.7964594201659113 | 0.6478195810395848 | 0.0001 |
77 | 0.13460540771484375 | 0.27754677754677753 | 0.7977742853502102 | 0.6536737916153181 | 0.0001 |
78 | 0.13411369919776917 | 0.27754677754677753 | 0.7978169818504888 | 0.6543115985953537 | 0.0001 |
79 | 0.13399606943130493 | 0.2740817740817741 | 0.7953020134228188 | 0.6523004018612216 | 0.0001 |
80 | 0.1344238668680191 | 0.27823977823977825 | 0.7993085420355848 | 0.6545582038870168 | 0.0001 |
81 | 0.13405664265155792 | 0.2758142758142758 | 0.7966715529878418 | 0.6559691700651434 | 0.0001 |
82 | 0.13407430052757263 | 0.2765072765072765 | 0.7947541551246537 | 0.6453669674995801 | 0.0001 |
83 | 0.1350804716348648 | 0.2702702702702703 | 0.7924365020985678 | 0.645966570658811 | 0.0001 |
84 | 0.13387472927570343 | 0.27546777546777546 | 0.7957293542577825 | 0.6512285101875886 | 0.0001 |
85 | 0.13341927528381348 | 0.27927927927927926 | 0.7990622335890879 | 0.6531817491521362 | 0.0001 |
86 | 0.13337253034114838 | 0.2747747747747748 | 0.7988261313371896 | 0.6595866427349153 | 0.0001 |
87 | 0.1339845359325409 | 0.27442827442827444 | 0.7956179390619651 | 0.6467323251879672 | 0.0001 |
88 | 0.13357459008693695 | 0.2747747747747748 | 0.7981612326551459 | 0.648318545746826 | 0.0001 |
89 | 0.13366733491420746 | 0.2806652806652807 | 0.8014968675104065 | 0.6585340844298272 | 0.0001 |
90 | 0.1332736760377884 | 0.2772002772002772 | 0.8010798042854732 | 0.66211749340029 | 0.0001 |
91 | 0.13367226719856262 | 0.27823977823977825 | 0.7956933454403943 | 0.6528573832362276 | 0.0001 |
92 | 0.13348612189292908 | 0.27546777546777546 | 0.796086375587259 | 0.6513649424471982 | 0.0001 |
93 | 0.1330718696117401 | 0.2758142758142758 | 0.8001861094662043 | 0.6559763883082907 | 0.0001 |
94 | 0.13329002261161804 | 0.2758142758142758 | 0.7995090362720617 | 0.6553585917255438 | 0.0001 |
95 | 0.13314621150493622 | 0.2758142758142758 | 0.7979651162790697 | 0.6579543710907207 | 0.0001 |
96 | 0.13279949128627777 | 0.2751212751212751 | 0.7992523999660183 | 0.6556445954379041 | 0.0001 |
97 | 0.1332886964082718 | 0.27823977823977825 | 0.7977296181630549 | 0.6492741904723621 | 0.0001 |
98 | 0.13266970217227936 | 0.27546777546777546 | 0.799611141637432 | 0.6600105762308898 | 0.0001 |
99 | 0.13253149390220642 | 0.27165627165627165 | 0.7978809757764771 | 0.6589970862385839 | 0.0001 |
100 | 0.1329408884048462 | 0.27616077616077617 | 0.797143840330351 | 0.6570195655430786 | 0.0001 |
101 | 0.13274870812892914 | 0.28205128205128205 | 0.7991615690636095 | 0.657951499975745 | 0.0001 |
102 | 0.1326293796300888 | 0.2817047817047817 | 0.7986821274228745 | 0.654306822863844 | 0.0001 |
103 | 0.13247379660606384 | 0.2803187803187803 | 0.7993688968487486 | 0.6518495856500403 | 0.0001 |
104 | 0.13315415382385254 | 0.27754677754677753 | 0.8010850676047981 | 0.6612536009112525 | 0.0001 |
105 | 0.13218620419502258 | 0.2830907830907831 | 0.8012698412698412 | 0.6635718544409769 | 0.0001 |
106 | 0.13239973783493042 | 0.2830907830907831 | 0.800988243312319 | 0.6588128942023547 | 0.0001 |
107 | 0.13358280062675476 | 0.2785862785862786 | 0.7985513421389007 | 0.650564106362156 | 0.0001 |
108 | 0.13270235061645508 | 0.2796257796257796 | 0.7995554225623049 | 0.6501303094783896 | 0.0001 |
109 | 0.1318453699350357 | 0.2806652806652807 | 0.8000342553738118 | 0.6579556871315007 | 0.0001 |
110 | 0.13255637884140015 | 0.2803187803187803 | 0.7997274043785672 | 0.6582487839550253 | 0.0001 |
111 | 0.1319260448217392 | 0.2785862785862786 | 0.8012935069355799 | 0.6608614747058748 | 0.0001 |
112 | 0.13223350048065186 | 0.28101178101178104 | 0.8019278738426415 | 0.6595016342799644 | 0.0001 |
113 | 0.13213913142681122 | 0.27997227997227997 | 0.8024988392216453 | 0.6592029124671744 | 0.0001 |
114 | 0.13204564154148102 | 0.2823977823977824 | 0.8025030654094965 | 0.663088095209859 | 0.0001 |
115 | 0.1319342404603958 | 0.28378378378378377 | 0.8004266211604096 | 0.659797224924612 | 0.0001 |
116 | 0.13186337053775787 | 0.2844767844767845 | 0.8022295974810655 | 0.6627361818946377 | 1e-05 |
117 | 0.1317850947380066 | 0.28205128205128205 | 0.8012607547491268 | 0.6604165936303265 | 1e-05 |
118 | 0.13159342110157013 | 0.2796257796257796 | 0.8002395926924228 | 0.6590147410119703 | 1e-05 |
119 | 0.1319129317998886 | 0.28274428274428276 | 0.8036745185622182 | 0.6608406822787987 | 1e-05 |
120 | 0.13164088129997253 | 0.28135828135828134 | 0.803593372600534 | 0.6614581971670047 | 1e-05 |
121 | 0.13184630870819092 | 0.28101178101178104 | 0.8012604863092451 | 0.6610641151618838 | 1e-05 |
122 | 0.13215216994285583 | 0.2817047817047817 | 0.8049611099432415 | 0.6647378818356079 | 1e-05 |
123 | 0.13187836110591888 | 0.2817047817047817 | 0.8010107932156931 | 0.6604978306251739 | 1e-05 |
124 | 0.13141389191150665 | 0.2806652806652807 | 0.8018739352640545 | 0.6621515776947642 | 1e-05 |
125 | 0.13139639794826508 | 0.2862092862092862 | 0.804345987993574 | 0.6640721616133445 | 1e-05 |
126 | 0.13103623688220978 | 0.2862092862092862 | 0.804212663367593 | 0.663003919720051 | 1e-05 |
127 | 0.13152988255023956 | 0.28586278586278585 | 0.8038346213944846 | 0.6597731906072118 | 1e-05 |
128 | 0.13113313913345337 | 0.2869022869022869 | 0.8042412977357216 | 0.668197478893632 | 1e-05 |
129 | 0.13096605241298676 | 0.28274428274428276 | 0.8034694309287074 | 0.6652814888251478 | 1e-05 |
130 | 0.1310083270072937 | 0.28655578655578656 | 0.8034491503931017 | 0.6657375892895663 | 1e-05 |
131 | 0.13133247196674347 | 0.2834372834372834 | 0.8052362171687506 | 0.6709132204127336 | 1e-05 |
132 | 0.13149647414684296 | 0.2806652806652807 | 0.7985562048814026 | 0.6557913726655867 | 1e-05 |
133 | 0.1311328113079071 | 0.28794178794178793 | 0.8051816958277256 | 0.6689392948255155 | 1e-05 |
134 | 0.1308571696281433 | 0.28274428274428276 | 0.802060714437774 | 0.6648386499372343 | 1e-05 |
135 | 0.13148072361946106 | 0.2869022869022869 | 0.8038277511961722 | 0.6684163123065296 | 1e-05 |
136 | 0.13150115311145782 | 0.28274428274428276 | 0.8024591213764248 | 0.659009971789042 | 1e-05 |
137 | 0.1310679018497467 | 0.28586278586278585 | 0.8035592643051771 | 0.6666808903899752 | 1e-05 |
138 | 0.13124705851078033 | 0.2844767844767845 | 0.8035426731078905 | 0.6665598962110765 | 1e-05 |
139 | 0.13104070723056793 | 0.28967428967428965 | 0.8052538519828238 | 0.6661043989752415 | 1e-05 |
140 | 0.13169734179973602 | 0.2834372834372834 | 0.8020416843896214 | 0.663466069531375 | 1e-05 |
141 | 0.13089434802532196 | 0.2875952875952876 | 0.8046521463311481 | 0.6687691213000826 | 1e-06 |
142 | 0.13103386759757996 | 0.28586278586278585 | 0.8041640110473762 | 0.6642894279153319 | 1e-06 |
143 | 0.13144278526306152 | 0.2872487872487873 | 0.8019270122783083 | 0.6623287859816251 | 1e-06 |
144 | 0.1311902105808258 | 0.28378378378378377 | 0.8024974515800204 | 0.6647534218687892 | 1e-06 |
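A note on reading the table: in multilabel classification, accuracy is usually the exact-match (subset) accuracy, where all 31 labels of an image must be predicted correctly at once, which is why it sits far below the F1 scores. A sketch of how such metrics can be computed with scikit-learn; the 0.5 threshold and the macro averaging for ROC AUC are assumptions:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def multilabel_metrics(y_true: np.ndarray, y_prob: np.ndarray, threshold: float = 0.5):
    """y_true: binary matrix (n_samples, n_labels); y_prob: sigmoid scores of the same shape."""
    y_pred = (y_prob >= threshold).astype(int)
    return {
        "accuracy": accuracy_score(y_true, y_pred),  # exact-match (subset) accuracy
        "f1_micro": f1_score(y_true, y_pred, average="micro"),
        "f1_macro": f1_score(y_true, y_pred, average="macro"),
        "roc_auc": roc_auc_score(y_true, y_prob, average="macro"),
    }
```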
## CO2 Emissions
The estimated CO2 emissions for training this model are documented below (a brief tracking sketch follows the list):
- Emissions: approximately 1.63 grams of CO2
- Source: Code Carbon
- Training Type: fine-tuning
- Geographical Location: Brest, France
- Hardware Used: NVIDIA Tesla V100 PCIe 32 GB
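Code Carbon measurements typically come from wrapping the training run in an `EmissionsTracker`; a minimal sketch, where `train()` is a hypothetical placeholder for the training entry point:

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()
tracker.start()
try:
    train()  # hypothetical training entry point
finally:
    emissions_kg = tracker.stop()  # emissions in kg of CO2-equivalent

print(f"Emissions: {emissions_kg * 1000:.2f} g CO2eq")
```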
## Framework Versions
- Transformers: 4.41.1
- Pytorch: 2.3.0+cu121
- Datasets: 2.19.1
- Tokenizers: 0.19.1