hackelle committed on
Commit 9b89b7f
1 Parent(s): 4c9f942

Update README.md

Files changed (1)
  1. README.md +10 -10
README.md CHANGED
@@ -6,22 +6,22 @@ colorTo: green
  sdk: static
  pinned: false
  license: mit
- short_description: Official Repository of Pretrained Models on BigEarthNet v2.0
+ short_description: Repository of Pretrained Model Weights on BigEarthNet v2.0
  ---
 
  [TU Berlin](https://www.tu.berlin/) | [RSiM](https://rsim.berlin/) | [DIMA](https://www.dima.tu-berlin.de/menue/database_systems_and_information_management_group/) | [BigEarth](http://www.bigearth.eu/) | [BIFOLD](https://bifold.berlin/)
  :---:|:---:|:---:|:---:|:---:
  <a href="https://www.tu.berlin/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/tu-berlin-logo-long-red.svg" style="font-size: 1rem; height: 2em; width: auto" alt="TU Berlin Logo"/> | <a href="https://rsim.berlin/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/RSiM_Logo_1.png" style="font-size: 1rem; height: 2em; width: auto" alt="RSiM Logo"> | <a href="https://www.dima.tu-berlin.de/menue/database_systems_and_information_management_group/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/DIMA.png" style="font-size: 1rem; height: 2em; width: auto" alt="DIMA Logo"> | <a href="http://www.bigearth.eu/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/BigEarth.png" style="font-size: 1rem; height: 2em; width: auto" alt="BigEarth Logo"> | <a href="https://bifold.berlin/"><img src="https://raw.githubusercontent.com/wiki/lhackel-tub/ConfigILM/static/imgs/BIFOLD_Logo_farbig.png" style="font-size: 1rem; height: 2em; width: auto; margin-right: 1em" alt="BIFOLD Logo">
 
- # BigEarthNet v2.0 Pretrained Weights
- We provide pretrained weights for several different models.
- The weights for the best-performing model, based on the Macro Average Precision score on the recommended test split, have been uploaded.
- All models have been trained using: i) Sentinel-1 data only (S1), ii) Sentinel-2 data only (S2), or iii) both Sentinel-1 and Sentinel-2 (S1+S2) modalities together.
+ # BigEarthNet v2.0 Pretrained Model Weights
+ We provide weights for several different pretrained models.
+ The model weights for the best-performing model, based on the macro average precision score on the recommended test split, have been uploaded.
+ All models have been trained using: i) BigEarthNet-S1 data only (S1), ii) BigEarthNet-S2 data only (S2), or iii) both BigEarthNet-S1 and -S2 (S1+S2) together.
 
  The following bands were used to train the models:
- - For models using Sentinel-1 only: Sentinel-1 bands `["VH", "VV"]`
- - For models using Sentinel-2 only: Sentinel-2 10m bands and 20m bands `["B02", "B03", "B04", "B08", "B05", "B06", "B07", "B11", "B12", "B8A"]`
- - For models using Sentinel-1 and Sentinel-2: Sentinel-2 10m bands and 20m bands and Sentinel-1 bands = `["B02", "B03", "B04", "B08", "B05", "B06", "B07", "B11", "B12", "B8A", "VH", "VV"]`
+ - For models using BigEarthNet-S1 only: Sentinel-1 bands `["VH", "VV"]`
+ - For models using BigEarthNet-S2 only: Sentinel-2 10m bands and 20m bands `["B02", "B03", "B04", "B08", "B05", "B06", "B07", "B11", "B12", "B8A"]`
+ - For models using BigEarthNet-S1 and -S2: Sentinel-2 10m bands and 20m bands and Sentinel-1 bands = `["B02", "B03", "B04", "B08", "B05", "B06", "B07", "B11", "B12", "B8A", "VH", "VV"]`
 
 
  The multi-hot encoded output of the model indicates the predicted multi-label output.
@@ -38,7 +38,7 @@ The multi-hot encoded output relates to the following class labels sorted in alp
 
  ## Links
 
- | Model | Equivalent [`timm`](https://huggingface.co/docs/timm/en/index) model name | Sentinel-1 only | Sentinel-2 only | Sentinel-1 and Sentinel-2 |
+ | Model | Equivalent [`timm`](https://huggingface.co/docs/timm/en/index) model name | S1 only | S2 only | S1+S2 |
  |:-----------------|:---------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------:|
  | ConvMixer-768/32 | `convmixer_768_32` | [ConvMixer-768/32 S1](https://huggingface.co/BIFOLD-BigEarthNetv2-0/convmixer_768_32-s1-v0.1.1) | [ConvMixer-768/32 S2](https://huggingface.co/BIFOLD-BigEarthNetv2-0/convmixer_768_32-s2-v0.1.1) | [ConvMixer-768/32 S1+S2](https://huggingface.co/BIFOLD-BigEarthNetv2-0/convmixer_768_32-all-v0.1.1) |
  | ConvNext v2 Base | `convnextv2_base` | [ConvNext v2 Base S1](https://huggingface.co/BIFOLD-BigEarthNetv2-0/convnextv2_base-s1-v0.1.1) | [ConvNext v2 Base S2](https://huggingface.co/BIFOLD-BigEarthNetv2-0/convnextv2_base-s2-v0.1.1) | [ConvNext v2 Base S1+S2](https://huggingface.co/BIFOLD-BigEarthNetv2-0/convnextv2_base-all-v0.1.1) |
@@ -54,7 +54,7 @@ The multi-hot encoded output relates to the following class labels sorted in alp
  ## Usage
 
  To use the model, download the codes that define the model architecture from the
- [official BigEarthNet v2.0 (reBEN) repository](https://git.tu-berlin.de/rsim/reben-training-scripts) and load the model
+ [official BigEarthNet v2.0 (reBEN) repository](https://git.tu-berlin.de/rsim/reben-training-scripts) and load the model with the corresponding weights
  using the code below. Note that [`configilm`](https://pypi.org/project/configilm/) is a requirement to use the
  code below.
 
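
The `Usage` code that the README points to is cut off by this diff (the hunk ends at line 60); the actual loading snippet ships with the reBEN training scripts and requires `configilm`. As a rough, non-authoritative sketch of the two details the changed text does state, namely the 12-band S1+S2 input order and the multi-hot output, the example below builds the timm-equivalent architecture named in the Links table with randomly initialised weights and thresholds its sigmoid outputs. The class count of 19 and the 120x120 patch size are assumptions, not taken from this diff.

```python
# Illustrative sketch only -- not the repository's official loading code.
# Assumptions (not from this diff): 19 output classes, 120x120 input patches.
import timm
import torch

NUM_CLASSES = 19  # assumed BigEarthNet 19-class nomenclature

# S1+S2 models expect the 12 bands in the order given in the README:
# ["B02", "B03", "B04", "B08", "B05", "B06", "B07", "B11", "B12", "B8A", "VH", "VV"]
model = timm.create_model("convmixer_768_32", in_chans=12, num_classes=NUM_CLASSES)
model.eval()  # random weights here; the pretrained checkpoints are linked above

x = torch.randn(1, 12, 120, 120)  # one dummy S1+S2 patch (batch, bands, H, W)

with torch.no_grad():
    logits = model(x)                # shape: (1, NUM_CLASSES)
    probs = torch.sigmoid(logits)    # independent per-class probabilities
    multi_hot = (probs > 0.5).int()  # 1 = label predicted present, 0 = absent
print(multi_hot)
```

For real inference, the checkpoints linked in the table should be loaded through the reBEN repository's own model definitions together with `configilm`, as the README instructs; the timm stand-in above only illustrates the input band layout and how the multi-hot prediction is read off the output.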