carlosgomes98 committed • Commit d3758e2 • Parent(s): 3cd7cc3

Add link to paper and citation
README.md
CHANGED
---
license: apache-2.0
---

# Model Card for granite-geospatial-biomass

<p align="center" width="100%">
<img src="https://github.com/ibm-granite/granite-geospatial-biomass/blob/main/biomass-image.jpeg?raw=true" width="600">
</p>

The granite-geospatial-biomass model is a fine-tuned geospatial foundation model for predicting total above-ground biomass (i.e., the living and dead plant material on the Earth's surface) from optical satellite imagery.
Above-ground biomass is an important component of the carbon cycle and is crucial for estimating crop yields, monitoring forest timber production, and quantifying the carbon sequestered by nature-based actions.

The model predicts above-ground biomass from Harmonized Landsat and Sentinel-2 (HLS) L30 optical satellite imagery and is fine-tuned using training labels from the
Global Ecosystem Dynamics Investigation (GEDI) L4A product. Uniquely, the model has been fine-tuned using HLS and GEDI data collected from 15 biomes across the globe.
Please see the [Model Description](#model-description) below for more details.

## How to Get Started with the Model

This model was trained using [Terratorch](https://github.com/IBM/terratorch).

We make available both the model weights and the configuration file that defines the model.

You can use it easily with Terratorch:

```python
from huggingface_hub import hf_hub_download
from terratorch.cli_tools import LightningInferenceModel

ckpt_path = hf_hub_download(repo_id="ibm-granite/granite-geospatial-biomass", filename="biomass_model.ckpt")
config_path = hf_hub_download(repo_id="ibm-granite/granite-geospatial-biomass", filename="config.yaml")

model = LightningInferenceModel.from_config(config_path, ckpt_path)

inference_results, input_file_names = model.inference_on_dir(<input_directory>)
```

For more details, check out the [Getting Started Notebook](https://github.com/ibm-granite/granite-geospatial-biomass/blob/main/notebooks/agb_getting_started.ipynb), which guides the user through three experiments:

1. Zero-shot for all biomes
2. Zero-shot for a single biome
3. Few-shot for a single biome

## Model Description

The granite-geospatial-biomass model is a geospatial foundation model that has been fine-tuned using HLS and GEDI data to perform pixel-wise regression.

The base foundation model from which granite-geospatial-biomass is fine-tuned is similar to the one described in this [paper](https://arxiv.org/abs/2310.18660),
with the exception that the backbone is a Swin-B transformer. We opted for the Swin-B backbone instead of the ViT used in the original paper because Swin-B provides the following advantages:
- a smaller starting patch size, which provides a higher effective resolution
- windowed attention, which provides better computational efficiency
- hierarchical merging, which provides a useful inductive bias

The base foundation model was pretrained using SimMIM, a self-supervised learning strategy in which large parts of the input HLS data are masked and then reconstructed by the model. A small decoder composed of a single convolutional layer and a Pixel Shuffle module was added to the Swin-B backbone for the (pretraining) reconstruction task.

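The patch-wise masking idea behind this pretraining can be sketched as follows. This is a simplified illustration with an arbitrary patch size and mask ratio, not the actual pretraining code; zeroing the masked pixels stands in for the learned mask token used in practice:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_patch_mask(img: np.ndarray, patch: int = 4, mask_ratio: float = 0.6):
    """Mask whole patch x patch blocks of a (C, H, W) image, SimMIM-style.

    Returns the masked image and the boolean pixel mask (True = masked)."""
    c, h, w = img.shape
    gh, gw = h // patch, w // patch
    n_masked = int(gh * gw * mask_ratio)
    flat = np.zeros(gh * gw, dtype=bool)
    flat[rng.choice(gh * gw, size=n_masked, replace=False)] = True
    # Upscale the patch-grid mask to the full pixel grid
    pixel_mask = np.kron(flat.reshape(gh, gw), np.ones((patch, patch), dtype=bool))
    out = img.copy()
    out[:, pixel_mask] = 0.0  # in practice, a learned mask token replaces these pixels
    return out, pixel_mask

img = rng.uniform(size=(6, 32, 32)).astype(np.float32)  # 6 spectral bands
masked_img, mask = random_patch_mask(img)
print(mask.mean())  # fraction of masked pixels, close to mask_ratio
```

The model only ever sees the masked image during pretraining and is trained to reconstruct the hidden patches, which forces the backbone to learn spatial and spectral structure.
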
For fine-tuning, we replaced the small decoder with a UPerNet adapted for pixel-wise regression. We opted for the UPerNet because it fuses features across transformer stages, a similar intuition to the U-Net, which is consistently considered state-of-the-art for regression tasks with Earth observation data. As the standard UPerNet implementation with the Swin-B backbone predicts a final feature map 4x smaller than the input, we appended two Pixel Shuffle layers to learn the upscaling. More details on the fine-tuned model can be found in this [paper](https://arxiv.org/abs/2406.19888).

+
## Model Releases (along with the branch name where the models are stored):
|
59 |
+
|
60 |
+
- **tag v1 —** - 28/07/2024
|
61 |
+
|
62 |
+
- Stay tuned for more models!
|
63 |
+
|
64 |
+
### Model Sources
|
65 |
+
|
66 |
+
- **Repository:** https://github.com/ibm-granite/granite-geospatial-biomass/
|
67 |
+
- **Paper (biomass):** https://arxiv.org/abs/2406.19888
|
68 |
+
- **Paper (foundation model):** https://arxiv.org/abs/2310.18660
|
69 |
+
|
70 |
+
### External Blogs
|
71 |
+
- https://research.ibm.com/blog/img-geospatial-studio-think
|
72 |
+
|
73 |
+
## Training Data
|
74 |
+
|
75 |
+
The model was trained on a collection of datasets provided by NASA:
|
76 |
+
- Harmonized Landsat-Sentinel 2 (HLS) L30: https://lpdaac.usgs.gov/products/hlss30v002/
|
77 |
+
- Global Ecosystem Dynamics Investigation (GEDI) L4A: https://doi.org/10.3334/ORNLDAAC/1907
|
78 |
+
|
79 |
+
For training and testing, the model requires a cloud-free snapshot of an area where all pixels are representative of the spectral bands for that location. The approach we used to create the cloud free images was to acquire HLS data during the leaf-on season for each hemisphere, analyze the timeseries, and select pixels that are not contaminated with clouds. We compute the mean value of each cloud-free pixel during the leaf-on season for each spectral band which is then assembled into a composite image representative for that area. The corresponding GEDI L4A biomass data obtained made during the same leaf-on season are interpolated to the HLS grid (CRS:4326) such that the measured biomass points are aligned with HLS data. GEDI data is spatially and temporaly sparse so pixels with no corresponding GEDI measurement are filled with a no data value.
|
80 |
+
|
81 |
+
|
82 |
+
|
83 |
+
## Citation [optional]
|
84 |
+
Kindly cite the following paper, if you intend to use our model or its associated architectures/approaches in your
|
85 |
+
work
|
86 |
+
|
87 |
+
**BibTeX:**
|
88 |
+
|
89 |
+
```
|
90 |
+
@misc{muszynski2024finetuninggeospatialfoundationmodels,
|
91 |
+
title={Fine-tuning of Geospatial Foundation Models for Aboveground Biomass Estimation},
|
92 |
+
author={Michal Muszynski and Levente Klein and Ademir Ferreira da Silva and Anjani Prasad Atluri and Carlos Gomes and Daniela Szwarcman and Gurkanwar Singh and Kewen Gu and Maciel Zortea and Naomi Simumba and Paolo Fraccaro and Shraddha Singh and Steve Meliksetian and Campbell Watson and Daiki Kimura and Harini Srinivasan},
|
93 |
+
year={2024},
|
94 |
+
url={https://arxiv.org/abs/2406.19888},
|
95 |
+
}
|
96 |
+
```
|
97 |
+
|
98 |
+
**APA:**
|
99 |
+
|
100 |
+
```
|
101 |
+
Muszynski, M., Klein, L., da Silva, A. F., Atluri, A. P., Gomes, C., Szwarcman, D., Singh, G., Gu, K.,
|
102 |
+
Zortea, M., Simumba, N., Fraccaro, P., Singh, S., Meliksetian, S., Watson, C., Kimura, D., & Srinivasan, H. (2024).
|
103 |
+
Fine-tuning of geospatial foundation models for aboveground biomass estimation. arXiv. https://arxiv.org/abs/2406.19888
|
104 |
+
```
|
105 |
+
|
106 |
+
## Model Card Authors
|
107 |
+
|
108 |
+
Julian Kuehnert, Levente Klein, Catherine Wanjiru, Carlos Gomes and Campbell Watson
|
109 |
+
|
110 |
+
|
111 |
+
## IBM Public Repository Disclosure:
|
112 |
+
|
113 |
+
All content in this repository including code has been provided by IBM under the associated
|
114 |
+
open source software license and IBM is under no obligation to provide enhancements,
|
115 |
+
updates, or support. IBM developers produced this code as an
|
116 |
+
open source project (not as an IBM product), and IBM makes no assertions as to
|
117 |
+
the level of quality nor security, and will not be maintaining this code going forward.
|