---
license: apache-2.0
language:
- en
tags:
- Pytorch
- mmsegmentation
- segmentation
- Crop Classification
- Multi Temporal
- Geospatial
- Foundation model
datasets:
- ibm-nasa-geospatial/hls_burn_scars
metrics:
- accuracy
- IoU
---
### Model and Inputs
The pretrained [Prithvi-100m](https://huggingface.co/ibm-nasa-geospatial/Prithvi-100M/blob/main/README.md) model is finetuned to classify crops and other land cover types based on HLS data and CDL labels from the [HLS Burn Scar Scenes dataset](https://huggingface.co/datasets/ibm-nasa-geospatial/hls_burn_scars). 
This dataset provides input chips of 224x224x18, where 224 is the height and width and 18 is the channel count: 6 spectral bands for each of 3 time steps. 
The bands are:
 
1. Blue
2. Green
3. Red
4. Narrow NIR
5. SWIR 1
6. SWIR 2
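
The chip layout above can be sketched in a few lines of NumPy. This is an illustration of the expected shape only; the random arrays stand in for real HLS bands and are not part of the actual data pipeline:

```python
import numpy as np

bands_per_step = 6   # Blue, Green, Red, Narrow NIR, SWIR 1, SWIR 2
time_steps = 3
height = width = 224

# One (H, W, 6) array per time step; random data stands in for real HLS bands.
steps = [np.random.rand(height, width, bands_per_step) for _ in range(time_steps)]

# Concatenate along the channel axis: time step 1's six bands first, then 2, then 3,
# yielding the 224x224x18 chip described above.
chip = np.concatenate(steps, axis=-1)
print(chip.shape)  # (224, 224, 18)
```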

While the Prithvi-100m was pretrained with 3 time steps, this task utilizes the multi-temporal input capability adapted from the pretrained foundation model, which provides more generalized representations across time.

### Code
Code for finetuning is available on [GitHub](https://github.com/NASA-IMPACT/hls-foundation-os/tree/main/fine-tuning-examples).

The configuration used for finetuning is available through [config](https://github.com/NASA-IMPACT/hls-foundation-os/blob/main/fine-tuning-examples/configs/firescars_config.py).

### Results
Running the mmseg stack for 80 epochs with the above config led to an IoU of **0.72** on the burn scar class and **0.96** overall accuracy. This yields a reasonably good model, and further development will most likely improve performance.

### Inference and demo
There is an inference script that allows running the hls-cdl crop classification model on HLS images. These inputs have to be in GeoTIFF format, including 18 bands for 3 time steps, where each time step contains the six bands described above (Blue, Green, Red, Narrow NIR, SWIR 1, SWIR 2) in order. There is also a **demo** that leverages the same code **[here](https://huggingface.co/spaces/ibm-nasa-geospatial/Prithvi-100M-Burn-scars-demo)**.
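
As a hedged sketch of how such an input decomposes, the 18 bands of a scene (a GeoTIFF reader such as rasterio returns a `(bands, height, width)` array) can be split back into its 3 time steps of 6 bands each. The random array below is a stand-in for a real scene, not actual data:

```python
import numpy as np

# Stand-in for a GeoTIFF reader's output: an 18-band scene as (bands, H, W).
scene = np.random.rand(18, 224, 224)

# Split into 3 time steps of 6 bands each; band order within each step is
# Blue, Green, Red, Narrow NIR, SWIR 1, SWIR 2.
steps = np.split(scene, 3, axis=0)
print(len(steps), steps[0].shape)  # 3 (6, 224, 224)
```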