---
license: cc-by-nc-sa-4.0
tags:
- weather-forecasting
- climate
language:
- en
pretty_name: DeepMind GraphCast Small
pipeline_tag: graph-ml
---
# DeepMind GraphCast Small

This repo contains the weights for `GraphCast_small`, a smaller, low-resolution version of GraphCast (1° resolution, 13 pressure levels, and a smaller mesh) trained on ERA5 data from 1979 to 2015. It is useful for running the model under lower memory and compute constraints.

- Original repo: https://github.com/google-deepmind/graphcast
- Original files are from this Google Cloud Bucket: https://console.cloud.google.com/storage/browser/dm_graphcast
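If you want to see which checkpoint files are hosted in this repository, `huggingface_hub` can list them directly. This is a small illustrative snippet; the repo ID is the same one used in the usage example below.

```python
from huggingface_hub import list_repo_files

# Print every file stored in this model repository on the Hub.
for filename in list_repo_files("shermansiu/dm_graphcast_small"):
    print(filename)
```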

## License and Attribution
The model weights are released by Google DeepMind.

The model weights are made available for use under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license (CC BY-NC-SA 4.0). You may obtain a copy of the license at: https://creativecommons.org/licenses/by-nc-sa/4.0/.

## Usage

You can load the model like so:

```python
from graphcast import checkpoint
from graphcast import graphcast
from huggingface_hub import hf_hub_download

REPO_ID = "shermansiu/dm_graphcast_small"
FILENAME = "GraphCast_small - ERA5 1979-2015 - resolution 1.0 - pressure levels 13 - mesh 2to5 - precipitation input and output.npz"

# Download the checkpoint from the Hub and deserialize it into the
# graphcast.CheckPoint dataclass.
with open(hf_hub_download(repo_id=REPO_ID, filename=FILENAME), "rb") as f:
    ckpt = checkpoint.load(f, graphcast.CheckPoint)

params = ckpt.params
state = {}  # The checkpoint stores no Haiku state, so start from an empty dict.

model_config = ckpt.model_config
task_config = ckpt.task_config
```
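Once the checkpoint is loaded, the upstream demo notebook wraps the bare predictor with bfloat16 casting, input/residual normalization, and autoregressive rollout before transforming it with Haiku. Below is a minimal sketch of that wiring, assuming the normalization statistics (`diffs_stddev_by_level.nc`, `mean_by_level.nc`, `stddev_by_level.nc`) have already been downloaded locally, e.g. from the `stats/` folder of the original Google Cloud bucket; see the demo notebook for the full, working version.

```python
import haiku as hk
import xarray

from graphcast import autoregressive, casting, graphcast, normalization

# Normalization statistics (assumed to be downloaded separately, e.g. from the
# stats/ folder of the original Google Cloud bucket).
diffs_stddev_by_level = xarray.load_dataset("stats/diffs_stddev_by_level.nc")
mean_by_level = xarray.load_dataset("stats/mean_by_level.nc")
stddev_by_level = xarray.load_dataset("stats/stddev_by_level.nc")


def construct_wrapped_graphcast(model_config, task_config):
    # Bare one-step GraphCast predictor.
    predictor = graphcast.GraphCast(model_config, task_config)
    # Run the model in bfloat16 while keeping inputs/outputs in float32.
    predictor = casting.Bfloat16Cast(predictor)
    # Normalize inputs and predict per-level residuals, as the weights expect.
    predictor = normalization.InputsAndResiduals(
        predictor,
        diffs_stddev_by_level=diffs_stddev_by_level,
        mean_by_level=mean_by_level,
        stddev_by_level=stddev_by_level,
    )
    # Roll the one-step model out autoregressively over the target lead times.
    return autoregressive.Predictor(predictor, gradient_checkpointing=True)


@hk.transform_with_state
def run_forward(inputs, targets_template, forcings):
    predictor = construct_wrapped_graphcast(model_config, task_config)
    return predictor(inputs, targets_template=targets_template, forcings=forcings)

# To produce forecasts, the demo jits run_forward.apply (with params and state
# from the checkpoint) and drives it with rollout.chunked_prediction, using
# inputs, targets_template and forcings built via graphcast.data_utils from an
# ERA5 example batch.
```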

For more details, check out https://github.com/shermansiu/graphcast/blob/main/graphcast_demo_hf.ipynb

## Citation
- Paper: https://www.science.org/doi/10.1126/science.adi2336
- Preprint: https://arxiv.org/abs/2212.12794

```bibtex
@article{doi:10.1126/science.adi2336,
author = {Remi Lam and Alvaro Sanchez-Gonzalez and Matthew Willson and Peter Wirnsberger and Meire Fortunato and Ferran Alet and Suman Ravuri and Timo Ewalds and Zach Eaton-Rosen and Weihua Hu and Alexander Merose and Stephan Hoyer and George Holland and Oriol Vinyals and Jacklynn Stott and Alexander Pritzel and Shakir Mohamed and Peter Battaglia},
title = {Learning skillful medium-range global weather forecasting},
journal = {Science},
volume = {382},
number = {6677},
pages = {1416-1421},
year = {2023},
doi = {10.1126/science.adi2336},
URL = {https://www.science.org/doi/abs/10.1126/science.adi2336},
eprint = {https://www.science.org/doi/pdf/10.1126/science.adi2336},
abstract = {Global medium-range weather forecasting is critical to decision-making across many social and economic domains. Traditional numerical weather prediction uses increased compute resources to improve forecast accuracy but does not directly use historical weather data to improve the underlying model. Here, we introduce GraphCast, a machine learning–based method trained directly from reanalysis data. It predicts hundreds of weather variables for the next 10 days at 0.25° resolution globally in under 1 minute. GraphCast significantly outperforms the most accurate operational deterministic systems on 90\% of 1380 verification targets, and its forecasts support better severe event prediction, including tropical cyclone tracking, atmospheric rivers, and extreme temperatures. GraphCast is a key advance in accurate and efficient weather forecasting and helps realize the promise of machine learning for modeling complex dynamical systems. The numerical models used to predict weather are large, complex, and computationally demanding and do not learn from past weather patterns. Lam et al. introduced a machine learning–based method that has been trained directly from reanalysis data of past atmospheric conditions. In this way, the authors were able to quickly predict hundreds of weather variables globally up to 10 days in advance and at high resolution. Their predictions were more accurate than those of traditional weather models in 90\% of tested cases and displayed better severe event prediction for tropical cyclones, atmospheric rivers, and extreme temperatures. —H. Jesse Smith Machine learning leads to better, faster, and cheaper weather forecasting.}}
```