Update README.md
README.md CHANGED
@@ -11,8 +11,9 @@ license: cdla-permissive-2.0
 TinyTimeMixers (TTMs) are compact pre-trained models for Multivariate Time-Series Forecasting, open-sourced by IBM Research.
 **With less than 1 Million parameters, TTM introduces the notion of the first-ever “tiny” pre-trained models for Time-Series Forecasting.**

-TTM outperforms several popular benchmarks demanding billions of parameters in zero-shot and few-shot forecasting.
-
+TTM outperforms several popular benchmarks demanding billions of parameters in zero-shot and few-shot forecasting. TTMs are lightweight
+forecasters, pre-trained on publicly available time series data with various augmentations. TTM provides state-of-the-art zero-shot forecasts and can easily be
+fine-tuned for multi-variate forecasts with just 5% of the training data to be competitive. Refer to our [paper](https://arxiv.org/pdf/2401.03955.pdf) for more details.

 **The current open-source version supports point forecasting use-cases ranging from minutely to hourly resolutions
 (Ex. 10 min, 15 min, 1 hour, etc.)**
@@ -23,7 +24,7 @@ can be easily fine-tuned on your multi-variate target data. Refer to our [paper]
 ## Benchmark Highlights:

 - TTM (with less than 1 Million parameters) outperforms the following popular Pre-trained SOTAs demanding several hundred Million to Billions of parameters [paper](https://arxiv.org/pdf/2401.03955.pdf):
-  - *GPT4TS (NeurIPS 23) by 7-12% in few-shot forecasting
+  - *GPT4TS (NeurIPS 23) by 7-12% in few-shot forecasting*
   - *LLMTime (NeurIPS 23) by 24% in zero-shot forecasting*.
   - *SimMTM (NeurIPS 23) by 17% in few-shot forecasting*.
   - *Time-LLM (ICLR 24) by 2-8% in few-shot forecasting*
@@ -57,10 +58,12 @@ getting started [notebook](https://github.com/IBM/tsfm/blob/main/notebooks/hfdem
 ## Model Releases (along with the branch name where the models are stored):

 - **512-96:** Given the last 512 time-points (i.e. context length), this model can forecast up to next 96 time-points (i.e. forecast length)
-  in future.
+  in future. This model is targeted towards a forecasting setting of context length 512 and forecast length 96 and
+  recommended for hourly and minutely resolutions (Ex. 10 min, 15 min, 1 hour, etc). (branch name: main)

 - **1024-96:** Given the last 1024 time-points (i.e. context length), this model can forecast up to next 96 time-points (i.e. forecast length)
-  in future.
+  in future. This model is targeted towards a long forecasting setting of context length 1024 and forecast length 96 and
+  recommended for hourly and minutely resolutions (Ex. 10 min, 15 min, 1 hour, etc). (branch name: 1024-96-v1)

 - Stay tuned for more models !
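Usage sketch (not part of this commit): the snippet below shows how the two releases named in the diff could be loaded from their respective branches via the standard `revision` argument of `from_pretrained`, and how a zero-shot point forecast would be requested. The import path, the `TinyTimeMixerForPrediction` class name, the repo id `ibm/TTM`, and the tensor shapes are assumptions rather than something this diff defines; the getting started notebook linked in the card is the authoritative reference.

```python
# Hedged sketch: load each TTM release from the branch named in the model card.
# Assumptions (not stated in this commit): the IBM/tsfm library exposes
# TinyTimeMixerForPrediction, this card's repo id is "ibm/TTM", and the model
# consumes past values shaped (batch, context_length, num_channels).
import torch
from tsfm_public.models.tinytimemixer import TinyTimeMixerForPrediction

# 512-96 release: stored on the default "main" branch.
ttm_512 = TinyTimeMixerForPrediction.from_pretrained("ibm/TTM", revision="main")

# 1024-96 release: stored on the "1024-96-v1" branch.
ttm_1024 = TinyTimeMixerForPrediction.from_pretrained("ibm/TTM", revision="1024-96-v1")

# Zero-shot point forecast with the 512-96 model: the last 512 observations per
# channel go in, up to 96 future time-points per channel come out.
past_values = torch.randn(1, 512, 3)  # e.g. 3 target series at 15-min resolution
with torch.no_grad():
    out = ttm_512(past_values=past_values)
# `out` carries the 96-step forecast for each of the 3 input channels.
```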