Abstract
This technical report describes the Time Series Optimized Transformer for Observability (Toto), a new state-of-the-art foundation model for time series forecasting developed by Datadog. In addition to advancing the state of the art on general-purpose time series benchmarks in domains such as electricity and weather, this model is the first general-purpose time series forecasting foundation model to be specifically tuned for observability metrics. Toto was trained on one trillion time series data points, the largest training corpus among all currently published time series foundation models. Alongside publicly available time series datasets, 75% of the data used to train Toto consists of fully anonymized numerical metric data points from the Datadog platform. In our experiments, Toto outperforms existing time series foundation models on observability data, while also excelling at general-purpose forecasting tasks and achieving state-of-the-art zero-shot performance on multiple open benchmark datasets.
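To make the abstract's "zero-shot" claim concrete, the sketch below shows the typical evaluation protocol: a pretrained model is given only a context window and must forecast a held-out horizon with no fine-tuning on the target series. Everything here is a hypothetical illustration, not Datadog's released API or Toto's actual inference code; the `naive_seasonal` function is a stand-in placeholder where a real foundation-model call would go.

```python
import numpy as np

# Hypothetical sketch of a zero-shot forecasting evaluation. The placeholder
# forecaster below is NOT Toto; a real evaluation would replace it with a
# single call to the pretrained model, with no gradient updates.

def naive_seasonal(context: np.ndarray, horizon: int, period: int = 24) -> np.ndarray:
    """Placeholder forecaster: repeat the last observed seasonal cycle."""
    cycle = context[-period:]
    reps = int(np.ceil(horizon / period))
    return np.tile(cycle, reps)[:horizon]

def mase(y_true: np.ndarray, y_pred: np.ndarray,
         y_context: np.ndarray, period: int = 24) -> float:
    """Mean absolute scaled error, scaled by the in-context seasonal-naive error."""
    scale = np.mean(np.abs(y_context[period:] - y_context[:-period]))
    return float(np.mean(np.abs(y_true - y_pred)) / scale)

rng = np.random.default_rng(0)
t = np.arange(24 * 30)                                     # 30 days of hourly data
series = 10 + 5 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, t.size)

context, target = series[:-24], series[-24:]               # hold out the final day
pred = naive_seasonal(context, horizon=24)
print(f"zero-shot MASE: {mase(target, pred, context):.3f}")
```

In a real zero-shot benchmark run, only the forecaster changes: the pretrained model consumes the context and emits the horizon directly, and the same scaled-error metric is computed against the held-out values.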
Community
Datadog’s foundation model for time series matches or beats the state of the art for zero-shot forecasting on standard time series benchmark datasets. In addition, it significantly improves on the state of the art for forecasting observability metrics.
Related papers, recommended by the Semantic Scholar API via Librarian Bot:
- TimeCMA: Towards LLM-Empowered Time Series Forecasting via Cross-Modality Alignment (2024)
- SiamTST: A Novel Representation Learning Framework for Enhanced Multivariate Time Series Forecasting applied to Telco Networks (2024)
- Leveraging 2D Information for Long-term Time Series Forecasting with Vanilla Transformers (2024)
- Are Self-Attentions Effective for Time Series Forecasting? (2024)
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting (2024)
Kudos @Emaad and team. I've featured this paper in my AI research newsletter: https://www.aitidbits.ai/p/july-18th-2024. Looking forward to more novel papers and methods.