Update README.md
README.md CHANGED
@@ -21,8 +21,14 @@ tags:
 
 # Time-Series Transformer (Timer)
 
-**Update** (2025.5): We release a generative time series foundation model [
+**Update** (2025.5): We release a generative time series foundation model [Sundial](https://arxiv.org/abs/2502.00816) on [HuggingFace](https://huggingface.co/thuml/sundial-base-128m).
+
+**Update** (2025.2): We release an open codebase [OpenLTM](https://github.com/thuml/OpenLTM) for pre-training/fine-tuning customized large time-series models.
+
+**Update** (2024.12): [Timer-XL](https://arxiv.org/abs/2410.04803) for unified multi-dimensional time series forecasting is accepted by ICLR 2025.
+
+**Update** (2024.5): [Timer](https://arxiv.org/abs/2402.02368), a large-scale pre-trained time series Transformer, is accepted by ICML 2024.
 
 Large time-series model introduced in this [paper](https://arxiv.org/abs/2402.02368) and enhanced with our [further work](https://arxiv.org/abs/2410.04803).
 