Update README.md

#1
by GPaolo - opened
Files changed (1)
  1. README.md +5 -0
README.md CHANGED
@@ -11,7 +11,12 @@ pinned: false
 
 ## Projects
 
+### Preprints
+
 - [Large Language Models as Markov Chains](https://huggingface.co/papers/2410.02724): theoretical insights on their generalization and convergence properties.
+
+### 2024
+
 - *(NeurIPS'24)* [MANO: Unsupervised Accuracy Estimation Under Distribution Shifts](https://huggingface.co/papers/2405.18979): when logits are enough to estimate generalization of a pre-trained model.
 - *(NeurIPS'24, **Spotlight**)* [Analysing Multi-Task Regression via Random Matrix Theory](https://arxiv.org/pdf/2406.10327): insights on a classical approach and its potentiality for time series forecasting.
 - *(ICML'24, **Oral**)* [SAMformer: Unlocking the Potential of Transformers in Time Series Forecasting](https://huggingface.co/papers/2402.10198): sharpness-aware minimization and channel-wise attention is all you need.