ai-forever vmkhlv committed on
Commit 3ad43d0
Parent: 5b4be94

Update README.md (#3)


- Update README.md (3f337b85174e9c4fab07a313ef4c9934a1af2f83)


Co-authored-by: Vladislav Mikhailov <[email protected]>

Files changed (1): README.md (+16 −2)
README.md CHANGED
@@ -5,8 +5,9 @@ license: apache-2.0
 ---
 
 # FRED-T5 large 820M (Full-scale Russian Enhanced Denoisers T5)
+The model architecture design, pretraining, and evaluation are documented in our preprint: [**A Family of Pretrained Transformer Language Models for Russian**](https://arxiv.org/abs/2309.10931).
 
-Model was trained by [SberDevices](https://sberdevices.ru/).
+The model was trained by [SberDevices](https://sberdevices.ru/).
 
 Architecture based on T5.
 
@@ -68,4 +69,17 @@ print(tokenizer.decode(outputs[0][1:]))
 + Mikhail Novikov
 + Alexey Khoroshilov
 
-[Salute AI Community](https://t.me/SaluteTechGroup).
+[Salute AI Community](https://t.me/SaluteTechGroup).
+
+
+# Cite us
+```
+@misc{zmitrovich2023family,
+      title={A Family of Pretrained Transformer Language Models for Russian},
+      author={Dmitry Zmitrovich and Alexander Abramov and Andrey Kalmykov and Maria Tikhonova and Ekaterina Taktasheva and Danil Astafurov and Mark Baushenko and Artem Snegirev and Tatiana Shavrina and Sergey Markov and Vladislav Mikhailov and Alena Fenogenova},
+      year={2023},
+      eprint={2309.10931},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
+}
+```
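The preprint linked in this commit describes FRED-T5 as a model pretrained on a mixture of denoisers, where the active objective is selected by a special prefix prepended to the input text. A minimal illustrative sketch of that prefix scheme follows; the prefix names `<LM>` and `<SC1>` through `<SC6>` follow the paper's naming, while the helper function itself is hypothetical and not part of the model card:

```python
# Hypothetical helper sketching FRED-T5-style denoiser prefixes.
# Prefix names ('<LM>', '<SC1>'..'<SC6>') follow the cited preprint;
# this function is illustrative only, not part of the repository.
DENOISER_PREFIXES = {"LM"} | {f"SC{i}" for i in range(1, 7)}

def with_denoiser_prefix(text: str, denoiser: str = "LM") -> str:
    """Return the model input string for the chosen denoiser."""
    if denoiser not in DENOISER_PREFIXES:
        raise ValueError(f"unknown denoiser: {denoiser!r}")
    return f"<{denoiser}>{text}"

print(with_denoiser_prefix("Привет"))          # <LM>Привет
print(with_denoiser_prefix("Привет", "SC1"))   # <SC1>Привет
```

The resulting string would then be tokenized and passed to the model as usual; the generation call shown in the hunk context above (`print(tokenizer.decode(outputs[0][1:]))`) comes from the README's own usage example.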