---
language:
- ru
tags:
- PyTorch
- Transformers
- bert
- exbert
thumbnail: "https://github.com/sberbank-ai/model-zoo"
pipeline_tag: fill-mask
---
# ruBert-large
The model was trained by the [SberDevices](https://sberdevices.ru/) team.
* Task: `mask filling`
* Type: `encoder`
* Tokenizer: `bpe`
* Dict size: `120 138`
* Num Parameters: `427 M`
* Training Data Volume: `30 GB`
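# Usage
A minimal mask-filling sketch with the 🤗 Transformers `pipeline` API. The Hub ID `ai-forever/ruBert-large` is an assumption based on this repository's name; adjust it if your copy of the model lives elsewhere.
```python
from transformers import pipeline

# Load the checkpoint for mask filling (Hub ID assumed from the repo name).
fill_mask = pipeline("fill-mask", model="ai-forever/ruBert-large")

# Use the tokenizer's own mask token so the example stays tokenizer-agnostic.
text = f"Москва является столицей {fill_mask.tokenizer.mask_token}."

# Print the top predicted tokens with their scores.
for prediction in fill_mask(text):
    print(prediction["token_str"], round(prediction["score"], 4))
```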
# Authors
+ NLP core team RnD [Telegram channel](https://t.me/nlpcoreteam):
  + Dmitry Zmitrovich