Update README.md
README.md CHANGED
@@ -2,7 +2,12 @@
 language:
 - ru
 - en
+tags:
+- bio
+- med
+- biomedical
 ---
+
 ## EnRuDR-BERT
 
 EnRuDR-BERT is a multilingual, cased BERT model pretrained on the raw part of the RuDReC corpus (1.4M reviews) and an English collection of consumer comments on drug administration from [2]. Pre-training was based on the [original BERT code](https://github.com/google-research/bert) provided by Google. In particular, Multi-BERT was used for initialization; the vocabulary of Russian subtokens and the parameters are the same as in Multi-BERT. Training details are described in our paper. \
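
Since the card describes a standard BERT encoder, a minimal usage sketch may help readers; it assumes the model is published on the Hugging Face Hub under the id `cimm-kzn/enrudr-bert` (an assumption based on the model name, not stated in the diff) and uses the stock `transformers` loading API.

```python
from transformers import AutoTokenizer, AutoModel

# Assumed Hub id for this card; adjust if the actual repository differs.
MODEL_ID = "cimm-kzn/enrudr-bert"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# Encode one Russian and one English drug review; the model is cased,
# so the input text should keep its original casing.
texts = [
    "Этот препарат вызвал у меня головную боль.",  # "This drug gave me a headache."
    "This drug gave me a mild headache.",
]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
outputs = model(**batch)

# Contextual token embeddings: (batch, seq_len, hidden); hidden is 768
# for a BERT-base-sized encoder such as Multi-BERT.
print(outputs.last_hidden_state.shape)
```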