---

## MahaBERT

New version of this model is available here: https://huggingface.co/l3cube-pune/marathi-bert-v2

MahaBERT is a Marathi BERT model. It is a multilingual BERT (bert-base-multilingual-cased) model fine-tuned on L3Cube-MahaCorpus and other publicly available Marathi monolingual datasets.

[dataset link](https://github.com/l3cube-pune/MarathiNLP)

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2202.01159).
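The model can be loaded with the Hugging Face `transformers` library. A minimal sketch (not from the original card; the model id follows the v2 link above, and the Marathi example sentence and masked-LM head choice are illustrative):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Model id taken from the v2 link above.
model_name = "l3cube-pune/marathi-bert-v2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Encode an illustrative Marathi sentence and run a forward pass.
inputs = tokenizer("मी शाळेत जातो.", return_tensors="pt")
outputs = model(**inputs)

# outputs.logits has shape (batch, sequence_length, vocab_size)
print(outputs.logits.shape)
```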
```
@InProceedings{joshi:2022:WILDRE6,
  author = {Joshi, Raviraj},