Update README.md
The model was trained on the data shown in the table below. Batch size was 8.8k,
Note: At an earlier date a half-trained model went up here; it has since been removed, and the model has since been updated.

This is a Scandinavian BERT model trained on a large collection of Danish, Faroese, Icelandic, Norwegian and Swedish text. It is currently the highest-ranking model on the ScandEval leaderboard: https://scandeval.github.io/pretrained/
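As a minimal usage sketch, the model can be queried like any other masked-language model on the Hugging Face Hub via `transformers`' `fill-mask` pipeline. Note that `your-org/scandinavian-bert` is a placeholder repo id (the card does not state the Hub id), and the mask token shown is an assumption — check `tokenizer.mask_token` for the actual one.

```python
def masked_sentence(sentence: str, word: str, mask_token: str = "<mask>") -> str:
    """Replace the first occurrence of `word` with the mask token.

    The correct mask token depends on the tokenizer (e.g. "[MASK]" for
    classic BERT tokenizers); "<mask>" here is only a placeholder default.
    """
    return sentence.replace(word, mask_token, 1)


def top_fill_mask(sentence: str, model_id: str = "your-org/scandinavian-bert"):
    """Score candidate fillers for the masked position.

    Requires `pip install transformers torch` and network access to the Hub;
    `model_id` is a hypothetical placeholder, not this model's real repo id.
    """
    from transformers import pipeline

    unmasker = pipeline("fill-mask", model=model_id)
    return unmasker(sentence)


if __name__ == "__main__":
    text = masked_sentence("København er hovedstaden i Danmark.", "København")
    print(text)  # "<mask> er hovedstaden i Danmark."
    # print(top_fill_mask(text))  # uncomment once a real model id is filled in
```

The pipeline returns the top candidate tokens for the masked slot with their scores, which is a quick sanity check that the model handles each of the five languages.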
If you find this model useful, please cite:

```
@inproceedings{snaebjarnarson-etal-2023-transfer,
  title = "{T}ransfer to a Low-Resource Language via Close Relatives: The Case Study on Faroese",
  author = "Snæbjarnarson, Vésteinn and
    Simonsen, Annika and
    Glavaš, Goran and
    Vulić, Ivan",
  booktitle = "Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa)",
  month = "may 22--24",
  year = "2023",
  address = "Tórshavn, Faroe Islands",
  publisher = {Link{\"o}ping University Electronic Press, Sweden},
}
```