Pieter Delobelle committed on
Commit
8cb7a69
1 Parent(s): cfe4fab

Update README.md

Files changed (1)
  1. README.md +14 -2
README.md CHANGED
@@ -16,7 +16,7 @@ model-index:
   metrics:
   - name: Accuracy
     type: accuracy
-    value: 0.93325
+    value: 0.9294064748201439
   widget:
   - text: "Ik erken dat dit een boek is, daarmee is alles gezegd."
   - text: "Prachtig verhaal, heel mooi verteld en een verrassend einde... Een topper!"
@@ -34,7 +34,7 @@ tags:
 
 # RobBERT finetuned for sentiment analysis on DBRD
 
-This is a finetuned model based on [RobBERT (v2)](https://huggingface.co/pdelobelle/robbert-v2-dutch-base). We used [DBRD](https://huggingface.co/datasets/dbrd), which consists of book reviews from [hebban.nl](hebban.nl). Hence our example sentences about books. We did some limited experiments to test if this also works for other domains, but this was not
+This is a finetuned model based on [RobBERTje (non-shuffled)](https://huggingface.co/DTAI-KULeuven/robbertje-1-gb-non-shuffled). We used [DBRD](https://huggingface.co/datasets/dbrd), which consists of book reviews from [hebban.nl](hebban.nl). Hence our example sentences about books. We did some limited experiments to test if this also works for other domains, but this was not
 
 # Training data and setup
 We used the [Dutch Book Reviews Dataset (DBRD)](https://huggingface.co/datasets/dbrd) from van der Burgh et al. (2019).
@@ -54,6 +54,18 @@ This project is created by [Pieter Delobelle](https://people.cs.kuleuven.be/~pie
 If you would like to cite our paper or models, you can use the following BibTeX:
 
 ```
+@article{Delobelle_Winters_Berendt_2021,
+    title = {RobBERTje: A Distilled Dutch BERT Model},
+    author = {Delobelle, Pieter and Winters, Thomas and Berendt, Bettina},
+    year = 2021,
+    month = {Dec.},
+    journal = {Computational Linguistics in the Netherlands Journal},
+    volume = 11,
+    pages = {125–140},
+    url = {https://www.clinjournal.org/clinj/article/view/131}
+}
+
+
 @inproceedings{delobelle2020robbert,
     title = "{R}ob{BERT}: a {D}utch {R}o{BERT}a-based {L}anguage {M}odel",
     author = "Delobelle, Pieter and
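As a side note, the long decimal in the updated `value` field is consistent with a whole number of correct predictions on the DBRD test split, which contains 2,224 reviews per van der Burgh et al. (2019). The split size is an assumption here (it is not stated in the diff itself), but under it the accuracy works out to exactly 2,067 correct predictions. A minimal sanity check:

```python
# Sanity check: the accuracy value added in this commit should correspond to
# an integer count of correct predictions on DBRD's test split.
# ASSUMPTION: the test split has 2,224 reviews (van der Burgh et al., 2019);
# this number is not stated in the diff above.

TEST_SET_SIZE = 2224
reported_accuracy = 0.9294064748201439  # value added in this commit

correct = reported_accuracy * TEST_SET_SIZE
assert abs(correct - round(correct)) < 1e-6  # lands on a whole number
print(f"{round(correct)} / {TEST_SET_SIZE} reviews classified correctly")
```

This kind of check is a quick way to verify that a reported metric plausibly comes from the evaluation set it claims, since an accuracy on N examples must be a multiple of 1/N.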