---
language:
- multilingual
tags:
- STILT
- retraining
- multi-task learning
datasets:
- SemEval 2022
---
## Sem-mmmBERT
This is the SemEval MaChAmp Multitask Multilingual BERT model, retrained from mBERT (https://huggingface.co/bert-base-multilingual-cased).
The retraining covers all SemEval 2022 tasks that are text based and annotated at the word, sentence, or paragraph level, and is done with MaChAmp (https://machamp-nlp.github.io/), a toolkit focused on multi-task learning for NLP. More information can be found in the paper (to be released when the SemEval proceedings are online).
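Since the model follows the standard mBERT architecture, it can presumably be loaded as a regular encoder with the `transformers` library. The sketch below is a minimal usage example; the repository path `path/to/sem-mmmbert` is a placeholder, as the exact model ID is not stated here.

```python
from transformers import AutoTokenizer, AutoModel

# Placeholder model ID -- replace with the actual Hub repository path of this model.
model_id = "path/to/sem-mmmbert"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a sentence and obtain contextual embeddings, as with any BERT-style encoder.
inputs = tokenizer("MaChAmp retrains mBERT on SemEval 2022 tasks.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```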