---
language:
  - multilingual
tags:
  - STILT
  - retraining
  - multi-task learning
datasets:
  - SemEval 2022
---

# Sem-mmmBERT

This is the SemEval MaChAmp Multitask Multilingual BERT model, retrained from [mBERT](https://huggingface.co/bert-base-multilingual-cased).

The retraining covers all text-based SemEval 2022 tasks with annotation at the word, sentence, or paragraph level. It was performed with [MaChAmp](https://machamp-nlp.github.io/), a toolkit focused on multi-task learning for NLP. More information can be found in the accompanying paper, which will be released once the SemEval proceedings are online.
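As a minimal sketch of how a retrained BERT checkpoint like this is typically loaded for further fine-tuning, the snippet below uses the Hugging Face `transformers` auto classes. The repo id `robvanderg/Sem-mmmBERT` is an assumption inferred from this page; check the model hub for the exact name before use.

```python
# Assumed Hugging Face repo id -- verify against the model page.
MODEL_ID = "robvanderg/Sem-mmmBERT"


def load_sem_mmmbert(model_id: str = MODEL_ID):
    """Download and return (tokenizer, model) from the Hub.

    Requires the `transformers` package and network access; the model
    weights are compatible with the standard BERT architecture, since
    the checkpoint is retrained from bert-base-multilingual-cased.
    """
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModel.from_pretrained(model_id)
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_sem_mmmbert()
    print(model.config.model_type)
```

The loaded encoder can then be fine-tuned on a downstream task in the usual way, e.g. by wrapping it with a task-specific head.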