---
language:
  - multilingual
tags:
  - STILT
  - retraining
  - multi-task learning
datasets:
  - SemEval 2022
---

# Sem-RemmmBERT

This is the SemEval MaChAmp Multitask Multilingual BERT model. It is retrained from RemBERT (https://huggingface.co/google/rembert).
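
A minimal sketch of loading the checkpoint with the Hugging Face `transformers` library. The repository id `robvanderg/Sem-RemmmBERT` is an assumption based on the page header, and the sketch assumes the checkpoint is compatible with the `AutoModel`/`AutoTokenizer` classes (as a RemBERT-derived model should be):

```python
from transformers import AutoModel, AutoTokenizer

# Assumed repository id; adjust if the model is hosted under a different name.
model_name = "robvanderg/Sem-RemmmBERT"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a sample sentence and inspect the contextual embeddings.
inputs = tokenizer("SemEval 2022 covers a wide range of multilingual tasks.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```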

The retraining is based on all SemEval 2022 tasks that are text-based and annotated at the word, sentence, or paragraph level. It is performed with MaChAmp (https://machamp-nlp.github.io/), a toolkit focused on multi-task learning for NLP. More information can be found in the accompanying paper, which will be released once the SemEval 2022 proceedings are online.
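
In the STILT setup, this retrained checkpoint is meant as a starting point for further fine-tuning on a downstream task. A minimal sketch of that step, again assuming the hypothetical repository id `robvanderg/Sem-RemmmBERT` and a standard sentence-classification setup (the number of labels and the training loop are placeholders for your own task):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repository id; the target task (here: binary sentence classification)
# is purely illustrative and not part of the original model card.
model_name = "robvanderg/Sem-RemmmBERT"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Continue with a standard Trainer or training loop on your own labeled data.
```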