---
model:
  - KB/bert-base-swedish-cased
tags:
  - token-classification
  - sequence-tagger-model
  - bert
language: sv
datasets:
  - KBLab/suc3_1
widget:
  - text: Emil bor i Lönneberga
---

# KB-BERT for NER

## Cased data

This model is based on KB-BERT and was fine-tuned on the SUC 3.1 corpus, using the simple tags and cased data. For this model, we used a variant of the data that does not use BIO encoding to differentiate between the beginning (B) and inside (I) of named entity tags.

The model was trained on the training data only, with the best checkpoint chosen by its performance on the validation data. You can find more information about the model and its performance on our blog: https://kb-labb.github.io/posts/2022-02-07-suc31
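
Below is a minimal usage sketch with the Hugging Face Transformers token-classification pipeline. The model ID shown is an assumption (replace it with this repository's actual ID), and `aggregation_strategy="simple"` just groups consecutive sub-word tokens that share the same tag, which fits the non-BIO tag set described above.

```python
from transformers import pipeline

# Minimal sketch: the model ID below is a placeholder/assumption —
# replace it with this repository's actual ID on the Hugging Face Hub.
ner = pipeline(
    "token-classification",
    model="KBLab/bert-base-swedish-cased-ner",
    aggregation_strategy="simple",  # merge consecutive sub-word tokens with the same tag
)

# The widget example from the metadata above.
print(ner("Emil bor i Lönneberga"))
# Expected: a list of dicts with entity label, text span, and score
# for "Emil" and "Lönneberga".
```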