National Library of Sweden / KBLab
non-profit
AI & ML interests
NLP, language models, acoustic models, NER
Organization Card
KBLab is a national research infrastructure for digital humanities and social science at the National Library of Sweden. We train large language models and acoustic models on Swedish data from the library's collections. Check out our blog!
Some of our most popular models are:
- the original KB-BERT
- a large BERT trained with Megatron-LM
- a Sentence-BERT
- an NER model trained on a mix of cased and uncased data
- a wav2vec model
- a BART model
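The Sentence-BERT model maps sentences to fixed-size embeddings that are compared with cosine similarity. A minimal sketch of that comparison step, using NumPy; the embedding vectors below are made up for illustration (a real model, loaded e.g. via the `sentence-transformers` library, returns vectors of several hundred dimensions):

```python
import numpy as np

# In practice the embeddings would come from the model, e.g.:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("KBLab/sentence-bert-swedish-cased")
#   emb_a, emb_b = model.encode(["mening ett", "mening två"])

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 4-dimensional embeddings for three sentences.
emb_a = np.array([0.1, 0.8, 0.3, 0.4])
emb_b = np.array([0.1, 0.7, 0.4, 0.4])
emb_c = np.array([-0.9, 0.1, 0.0, -0.2])

# Semantically close sentences should score higher than unrelated ones.
assert cosine_similarity(emb_a, emb_b) > cosine_similarity(emb_a, emb_c)
```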
Additionally, we provide earlier checkpoints for some models, as well as other models closely related to those above. Related models are linked in the respective model cards.
If you are unsure which model fits your use case best, feel free to contact us.
We also provide SUCX 3.0 - NER, a variant of the venerable SUC 3.0 NER dataset extended by Språkbanken-Text, and ÖverLim, our own Swedish/Norwegian/Danish imitation of GLUE.
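NER datasets such as SUCX 3.0 are typically distributed with token-level BIO tags (`B-` opens an entity, `I-` continues it, `O` is outside). A minimal sketch, assuming that tagging scheme, of decoding such tags into entity spans; the example tokens and tags below are made up for illustration:

```python
def bio_to_spans(tokens, tags):
    """Collect (entity_type, start, end) spans from BIO tags; end is exclusive."""
    spans = []
    start, etype = None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:  # close any open span before starting a new one
                spans.append((etype, start, i))
            start, etype = i, tag[2:]
        elif tag.startswith("I-") and start is not None and tag[2:] == etype:
            continue  # entity continues
        else:  # "O" or an inconsistent I- tag ends the current span
            if start is not None:
                spans.append((etype, start, i))
            start, etype = None, None
    if start is not None:  # entity running until the end of the sentence
        spans.append((etype, start, len(tags)))
    return spans

tokens = ["Kungliga", "biblioteket", "ligger", "i", "Stockholm"]
tags = ["B-ORG", "I-ORG", "O", "O", "B-LOC"]
print(bio_to_spans(tokens, tags))  # → [('ORG', 0, 2), ('LOC', 4, 5)]
```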
models (71)
- KBLab/sentence-bert-swedish-cased (Sentence Similarity)
- KBLab/swedish-ocr-correction
- KBLab/megatron.bert-large.wordpiece-32k-pretok.25k-steps (Feature Extraction)
- KBLab/megatron.bert-large.wordpiece-64k-pretok.25k-steps (Feature Extraction)
- KBLab/megatron.bert-large.unigram-64k-pretok.25k-steps (Feature Extraction)
- KBLab/megatron.bert-large.unigram-32k-pretok.25k-steps (Feature Extraction)
- KBLab/megatron.bert-large.spe-bpe-32k-pretok.25k-steps (Feature Extraction)
- KBLab/megatron.bert-large.bpe-64k-no_pretok.25k-steps (Feature Extraction)
- KBLab/megatron.bert-base.wordpiece-64k-pretok.25k-steps (Feature Extraction)
- KBLab/megatron.bert-base.wordpiece-64k-no_pretok.25k-steps (Feature Extraction)