KoBERT

How to use

If you want to load the KoBERT tokenizer with AutoTokenizer, you need to pass trust_remote_code=True.

from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/kobert")
# The tokenizer is implemented as custom code in the repository,
# so trust_remote_code=True is required.
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert", trust_remote_code=True)
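
As a minimal sketch of how the loaded model and tokenizer can be used together (the Korean example sentence and the printed shape are illustrative, not part of the original card):

import torch

# Encode an example Korean sentence (illustrative input).
inputs = tokenizer("한국어 자연어 처리 모델입니다.", return_tensors="pt")

# Run a forward pass without tracking gradients.
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)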
