michaelfeil · Upload sentence-transformers/all-MiniLM-L6-v2 ctranslate fp16 weights · commit 0626d2b
{
  "bos_token": "<s>",
  "eos_token": "</s>",
  "layer_norm_epsilon": 1e-12,
  "unk_token": "[UNK]"
}
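A minimal sketch of reading this config programmatically with Python's standard `json` module. The JSON text below is copied from the file above; the filename `config.json` is an assumption about how the file is stored alongside the converted weights, not something stated on this page.

```python
import json

# The JSON body shown above, inlined for a self-contained example.
# (Assumption: on disk this would typically live next to the converted
# CTranslate2 weights, e.g. as config.json.)
config_text = """
{
  "bos_token": "<s>",
  "eos_token": "</s>",
  "layer_norm_epsilon": 1e-12,
  "unk_token": "[UNK]"
}
"""

config = json.loads(config_text)

# Special tokens used when tokenizing input for the encoder.
print(config["bos_token"], config["eos_token"], config["unk_token"])

# Numerical epsilon used inside layer normalization.
print(config["layer_norm_epsilon"])
```

The same fields could equally be read from disk with `json.load(open(path))`; only the parsing, not the file location, is shown here.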