Add new SentenceTransformer model.
8c58910
{
  "tokenizer_class": "sentence_transformers.models.tokenizer.WhitespaceTokenizer.WhitespaceTokenizer",
  "update_embeddings": false,
  "max_seq_length": 1000000
}
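
This JSON is the tokenizer configuration stored alongside a sentence-transformers word-embeddings module: sentences are split on whitespace by `WhitespaceTokenizer`, the word vectors stay frozen (`"update_embeddings": false`), and `"max_seq_length": 1000000` effectively disables truncation. A minimal usage sketch follows; the repository id is a placeholder, since the actual model name is not shown above:

```python
from sentence_transformers import SentenceTransformer

# Placeholder repository id (hypothetical); replace with the actual model path or Hub id.
model = SentenceTransformer("hli/average-word-embeddings-example")

# Input sentences are tokenized by whitespace splitting per the config above,
# then mapped to fixed word vectors and pooled into sentence embeddings.
embeddings = model.encode([
    "Hello world",
    "Sentence embeddings from frozen word vectors",
])
print(embeddings.shape)
```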