tau/sled · tokenizer configuration (112 Bytes, commit d01ed1c):
{
  "tokenizer_class": "SledTokenizer",
  "base_tokenizer": "facebook/bart-large",
  "model_max_length": 16384
}
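
This config tells Transformers to instantiate a SledTokenizer that wraps the facebook/bart-large tokenizer and raises model_max_length from BART's default of 1024 tokens to 16384, matching SLED's long-document input budget. Below is a minimal sketch of what that override does, assuming only the stock Transformers API; loading the actual SledTokenizer class requires the SLED model code to be installed, so the sketch applies the same setting to the base tokenizer directly.

from transformers import AutoTokenizer

# Load the base tokenizer named in "base_tokenizer" and apply the same
# model_max_length override that the SLED config specifies.
tokenizer = AutoTokenizer.from_pretrained(
    "facebook/bart-large",
    model_max_length=16384,  # value from "model_max_length" above
)

long_text = "word " * 20000
ids = tokenizer(long_text, truncation=True).input_ids
print(len(ids))  # 16384: truncation uses the extended limit, not BART's default 1024

The only behavioral change encoded by the config is this length limit; the vocabulary and tokenization rules still come from the facebook/bart-large base tokenizer.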