tokenizer.model.v3 / params.json
{
  "dim": 4096,
  "n_layers": 32,
  "head_dim": 128,
  "hidden_dim": 14336,
  "n_heads": 32,
  "n_kv_heads": 8,
  "norm_eps": 1e-05,
  "vocab_size": 32768,
  "rope_theta": 1000000.0
}
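The fields above describe a 32-layer transformer that uses grouped-query attention (`n_kv_heads` < `n_heads`). A minimal sketch of how the values relate to each other; the interpretation of each field name is an assumption based on common Mistral-style configs, not stated in the file itself:

```python
import json

# Hyperparameters copied verbatim from params.json above.
params = json.loads("""
{
  "dim": 4096,
  "n_layers": 32,
  "head_dim": 128,
  "hidden_dim": 14336,
  "n_heads": 32,
  "n_kv_heads": 8,
  "norm_eps": 1e-05,
  "vocab_size": 32768,
  "rope_theta": 1000000.0
}
""")

# Consistency check (assumed convention): the model width splits evenly
# across the attention heads, giving the per-head dimension.
per_head = params["dim"] // params["n_heads"]     # 4096 / 32 = 128
assert per_head == params["head_dim"]

# Grouped-query attention: each KV head is shared by a group of query heads.
gqa_group = params["n_heads"] // params["n_kv_heads"]  # 32 / 8 = 4

print(f"per-head dim: {per_head}, query heads per KV head: {gqa_group}")
```

The small `n_kv_heads` value shrinks the KV cache by the group factor (4x here) relative to full multi-head attention, while the large `rope_theta` (1e6) is the rotary-embedding base commonly used to support longer context windows.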