q_sparse_Baseline / config.json
checkpoint iter: 2000
{
"activation": "relu_squared",
"bias": false,
"d_model": 512,
"dropout": 0.2,
"hidden_dim": 2048,
"mlp": "GLU",
"num_heads": 32,
"num_kv_heads": 0,
"num_layers": 32,
"seq_len": 256,
"vocab_size": 50257,
"weight_tying": true
}
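Below is a minimal sketch, not the repository's actual model code, of how the fields in this config could be read and mapped onto a PyTorch GLU feed-forward block with the "relu_squared" activation. The field names come from the config above; the class name GLUFeedForward, the relu_squared helper, and the overall wiring are illustrative assumptions.

import json

import torch
import torch.nn as nn


def relu_squared(x: torch.Tensor) -> torch.Tensor:
    # "activation": "relu_squared" -> ReLU followed by squaring (assumed meaning)
    return torch.relu(x).pow(2)


class GLUFeedForward(nn.Module):
    # Hypothetical block for "mlp": "GLU" with d_model=512, hidden_dim=2048,
    # bias=false, dropout=0.2 as given in the config.
    def __init__(self, d_model: int, hidden_dim: int, dropout: float, bias: bool):
        super().__init__()
        self.gate = nn.Linear(d_model, hidden_dim, bias=bias)
        self.up = nn.Linear(d_model, hidden_dim, bias=bias)
        self.down = nn.Linear(hidden_dim, d_model, bias=bias)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Gated linear unit: activation(gate(x)) * up(x), projected back to d_model
        return self.down(self.dropout(relu_squared(self.gate(x)) * self.up(x)))


if __name__ == "__main__":
    with open("config.json") as f:
        cfg = json.load(f)
    ff = GLUFeedForward(cfg["d_model"], cfg["hidden_dim"], cfg["dropout"], cfg["bias"])
    x = torch.randn(2, cfg["seq_len"], cfg["d_model"])
    print(ff(x).shape)  # expected: torch.Size([2, 256, 512])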