regular-sae / config_layer_9_32.json
{
"layer": 9,
"model_type": "GatedSAE",
"n_batches": 5000,
"l1_coefficient": 2,
"projection_up": 32,
"batch_size": 64,
"learning_rate": 0.001,
"test_loss": 61.70772094726563,
"reconstruction_error": 27.958398628234864,
"l0_loss": 12.370556640625,
"dead_neurons_percentage": 92.23225911458334
}
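
A minimal sketch of how this config could be fetched and read, assuming the file lives in a Hugging Face repo; the repo id "charlieoneill/regular-sae" is inferred from the page title and uploader and may not match the actual repository. The field interpretations in the comments follow the names in the JSON above.

import json
from huggingface_hub import hf_hub_download

# Download the config file from the Hub.
# NOTE: the repo_id is an assumption based on the page title, not confirmed.
config_path = hf_hub_download(
    repo_id="charlieoneill/regular-sae",
    filename="config_layer_9_32.json",
)

with open(config_path) as f:
    config = json.load(f)

# Training hyperparameters recorded in the config
print(config["model_type"])       # "GatedSAE"
print(config["layer"])            # 9  -- the layer the SAE was trained on
print(config["projection_up"])    # 32 -- dictionary expansion factor
print(config["l1_coefficient"])   # 2  -- sparsity penalty weight

# Evaluation metrics stored alongside the hyperparameters
print(config["test_loss"], config["l0_loss"], config["dead_neurons_percentage"])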