
mlsae-pythia-70m-deduped-x128-k32-tfm

A Multi-Layer Sparse Autoencoder (MLSAE) trained on the residual stream activation vectors from every layer of EleutherAI/pythia-70m-deduped with an expansion factor of 128 and k = 32, over 1 billion tokens from monology/pile-uncopyrighted. This model includes the underlying transformer.
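To make the architecture concrete, here is a minimal sketch of a single top-k sparse autoencoder forward pass in NumPy. This is an illustration, not the author's implementation: the real MLSAE is trained on pythia-70m-deduped residual streams (d_model = 512, so an expansion factor of 128 gives 65,536 latents), while the example below uses toy sizes so it runs instantly. The function name `topk_sae_forward` and the weight initialization are hypothetical.

```python
import numpy as np

def topk_sae_forward(x, W_enc, b_enc, W_dec, b_dec, k=32):
    """Encode x, keep only the k largest latent activations, decode.

    A toy top-k SAE step; the actual MLSAE implementation differs.
    """
    pre = x @ W_enc + b_enc                    # latent pre-activations
    acts = np.maximum(pre, 0.0)                # ReLU
    # Zero out all but the top-k activations per example.
    idx = np.argpartition(acts, -k, axis=-1)[..., :-k]
    np.put_along_axis(acts, idx, 0.0, axis=-1)
    recon = acts @ W_dec + b_dec               # reconstruction
    return recon, acts

rng = np.random.default_rng(0)
# Toy sizes; the real model uses d_model=512, n_latents=65536, k=32.
d_model, n_latents, k = 16, 64, 4
x = rng.standard_normal((8, d_model))
W_enc = rng.standard_normal((d_model, n_latents)) / np.sqrt(d_model)
b_enc = np.zeros(n_latents)
W_dec = rng.standard_normal((n_latents, d_model)) / np.sqrt(n_latents)
b_dec = np.zeros(d_model)

recon, acts = topk_sae_forward(x, W_enc, b_enc, W_dec, b_dec, k=k)
assert recon.shape == x.shape
assert (np.count_nonzero(acts, axis=-1) <= k).all()
```

In the MLSAE setup, the same autoencoder is applied to residual-stream activations drawn from every transformer layer, rather than training one SAE per layer.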

For more details, see:

Model size: 138M params (Safetensors)
Tensor type: F32

Dataset used to train tim-lawson/mlsae-pythia-70m-deduped-x128-k32-tfm: monology/pile-uncopyrighted
