---
license: cc-by-4.0
library_name: saelens
---

! WARNING: the first set of SAEs is uploading now (31st July, 5PM UK time), and we expect them all to land within 30 minutes. In the meantime, the 2B SAEs (e.g. https://huggingface.co/google/gemma-scope-2b-pt) should be available.

# 1. Gemma Scope

Gemma Scope is a comprehensive, open suite of sparse autoencoders (SAEs) for Gemma 2 9B and 2B. Sparse autoencoders are a "microscope" of sorts that can help us break down a model's internal activations into the underlying concepts, just as biologists use microscopes to study the individual cells of plants and animals.
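As a toy sketch of this "microscope" idea (not the actual Gemma Scope implementation, whose SAEs are far larger and trained on real model activations), a sparse autoencoder maps an activation vector to a wide feature vector in which only a few entries are active, then reconstructs the activation from those features. The sizes, random weights, and scalar threshold below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: real SAEs use the model's hidden size and many
# more features (e.g. 16k+ per layer).
d_model, d_sae = 8, 32

# Randomly initialised weights stand in for trained SAE parameters.
W_enc = rng.normal(size=(d_model, d_sae))
b_enc = rng.normal(size=d_sae)
W_dec = rng.normal(size=(d_sae, d_model))
b_dec = rng.normal(size=d_model)
theta = 1.0  # activation threshold (assumed scalar here)

def encode(x):
    """Map an activation vector to a sparse feature vector."""
    pre = x @ W_enc + b_enc
    # Zero out every feature whose pre-activation is below the threshold,
    # so only a handful of "concepts" remain active.
    return pre * (pre > theta)

def decode(f):
    """Reconstruct the activation from the sparse features."""
    return f @ W_dec + b_dec

x = rng.normal(size=d_model)   # a stand-in for a residual-stream activation
f = encode(x)
x_hat = decode(f)
print(f"{int((f != 0).sum())} of {d_sae} features active")
```

The key property is sparsity: most entries of `f` are exactly zero, so each activation is explained by a small number of learned directions.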

See our [landing page](https://huggingface.co/google/gemma-scope) for details on the whole suite. This repository contains one specific set of SAEs from that suite:

# 2. What Is `gemma-scope-9b-pt-res`?

- `gemma-scope-`: See Section 1 above.
- `9b-pt-`: These SAEs were trained on the Gemma 2 9B base (pretrained) model.
- `res`: These SAEs were trained on the model's residual stream.

# 3. Point of Contact

Point of contact: Arthur Conmy

Contact by email:

```python
# Reverse the obfuscated string to recover the address
''.join(list('moc.elgoog@ymnoc')[::-1])
```

HuggingFace account:
https://huggingface.co/ArthurConmyGDM