---
license: apache-2.0
---

# 1. Gemmascope

Gemmascope is a suite of sparse autoencoders (SAEs) trained on the internal activations of Gemma 2 models, intended to support interpretability research.

# 2. What is `gemmascope-9b-pt-res`?

- `gemmascope-`: see section 1 above.
- `9b-`: these SAEs were trained on the activations of [Gemma 2 9B](https://huggingface.co/google/gemma-2-9b).
- `pt-`: the base (pretrained) model, as opposed to the instruction-tuned variant.
- `res`: the SAEs read from, and reconstruct, the model's residual stream.
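
One way to load one of these SAEs for analysis is the [SAELens](https://github.com/jbloomAus/SAELens) library. The sketch below is a minimal example, not a definitive recipe: the `release` and `sae_id` strings are assumptions based on the public Gemma Scope naming scheme, and the three-value return signature matches SAELens 3.x; check SAELens's pretrained SAE directory and version for the exact identifiers.

```python
from sae_lens import SAE

# Assumed release/sae_id strings; consult the SAELens pretrained SAE
# directory for the exact identifiers available for this model.
sae, cfg_dict, sparsity = SAE.from_pretrained(
    release="gemma-scope-9b-pt-res",
    sae_id="layer_20/width_16k/average_l0_68",
)
print(cfg_dict["d_sae"])  # number of learned features in this SAE
```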

## FAQ

Q1: Why does this model exist in `gg-hf`?

A1: See https://docs.google.com/document/d/1bKaOw2mJPJDYhgFQGGVOyBB3M4Bm_Q3PMrfQeqeYi0M (Google internal only).

Q2: What does "SAE" mean?

A2: Sparse Autoencoder. See https://docs.google.com/document/d/1roMgCPMPEQgaNbCu15CGo966xRLToulCBQUVKVGvcfM (should be accessible to trusted HuggingFace collaborators, and to Googlers).
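
Concretely, Gemma Scope's SAEs are JumpReLU SAEs: an encoder maps a model activation to a wide, sparse vector of feature activations (zeroing anything below a learned per-feature threshold), and a decoder reconstructs the activation from those features. Below is a toy NumPy sketch of that forward pass, with random weights standing in for trained parameters, just to show the shapes involved.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_sae = 8, 32  # toy sizes; real SAEs are far wider

# Random parameters standing in for trained weights.
W_enc = rng.normal(size=(d_model, d_sae))
b_enc = np.zeros(d_sae)
W_dec = rng.normal(size=(d_sae, d_model))
b_dec = np.zeros(d_model)
threshold = np.full(d_sae, 0.5)  # learned per-feature JumpReLU threshold

def encode(x):
    # Pre-activations, then JumpReLU: keep values above the threshold,
    # zero out everything else, giving a sparse feature vector.
    pre = x @ W_enc + b_enc
    return np.where(pre > threshold, pre, 0.0)

def decode(f):
    # Linear reconstruction of the original activation.
    return f @ W_dec + b_dec

x = rng.normal(size=d_model)  # stand-in for a residual-stream activation
f = encode(x)                 # sparse feature activations
x_hat = decode(f)             # reconstruction of x
print(f"{(f != 0).sum()} of {d_sae} features active")
```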

TODO(conmy): remove this when making the main repo.

## Point of Contact

Arthur Conmy

Contact by email:

```python
# Reverse the obfuscated string to recover the email address.
'moc.elgoog@ymnoc'[::-1]
```

HuggingFace account:
https://huggingface.co/ArthurConmyGDM