jbloom committed on
Commit 21df74d
Parent: f984d27

Fix issue where layer 9 SAE was present for layer 8 and layer 9 folders

v5_32k_layer_8.pt/cfg.json CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:0630a4cc495b767c6d2361a70555620fa4434ee6d97caa0bded047e638e04bbd
-size 565
+oid sha256:11127f08fa2d022a8d0a6514590d24287cbbe67099221713133fc813605bd198
+size 535
v5_32k_layer_8.pt/sae_weights.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:8794350d4f1022c7d7d4ca85fbd25e2c054e36533233ff5aafe90ae4bcb6342f
+oid sha256:b56314c681998f6a782c65de9a0969d3634b9ced79021a7ccaa8ec983cae70eb
 size 201461056
v5_32k_layer_8.pt/sparsity.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:0716c176e7158b8bf20a7f71bbf3f713f2782cabede6bda9c32bfd150512c43e
+oid sha256:b8c9e77e9107fea9b066818fa3046161c39a3a606b9318e9328091e2310d3e71
 size 131152
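Each file in this commit is a Git LFS pointer: the repository stores only the `version`, `oid sha256:…`, and `size` lines, and the actual weights live in LFS storage. A minimal sketch (not part of this commit; function names and the toy blob are illustrative) of checking a downloaded blob against such a pointer:

```python
import hashlib


def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields


def verify_blob(pointer_text: str, blob: bytes) -> bool:
    """Check a blob's sha256 digest and byte length against its LFS pointer."""
    fields = parse_lfs_pointer(pointer_text)
    digest = hashlib.sha256(blob).hexdigest()
    return fields["oid"] == f"sha256:{digest}" and int(fields["size"]) == len(blob)


# Toy example; real pointers in this repo reference multi-MB .safetensors files.
blob = b"example weights"
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    f"oid sha256:{hashlib.sha256(blob).hexdigest()}\n"
    f"size {len(blob)}\n"
)
print(verify_blob(pointer, blob))  # True
```

Mismatched `oid` or `size` values, like the ones replaced in this commit, would make the check fail for the old blobs.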