add 24
README.md CHANGED
@@ -39,8 +39,8 @@ This is abliterated model of [google/gemma-2-2b-jpn-it](https://huggingface.co/g
 described by mlabonne.
 
 Layer 17 of the original model was chosen for abliteration.
-I also created another layer 18 abliterated model for comparison.
-These two layers were chosen due to they both produce uncensored response
+I also created another layer 18 and 24 abliterated models for comparison.
+These three layers were chosen due to they both produce uncensored response
 after respective layer was abliterated.
 
 It is uploaded here to be evaluated by the Open LLM Leaderboard to see how brain damaged it
@@ -57,6 +57,7 @@ Click on the model name go to the raw score json generated by Open LLM Leaderboa
 | [gemma-2-2b-jpn-it](https://huggingface.co/datasets/open-llm-leaderboard/results/blob/main/google/gemma-2-2b-jpn-it/results_2024-10-15T15-21-39.173019.json) | 30.82 | 54.11 | 41.43 | 0.0 | 27.52 | 37.17 | 24.67 |
 | [gemma-2-2b-jpn-it-abliterated-17](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-jpn-it-abliterated-17/results_2024-10-18T15-18-46.821674.json) | 30.29 | 52.65 | 40.46 | 0.0 | 27.18 | 36.90 | 24.55 |
 | [gemma-2-2b-jpn-it-abliterated-18](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-jpn-it-abliterated-18/results_2024-10-18T15-41-42.399571.json) | 30.61 | 53.02 | 40.96 | 0.0 | 27.35 | 37.30 | 25.05 |
+| [gemma-2-2b-jpn-it-abliterated-24](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-jpn-it-abliterated-24/results_2024-10-25T16-29-46.542899.json) | 30.61 | 51.37 | 40.77 | 0.0 | 27.77 | 39.02 | 24.73 |
 
 It is only slightly dumber than the original.
 
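
For context, a minimal sketch of the abliteration step the README refers to (as described by mlabonne), not the script actually used for this repo: a "refusal direction" is estimated from residual-stream activations at one chosen layer (17, 18 or 24 here) and then projected out of the weights that write into the residual stream. The tensor names (`harmful_acts`, `harmless_acts`) and shape conventions are illustrative assumptions, not taken from this model card.

```python
# Hedged sketch of layer-wise abliteration; names and shapes are assumptions.
import torch

def refusal_direction(harmful_acts: torch.Tensor, harmless_acts: torch.Tensor) -> torch.Tensor:
    """Difference-of-means direction at the chosen layer, scaled to unit length.

    Both inputs are (n_prompts, d_model) activations collected at that layer.
    """
    direction = harmful_acts.mean(dim=0) - harmless_acts.mean(dim=0)
    return direction / direction.norm()

def orthogonalize(weight: torch.Tensor, direction: torch.Tensor) -> torch.Tensor:
    """Remove the rank-1 component of `weight` that writes along `direction`.

    `weight` is an nn.Linear weight of shape (d_model, d_in) whose output lands
    in the residual stream; `direction` is a unit vector of shape (d_model,).
    """
    return weight - torch.outer(direction, direction @ weight)
```

In the usual recipe the direction is measured at a single layer, but the orthogonalization is then applied to the output projections (e.g. attention `o_proj` and MLP `down_proj`) across all layers before the modified weights are saved.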
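
The table links point at the raw score JSON files generated by the Open LLM Leaderboard. A small illustrative snippet for pulling one of them down for inspection (the URL is copied from the table; the JSON layout is whatever the leaderboard emits, so only the top-level keys are listed rather than assuming a schema):

```python
import requests

# Raw results file for the layer-24 abliterated model, as linked in the table above.
URL = ("https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/"
       "ymcki/gemma-2-2b-jpn-it-abliterated-24/results_2024-10-25T16-29-46.542899.json")

resp = requests.get(URL, timeout=30)
resp.raise_for_status()
data = resp.json()
print(list(data.keys()))  # inspect the structure before reading individual benchmark scores
```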