---
library_name: transformers
tags: []
---

This was an experiment.
I computed the delta between [mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated](https://huggingface.co/mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated) and [meta-llama/Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Llama-3.1-8B-Instruct) and applied it to the common layers of [ICTNLP/Llama-3.1-8B-Omni](https://huggingface.co/ICTNLP/Llama-3.1-8B-Omni).

The intention was to see whether the Omni model could inherit the abliterated behavior.
The result (this model) is coherent, but it is not fully uncensored. The most likely reason has to do with how the Omni model was trained.
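
The merge described above can be sketched as follows. This is an illustrative sketch, not the actual script used: `apply_delta` and the tiny dummy state dicts are hypothetical stand-ins for the real 8B checkpoints, which you would load with `transformers` / `safetensors` instead.

```python
import torch

def apply_delta(base_sd, tuned_sd, target_sd):
    """For every parameter all three state dicts share, add the
    fine-tuning delta (tuned - base) onto the target's weights."""
    merged = dict(target_sd)
    for key in target_sd:
        if key in base_sd and key in tuned_sd:
            merged[key] = target_sd[key] + (tuned_sd[key] - base_sd[key])
    return merged

# Tiny dummy tensors standing in for the real checkpoints:
# base  = Llama-3.1-8B-Instruct
# tuned = the abliterated variant (here: base shifted by 1.0)
# omni  = Llama-3.1-8B-Omni (only its shared layers receive the delta)
base  = {"layers.0.weight": torch.zeros(2, 2)}
tuned = {"layers.0.weight": torch.ones(2, 2)}
omni  = {"layers.0.weight": torch.full((2, 2), 5.0)}

merged = apply_delta(base, tuned, omni)
print(merged["layers.0.weight"])  # 5.0 + (1.0 - 0.0) = 6.0 everywhere
```

Keys present only in the Omni model (e.g. its speech-specific modules) pass through unchanged, which matches applying the delta only to the common layers.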