Psyonic-Cetacean V2
A smoothed-over version of jebcarter's psyonic-cetacean, uploaded on his behalf.
This is a merge of pre-trained language models created using mergekit.
GGUF quants (standard and iMatrix), courtesy of MarsupialAI, can be found here: https://huggingface.co/MarsupialAI/Psyonic-Cetacean-20b-v2_iMatrix_GGUF
This model was produced with the linear merge method applied to two stack-merged models. The first is jebcarter/psyonic-cetacean-20B (Orca-first; reproduced here so I didn't have to download that model on top of its components). The second is the same recipe with the source models reversed. jebcarter suggested this recipe, so credit goes to him.
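The final step averages the two stacks with equal weights. The toy function below is a minimal sketch of what a linear merge does to each pair of corresponding parameters; it is illustrative only and is not mergekit's actual implementation (the variable names are stand-ins, not real tensors).

```python
def linear_merge(params, weights):
    """Weighted average of corresponding parameters across models.

    params: one flat parameter list per model (all the same length).
    weights: one weight per model (here 0.5 and 0.5).
    """
    total = sum(weights)
    return [
        sum(w * p[i] for w, p in zip(weights, params)) / total
        for i in range(len(params[0]))
    ]

# Stand-in values for a parameter from each stack:
psycet = [1.0, 2.0, 3.0]
psycet_reverse = [3.0, 2.0, 1.0]
merged = linear_merge([psycet, psycet_reverse], [0.5, 0.5])
print(merged)  # [2.0, 2.0, 2.0]
```

With equal weights this is just the element-wise mean of the two stacks, which is why reversing the recipe and averaging "smooths over" the seams of either individual stack merge.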
The following models were included in the merge:
* microsoft/Orca-2-13b
* KoboldAI/LLaMA2-13B-Psyfighter2
* TheBloke/Llama-2-13B-fp16 (base model)
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: microsoft/Orca-2-13b
    parameters:
      weight: 1.0
merge_method: task_arithmetic
base_model: TheBloke/Llama-2-13B-fp16
dtype: float16
name: FlatOrca2
---
slices:
  - sources:
      - model: FlatOrca2
        layer_range: [0, 16]
  - sources:
      - model: KoboldAI/LLaMA2-13B-Psyfighter2
        layer_range: [8, 24]
  - sources:
      - model: FlatOrca2
        layer_range: [17, 32]
  - sources:
      - model: KoboldAI/LLaMA2-13B-Psyfighter2
        layer_range: [25, 40]
merge_method: passthrough
dtype: float16
name: Psycet
---
slices:
  - sources:
      - model: KoboldAI/LLaMA2-13B-Psyfighter2
        layer_range: [0, 16]
  - sources:
      - model: FlatOrca2
        layer_range: [8, 24]
  - sources:
      - model: KoboldAI/LLaMA2-13B-Psyfighter2
        layer_range: [17, 32]
  - sources:
      - model: FlatOrca2
        layer_range: [25, 40]
merge_method: passthrough
dtype: float16
name: Psycet-Reverse
---
models:
  - model: Psycet
    parameters:
      weight: 0.5
  - model: Psycet-Reverse
    parameters:
      weight: 0.5
merge_method: linear
dtype: float16
```
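Each passthrough stack interleaves half-open layer ranges from the two donor models. The sketch below (model names are shorthand for the identifiers in the config, not real repo IDs) just counts how many layers the Psycet stack ends up with, assuming mergekit's half-open `layer_range` semantics:

```python
# Psycet's slice list, with layer_range treated as half-open [start, end):
slices = [
    ("FlatOrca2", range(0, 16)),     # 16 layers
    ("Psyfighter2", range(8, 24)),   # 16 layers
    ("FlatOrca2", range(17, 32)),    # 15 layers
    ("Psyfighter2", range(25, 40)),  # 15 layers
]

# Expand every range into (source model, source layer index) pairs:
stack = [(model, i) for model, rng in slices for i in rng]
print(len(stack))  # 62 layers in the resulting stack
```

Psycet-Reverse expands to the same 62-layer shape with the donors swapped, which is what lets the final linear merge average the two stacks parameter-for-parameter.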