This is a merge of pre-trained language models created using mergekit.
DO NOT USE IN ITS CURRENT FORM!!!
It's spitting out random gibberish.
This model was merged using the SLERP merge method.
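Unlike plain linear averaging, SLERP interpolates each pair of weight tensors along the shortest arc of a hypersphere, which preserves the geometry of the weights better than a straight-line blend. Here is a minimal sketch of the idea (illustrative only, not mergekit's exact implementation):

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor,
          eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()

    # Angle between the two weight vectors on the unit hypersphere.
    v0_unit = v0_flat / (v0_flat.norm() + eps)
    v1_unit = v1_flat / (v1_flat.norm() + eps)
    dot = torch.clamp(torch.dot(v0_unit, v1_unit), -1.0, 1.0)
    omega = torch.arccos(dot)

    if omega.abs() < eps:
        # Nearly colinear vectors: plain lerp is numerically safer.
        mixed = (1.0 - t) * v0_flat + t * v1_flat
    else:
        # Interpolate along the great-circle arc between the two vectors.
        sin_omega = torch.sin(omega)
        mixed = (torch.sin((1.0 - t) * omega) / sin_omega) * v0_flat \
              + (torch.sin(t * omega) / sin_omega) * v1_flat

    return mixed.reshape(v0.shape).to(v0.dtype)

# Example: blend a single tensor pair halfway between the two models.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)
merged = slerp(0.5, a, b)
```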
The following models were included in the merge:

- spow12/ChatWaifu_12B_v2.0
- Gryphe/Pantheon-RP-1.6.1-12b-Nemo
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: spow12/ChatWaifu_12B_v2.0
        layer_range: [0, 32]
      - model: Gryphe/Pantheon-RP-1.6.1-12b-Nemo
        layer_range: [0, 32]
merge_method: slerp
base_model: Gryphe/Pantheon-RP-1.6.1-12b-Nemo
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
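The `t` lists define an interpolation gradient across the layer stack (in mergekit's slerp, t = 0 roughly keeps the base model's weights and t = 1 takes the other model's): self-attention and MLP tensors get opposing gradients, and all remaining tensors use a flat 0.5. A config like this is typically run with mergekit's CLI, e.g. `mergekit-yaml config.yml ./output-model`. Note that Mistral-Nemo 12B models have 40 transformer layers, so `layer_range: [0, 32]` drops the last 8 layers of each source model, which is a plausible cause of the gibberish output warned about above.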