---
base_model:
  - OmnicromsBrain/NeuralStar_Fusion-7B
  - MrRobotoAI/Loki-v5.2
  - OmnicromsBrain/Eros_Scribe-7b
  - OmnicromsBrain/StoryFusion-7B
  - MrRobotoAI/Frejya-0.8e
  - OmnicromsBrain/EverythingBagel-DPO-7B
  - MrRobotoAI/test0011
  - OmnicromsBrain/ToppyCox-7B
library_name: transformers
tags:
  - mergekit
  - merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the linear [DARE](https://arxiv.org/abs/2311.03099) merge method, with [MrRobotoAI/test0011](https://huggingface.co/MrRobotoAI/test0011) as the base model.
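
For intuition, the sketch below (plain PyTorch, not mergekit's actual implementation) illustrates what a linear DARE merge does to a single parameter tensor: each model's delta from the base is randomly dropped at a rate of `1 - density`, the surviving entries are rescaled by `1 / density`, and the rescaled deltas are added to the base using the per-model weights, left unnormalized to match `normalize: false`. All names here are illustrative.

```python
import torch

def dare_linear_merge(base, finetuned, weights, density=0.9):
    """Illustrative sketch of a linear DARE merge for one parameter tensor.

    base      -- the base model's tensor for this parameter
    finetuned -- list of the same tensor taken from each merged model
    weights   -- per-model weights (used as-is, matching `normalize: false`)
    density   -- fraction of each delta that is kept (0.9 in this config)
    """
    merged = base.clone()
    for theta, w in zip(finetuned, weights):
        delta = theta - base                                   # task vector
        mask = torch.bernoulli(torch.full_like(delta, density))
        delta = delta * mask / density                         # drop and rescale (DARE)
        merged += w * delta                                    # weighted linear combination
    return merged

if __name__ == "__main__":
    base = torch.zeros(4)
    finetuned = [torch.ones(4), torch.full((4,), 2.0)]
    print(dare_linear_merge(base, finetuned, weights=[0.25, 0.25]))
```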

### Models Merged

The following models were included in the merge:

* [OmnicromsBrain/Eros_Scribe-7b](https://huggingface.co/OmnicromsBrain/Eros_Scribe-7b)
* [OmnicromsBrain/NeuralStar_Fusion-7B](https://huggingface.co/OmnicromsBrain/NeuralStar_Fusion-7B)
* [OmnicromsBrain/StoryFusion-7B](https://huggingface.co/OmnicromsBrain/StoryFusion-7B)
* [OmnicromsBrain/EverythingBagel-DPO-7B](https://huggingface.co/OmnicromsBrain/EverythingBagel-DPO-7B)
* [OmnicromsBrain/ToppyCox-7B](https://huggingface.co/OmnicromsBrain/ToppyCox-7B)
* [MrRobotoAI/Loki-v5.2](https://huggingface.co/MrRobotoAI/Loki-v5.2)
* [MrRobotoAI/Frejya-0.8e](https://huggingface.co/MrRobotoAI/Frejya-0.8e)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: OmnicromsBrain/Eros_Scribe-7b
    parameters:
      weight: 0.1
      density: 0.9
  - model: OmnicromsBrain/NeuralStar_Fusion-7B
    parameters:
      weight: 0.1
      density: 0.9
  - model: OmnicromsBrain/StoryFusion-7B
    parameters:
      weight: 0.1
      density: 0.9
  - model: OmnicromsBrain/EverythingBagel-DPO-7B
    parameters:
      weight: 0.1
      density: 0.9
  - model: OmnicromsBrain/ToppyCox-7B
    parameters:
      weight: 0.1
      density: 0.9
  - model: MrRobotoAI/Loki-v5.2
    parameters:
      weight: 0.25
      density: 0.9
  - model: MrRobotoAI/Frejya-0.8e
    parameters:
      weight: 0.25
      density: 0.9

merge_method: dare_linear
base_model: MrRobotoAI/test0011
parameters:
  normalize: false
dtype: float16
```
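
As a usage sketch, and assuming this repository is published as `MrRobotoAI/Loki-v6` with a standard `transformers` layout, the merged model can be loaded like any other causal LM:

```python
# Minimal loading sketch; the model id is assumed from this repository's name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MrRobotoAI/Loki-v6"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

prompt = "Write the opening line of a heist story."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The merge itself can be reproduced by saving the configuration above to a file and running it through mergekit (for example with its `mergekit-yaml` command).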