---
base_model: []
tags:
  - mergekit
  - merge
  - Etheria
---

# Steelskull/Etheria-55b-v0.1


## Merge Details

An attempt to make a functional Goliath-style merge, creating an [Etheria] 55b-200k from two Yi-34B-200K models.

Due to the merge, it 'theoretically' should have a 200k context length, but I recommend starting at 32k and moving up, as it is unknown (at this time) what the merge has done to the context length.
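A minimal loading sketch, assuming the standard `transformers` API and this repo id (adjust dtype and `device_map` to your hardware); it caps total context at 32k as recommended above:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Steelskull/Etheria-55b-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype below
    device_map="auto",           # requires accelerate
)

prompt = "Write a short story about two models becoming one:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep total context (prompt + new tokens) within 32k until the merged
# model's usable context length has been verified empirically.
max_context = 32_768
outputs = model.generate(
    **inputs,
    max_new_tokens=min(512, max_context - inputs["input_ids"].shape[-1]),
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```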

This is a merge of both VerA and VerB of Etheria-55b (their numbers were surprisingly good). I then created a sacrificial 55B out of the most performant Yi-34B-200K model, performed a DARE TIES merge, and equalized the model into its current state.

### Merge Method

This model was merged using the DARE TIES merge method, with Merged-Etheria-55b as the base.
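Conceptually, DARE randomly drops a fraction of each model's parameter deltas from the base (keeping roughly `density` of them) and rescales the survivors, while TIES resolves sign conflicts before adding the merged delta back to the base. A toy single-tensor sketch of that idea, not mergekit's actual implementation (all names here are illustrative):

```python
import torch

def dare_ties(base: torch.Tensor,
              deltas: list[torch.Tensor],
              weights: list[float],
              density: float) -> torch.Tensor:
    """Toy DARE TIES merge of one tensor; mergekit operates per-parameter,
    with per-layer weights like the gradient in the config below."""
    sparse = []
    for delta, w in zip(deltas, weights):
        # DARE: keep ~density of the entries, rescale to preserve expectation.
        mask = (torch.rand_like(delta) < density).to(delta.dtype)
        sparse.append(w * delta * mask / density)
    stacked = torch.stack(sparse)
    # TIES: elect a majority sign per entry, drop disagreeing deltas.
    sign = torch.sign(stacked.sum(dim=0))
    agree = (torch.sign(stacked) == sign).to(stacked.dtype)
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1.0)
    return base + merged
```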

### Configuration

The following YAML configuration was used to produce this model:


```yaml
base_model: Merged-Etheria-55b
models:
  - model: Sacr-Etheria-55b
    parameters:
      weight: [0.22, 0.113, 0.113, 0.113, 0.113, 0.113]
      density: 0.61
  - model: Merged-Etheria-55b
    parameters:
      weight: [0.22, 0.113, 0.113, 0.113, 0.113, 0.113]
      density: 0.61
merge_method: dare_ties
tokenizer_source: union
parameters:
  int8_mask: true
dtype: bfloat16
```
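To reproduce the merge, a sketch assuming mergekit's Python entry points (`MergeConfiguration`, `run_merge`) as documented in recent versions, with the YAML above saved as a hypothetical `etheria-55b.yml`:

```python
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the merge recipe from the YAML configuration above.
with open("etheria-55b.yml", "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    out_path="./Etheria-55b-v0.1",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,   # honors tokenizer_source: union
        lazy_unpickle=True,    # lower peak memory when loading shards
    ),
)
```

Note that all three component models must be available locally or on the Hub, and merging three 55B-class checkpoints requires substantial RAM or VRAM.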