---
base_model:
  - NeverSleep/Nethena-20B
  - Undi95/PsyMedRP-v1-20B
library_name: transformers
tags:
  - mergekit
  - merge
---

# PsyMedNethena-20B

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
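
Assuming the repo id is `Elfrino/PsyMedNethena-20B` (taken from this model card's location; adjust if the weights live elsewhere), the merged model loads like any other `transformers` causal LM:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Elfrino/PsyMedNethena-20B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",
)

inputs = tokenizer("The patient presents with", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```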

## Merge Details

### Merge Method

This model was merged using the [SLERP](https://en.wikipedia.org/wiki/Slerp) merge method, with [Undi95/PsyMedRP-v1-20B](https://huggingface.co/Undi95/PsyMedRP-v1-20B) as the base model.
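
For intuition, SLERP interpolates between two weight tensors along the great-circle arc between their directions rather than along a straight line, which preserves the magnitude structure of the weights better than plain averaging. The sketch below is a minimal illustration of the idea in PyTorch, not mergekit's exact implementation:

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t = 0 returns v0, t = 1 returns v1; intermediate values move along
    the arc between the two tensors' directions. Falls back to linear
    interpolation when the tensors are nearly parallel, where the
    spherical formula is numerically unstable.
    """
    shape = v0.shape
    a, b = v0.flatten().float(), v1.flatten().float()
    # Angle between the tensors, treated as high-dimensional vectors.
    cos_omega = torch.dot(a / (a.norm() + eps), b / (b.norm() + eps)).clamp(-1.0, 1.0)
    omega = torch.arccos(cos_omega)
    if omega.item() < 1e-4:  # nearly parallel: lerp is a safe approximation
        out = (1.0 - t) * a + t * b
    else:
        sin_omega = torch.sin(omega)
        out = (torch.sin((1.0 - t) * omega) / sin_omega) * a \
            + (torch.sin(t * omega) / sin_omega) * b
    return out.reshape(shape).to(v0.dtype)
```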

### Models Merged

The following models were included in the merge:

* [NeverSleep/Nethena-20B](https://huggingface.co/NeverSleep/Nethena-20B)
* [Undi95/PsyMedRP-v1-20B](https://huggingface.co/Undi95/PsyMedRP-v1-20B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: Undi95/PsyMedRP-v1-20B
        layer_range: [0, 62]  # PsyMedRP-v1-20B has 62 layers
      - model: NeverSleep/Nethena-20B
        layer_range: [0, 62]  # Nethena-20B has 62 layers
merge_method: slerp
base_model: Undi95/PsyMedRP-v1-20B  # favor PsyMedRP's reasoning as the base
parameters:
  t:
    - filter: self_attn
      value: [0.3, 0.6, 0.9, 0.6, 0.3]  # smooth gradient of focus across the attention layers
    - value: 0.639  # default t for all remaining parameters
dtype: bfloat16
```
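
The five-element `value` list under `filter: self_attn` is a gradient: mergekit stretches the anchor values across the layers the filter matches, so t ramps from 0.3 at the first attention layers up to 0.9 mid-network and back down to 0.3. Under mergekit's convention, t = 0 keeps the base model's (PsyMedRP's) weights and t = 1 takes Nethena-20B's. A rough illustration of how such a schedule expands, assuming simple linear interpolation between the anchor points:

```python
import numpy as np

def expand_gradient(anchors: list[float], num_layers: int) -> np.ndarray:
    """Stretch a short anchor list into one t value per layer
    by linearly interpolating between the anchor points."""
    anchor_pos = np.linspace(0.0, 1.0, num=len(anchors))
    layer_pos = np.linspace(0.0, 1.0, num=num_layers)
    return np.interp(layer_pos, anchor_pos, anchors)

# One t per self_attn layer: 0.3 -> 0.9 at the middle -> 0.3
print(expand_gradient([0.3, 0.6, 0.9, 0.6, 0.3], 62).round(3))
```

All other parameters fall back to the flat `t: 0.639`, a slight lean toward Nethena-20B wherever the attention-specific gradient does not apply.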