---
license: apache-2.0
tags:
  - merge
  - mergekit
  - OpenPipe/mistral-ft-optimized-1218
  - mlabonne/NeuralHermes-2.5-Mistral-7B
---

# Macaron-7B-v2

Macaron-7B-v2 is a merge of the following models, made with [mergekit](https://github.com/cg123/mergekit):

* [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218)
* [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B)

## 🧩 Configuration

```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters necessary for base model
  - model: OpenPipe/mistral-ft-optimized-1218
    parameters:
      density: 0.5
      weight: 0.5
  - model: mlabonne/NeuralHermes-2.5-Mistral-7B
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
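The `ties` method merges each model's *task vector* (its weights minus the base model's) while resolving sign conflicts: each delta is first trimmed to its highest-magnitude entries (controlled by `density`), a per-parameter sign is elected across models, and only deltas agreeing with that sign are combined, scaled by `weight` (with `normalize: true` dividing by the sum of contributing weights). The toy sketch below illustrates this on flat lists of floats; it is purely illustrative and is not mergekit's actual implementation.

```python
def ties_merge(base, deltas, densities, weights, normalize=True):
    """Illustrative TIES-style merge of task vectors into `base` (flat lists)."""
    # 1. Trim: per model, keep only the top-`density` fraction of entries
    #    by magnitude; zero out the rest.
    trimmed = []
    for delta, density in zip(deltas, densities):
        k = max(1, round(density * len(delta)))
        keep = set(sorted(range(len(delta)),
                          key=lambda i: abs(delta[i]), reverse=True)[:k])
        trimmed.append([delta[i] if i in keep else 0.0
                        for i in range(len(delta))])

    merged = []
    for i in range(len(base)):
        # 2. Elect a per-parameter sign from the weighted sum of trimmed deltas.
        total = sum(w * t[i] for w, t in zip(weights, trimmed))
        sign = 1.0 if total >= 0 else -1.0
        # 3. Combine only the deltas that agree with the elected sign,
        #    normalizing by the sum of contributing weights.
        num = sum(w * t[i] for w, t in zip(weights, trimmed) if t[i] * sign > 0)
        den = sum(w for w, t in zip(weights, trimmed) if t[i] * sign > 0)
        step = num / den if (normalize and den) else num
        merged.append(base[i] + step)
    return merged


base = [1.0, 1.0, 1.0, 1.0]
deltas = [
    [0.4, -0.2, 0.1, 0.0],   # model A's task vector
    [0.3, 0.5, -0.05, 0.2],  # model B's task vector
]
merged = ties_merge(base, deltas, densities=[0.5, 0.5], weights=[0.5, 0.3])
print(merged)
```

In practice, a config like the one above is handed to mergekit's CLI (for example `mergekit-yaml config.yml ./merged-model`), which applies this procedure tensor by tensor across the listed checkpoints.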