# The Mayonnaise

A collection of 7B models made with mergekit.
This model is a TIES merge of pre-trained models, created with mergekit and based on mistralai/Mistral-7B-v0.1. It was created using the recipe detailed in this article: *The Mayonnaise: Rank First on the Open LLM Leaderboard with TIES-Merging*.
Created with mergekit using the following configuration:

```yaml
models:
  - model: mncai/mistral-7b-dpo-v5
    # no parameters necessary for base model
  - model: kaitchup/Mayonnaise-4in1-02
    parameters:
      density: 0.5
      weight: 0.3
  - model: BarryFutureman/NeuralTurdusVariant1-7B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: mncai/mistral-7b-dpo-v5
parameters:
  normalize: true
dtype: float16
```
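In this TIES configuration, `density` is the fraction of each model's delta parameters that are retained, and `weight` scales each model's contribution to the merge. Below is a minimal sketch of how such a merge can be reproduced with mergekit's Python API, assuming the configuration above is saved as `config.yaml` and that `./merged-model` is the desired output directory (both names are placeholders); install mergekit first with `pip install mergekit`.

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the TIES configuration shown above (assumed saved as config.yaml).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge and write the merged model to ./merged-model (placeholder path).
run_merge(
    merge_config,
    out_path="./merged-model",
    options=MergeOptions(
        cuda=False,            # set to True to merge on GPU
        copy_tokenizer=True,   # copy the base model's tokenizer into the output
    ),
)
```

The same merge can also be run from the command line with `mergekit-yaml config.yaml ./merged-model`.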
Detailed results can be found on the Open LLM Leaderboard.
| Metric | Value |
|---|---|
| Avg. | 74.94 |
| AI2 Reasoning Challenge (25-shot) | 73.46 |
| HellaSwag (10-shot) | 88.46 |
| MMLU (5-shot) | 64.88 |
| TruthfulQA (0-shot) | 69.19 |
| Winogrande (5-shot) | 84.29 |
| GSM8k (5-shot) | 69.37 |
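The merged model can be used like any other causal language model with transformers. The sketch below is a minimal example; the repo id is a placeholder, since this card does not state the model's exact Hub name, and should be replaced with the actual repository.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id: replace with this model's actual Hub repository.
model_id = "kaitchup/Mayonnaise-merge"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "The main advantage of merging language models is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```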