---
license: apache-2.0
datasets:
- Open-Orca/SlimOrca
tags:
- mistral
- finetune
---
This model is a merge of teknium/OpenHermes-2.5-Mistral-7B and Intel/neural-chat-7b-v3-1, produced with the TIES merge method.

## Weights

## Density
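The weights and density above are the per-model TIES hyperparameters. For reference, a merge like this is typically produced with mergekit; below is a minimal, hypothetical sketch of such a run. The base model and the weight/density values are illustrative placeholders, not necessarily the values used for this model.

```python
# Hypothetical sketch: reproducing a TIES merge of the two source models with
# mergekit (https://github.com/arcee-ai/mergekit). The weight/density values
# are placeholders; substitute the real hyperparameters.
import subprocess
import textwrap

config = textwrap.dedent("""\
    merge_method: ties
    base_model: mistralai/Mistral-7B-v0.1  # assumed base; both sources are Mistral-7B finetunes
    models:
      - model: teknium/OpenHermes-2.5-Mistral-7B
        parameters:
          weight: 0.5   # placeholder
          density: 0.5  # placeholder
      - model: Intel/neural-chat-7b-v3-1
        parameters:
          weight: 0.5   # placeholder
          density: 0.5  # placeholder
    parameters:
      normalize: true
    dtype: float16
""")

with open("ties-merge.yml", "w") as f:
    f.write(config)

# mergekit-yaml <config> <output-dir> is mergekit's standard CLI entry point.
subprocess.run(["mergekit-yaml", "ties-merge.yml", "./merged-model"], check=True)
```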
## Prompt Templates
You can use these prompt templates, but I recommend using ChatML.
ChatML (OpenHermes-2.5-Mistral-7B):

```
<|im_start|>system
{system}<|im_end|>
<|im_start|>user
{user}<|im_end|>
<|im_start|>assistant
{assistant}<|im_end|>
```
Intel/neural-chat-7b-v3-1:

```
### System:
{system}
### User:
{usr}
### Assistant:
```
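Since ChatML is the recommended format, here is a minimal sketch of chatting with the model through Hugging Face transformers using the template above. The repo id is a placeholder for this model's id, and the generation settings are illustrative.

```python
# Minimal sketch, assuming this model's repo id and enough GPU memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/this-merged-model"  # placeholder: replace with this repo's id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Build the ChatML prompt exactly as shown above, ending with the assistant
# header so the model writes the reply.
prompt = (
    "<|im_start|>system\n"
    "You are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "Explain TIES merging in two sentences.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# <|im_end|> closes a ChatML turn; OpenHermes-style tokenizers register it as
# a special token (assumption), so use it to stop generation.
im_end = tokenizer.convert_tokens_to_ids("<|im_end|>")
output = model.generate(**inputs, max_new_tokens=256, eos_token_id=im_end)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```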
## Quantized versions

Quantized versions of this model are available thanks to TheBloke:
- GPTQ
- GGUF
- AWQ
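For CPU or low-VRAM use, the GGUF files can be run with llama-cpp-python. A minimal sketch; the file path is a placeholder for whichever quantization level you download:

```python
# Minimal sketch, assuming a downloaded GGUF file and llama-cpp-python installed.
from llama_cpp import Llama

llm = Llama(model_path="./model.Q4_K_M.gguf", n_ctx=4096)  # placeholder path

# Use the ChatML template from above; stop at the turn terminator.
prompt = (
    "<|im_start|>user\n"
    "Summarize TIES merging in one sentence.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
out = llm(prompt, max_tokens=128, stop=["<|im_end|>"])
print(out["choices"][0]["text"])
```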
## Open LLM Leaderboard Evaluation Results

Detailed results can be found here.
| Metric | Value |
|---|---|
| Avg. | 58.6 |
| ARC (25-shot) | 66.55 |
| HellaSwag (10-shot) | 84.47 |
| MMLU (5-shot) | 63.34 |
| TruthfulQA (0-shot) | 61.22 |
| Winogrande (5-shot) | 78.37 |
| GSM8K (5-shot) | 23.58 |
| DROP (3-shot) | 32.66 |