This model is a finetune of jondurbin's excellent bagel model. It has been trained with new datasets and a new technique, which we will share with the community soon. This model does not use any form of merging.
Evaluation Results
| Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
|---|---|---|---|---|---|---|
| 77.29 | 74.23 | 86.76 | 76.66 | 70.22 | 83.66 | 72.18 |
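The Average column is the arithmetic mean of the six benchmark scores. A quick sanity check, with the scores copied from the table above:

```python
# Verify that the reported Average matches the mean of the six benchmarks.
scores = {
    "ARC": 74.23,
    "HellaSwag": 86.76,
    "MMLU": 76.66,
    "TruthfulQA": 70.22,
    "Winogrande": 83.66,
    "GSM8K": 72.18,
}
average = sum(scores.values()) / len(scores)
print(f"{average:.2f}")  # matches the reported 77.29 (mean is 77.285)
```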
Contamination Results
With reference model jondurbin/bagel-34b-v0.2:
| ARC | TruthfulQA | GSM8K |
|---|---|---|
| 0.08 | 0.38 | 0.88 |
Vanilla quantization by nold of the original model abacusai/Smaug-34B-v0.1, created using the llm-quantizer pipeline - 465d7970507dcaac4cb50221157a68c840965774
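A minimal usage sketch with llama.cpp, assuming you download one of the GGUF files from this repo. The `.gguf` filename below is an assumption; substitute the actual file name of the quantization level you choose.

```shell
# Hypothetical filename -- use the actual quantization file from the repo.
huggingface-cli download nold/Smaug-34B-v0.1-GGUF \
    Smaug-34B-v0.1_Q4_K_M.gguf --local-dir .

# Run an interactive completion with llama.cpp's CLI.
./llama-cli -m Smaug-34B-v0.1_Q4_K_M.gguf \
    -p "Write a haiku about quantization." -n 64
```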
Model tree for nold/Smaug-34B-v0.1-GGUF
- Base model: jondurbin/bagel-34b-v0.2