This model is part of the **main releases** collection: powerful small models aimed at becoming good at chat and text tasks while avoiding the use of system prompts.
This model improves overall performance at the expense of a small degradation on Winogrande. As with all palmer models, it is biased toward responding to questions without needing any specific prompt; feel free to further fine-tune it for your specific use case.
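A minimal inference sketch with the `transformers` library, showing plain-prompt usage without any system prompt or chat template. The repo id `appvoid/palmer-004` and the generation settings are assumptions, not stated in this card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "appvoid/palmer-004"  # assumed Hub repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# No system prompt or chat template: just feed plain text to the model.
prompt = "What is the capital of France?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```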
| Model | MMLU | ARC-C | HellaSwag | PIQA | Winogrande | Average |
|---|---|---|---|---|---|---|
| tinyllama-3t | 0.2577 | 0.3029 | 0.5935 | 0.7329 | 0.5959 | 0.4966 |
| palmer-004-old | 0.2601 | 0.3456 | 0.6138 | 0.7443 | 0.6511 | 0.5229 |
| palmer-004 | 0.2661 | 0.3490 | 0.6173 | 0.7481 | 0.6417 | 0.5244 |
Even though palmer-003 has only a 2k context window, it scores 0.5257 on average, so if you don't need the larger 32k context you are better off with palmer-003.
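The card does not say how these scores were produced; the task names match the standard lm-evaluation-harness tasks, so one hedged way to reproduce a comparable table (the repo id and the use of the harness are both assumptions) could look like this:

```python
# Sketch: run the five benchmarks from the table with lm-evaluation-harness,
# assuming the model is hosted as "appvoid/palmer-004".
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=appvoid/palmer-004",
    tasks=["mmlu", "arc_challenge", "hellaswag", "piqa", "winogrande"],
    batch_size=8,
)

# Print per-task metrics; the exact metric keys vary by task (e.g. "acc,none").
for task, metrics in results["results"].items():
    print(task, metrics)
```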