---
base_model:
- unsloth/Mistral-Small-Instruct-2409
- unsloth/Mistral-Small-Instruct-2409+rAIfle/Acolyte-LORA
library_name: transformers
tags:
- mergekit
- merge
---
# Acolyte-22B
A LoRA trained on a bunch of random datasets on top of Mistral-Small-Instruct-2409, then SLERP-merged onto the base model at weight 0.5. Decent enough for its size. Check the LoRA repo for dataset info.

Use the Mistral V2 & V3 template.
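A merge like this can be sketched as a mergekit SLERP config. This is a hedged reconstruction from the card's description, not the author's actual recipe: the model identifiers and `t: 0.5` follow the text above, while `dtype` and the overall layout are assumptions.

```yaml
# Hypothetical mergekit config sketch.
# `t: 0.5` matches the stated SLERP weight; dtype is an assumption.
models:
  - model: unsloth/Mistral-Small-Instruct-2409
  - model: unsloth/Mistral-Small-Instruct-2409+rAIfle/Acolyte-LORA
merge_method: slerp
base_model: unsloth/Mistral-Small-Instruct-2409
parameters:
  t: 0.5
dtype: bfloat16
```

The `model+lora` notation tells mergekit to apply the LoRA adapter to the base weights before merging, so the SLERP interpolates between the plain base model and the LoRA-tuned variant.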