
# Air-Striker-Mixtral-8x7B-ZLoss

This is an experimental model, trained using a config and the Transformers/Axolotl forks provided by Doctor-Shotgun.

The model was fine-tuned from Mixtral-8x7B-v0.1 on the airoboros-3.2 dataset for 4 epochs, using the ChatML prompt format at 8K context length.
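Since the model was trained on the ChatML prompt format, inference prompts should wrap each turn in `<|im_start|>`/`<|im_end|>` tokens. A minimal sketch (the helper function and message contents here are illustrative, not part of the model release):

```python
# Sketch: render a conversation in the ChatML format this model was trained on.
# The system/user messages below are placeholders.
def chatml_prompt(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave the assistant turn open so the model generates the reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

Generation should stop on the `<|im_end|>` token so the model does not run past the end of its turn.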


Dataset used to train LoneStriker/Air-Striker-Mixtral-8x7B-ZLoss-2.4bpw-h6-exl2: airoboros-3.2