Tags: Text Generation · Adapters · Safetensors · mixtral

Model Card for Swisslex/Mixtral-Orca-v0.1

Model Details

Model Description

Fine-tuned version of mistralai/Mixtral-8x7B-v0.1, trained with supervised fine-tuning (SFT) followed by direct preference optimization (DPO).

  • Developed by: Swisslex
  • Language(s) (NLP): English, German, French, Italian, Spanish
  • License: apache-2.0
  • Finetuned from model: mistralai/Mixtral-8x7B-v0.1
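
This repository ships adapter weights rather than a merged checkpoint, so using the model typically means loading the Mixtral base model and attaching the adapter on top. A minimal sketch with transformers and peft, assuming the adapter is PEFT-compatible; the bfloat16 dtype matches the tensor type listed below, while `device_map="auto"` is an assumption for convenience, not something specified by this card:

```python
# Hypothetical usage sketch: attach the Swisslex adapter to the Mixtral base model.
# Requires `transformers`, `peft`, `torch`, and enough GPU memory for a 46.7B-param model.
BASE_MODEL = "mistralai/Mixtral-8x7B-v0.1"
ADAPTER = "Swisslex/Mixtral-Orca-v0.1"


def load_model():
    """Load the base model in bfloat16 and wrap it with the adapter weights."""
    import torch
    from peft import PeftModel
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
    base = AutoModelForCausalLM.from_pretrained(
        BASE_MODEL,
        torch_dtype=torch.bfloat16,  # matches the BF16 tensor type of the checkpoint
        device_map="auto",           # assumption: shard across available devices
    )
    model = PeftModel.from_pretrained(base, ADAPTER)
    return tokenizer, model
```

After loading, the returned pair can be used like any causal LM, e.g. `tokenizer(prompt, return_tensors="pt")` followed by `model.generate(...)`.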
  • Format: Safetensors
  • Model size: 46.7B params
  • Tensor type: BF16
