---
license: apache-2.0
tags:
  - moe
language:
  - en
library_name: transformers
---

# Model Card for Maixtchup-4x7b

This is a mixture of experts (MoE) created with mergekit, using mistralai/Mistral-7B-v0.1 as the base model.
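This card does not list the expert checkpoints that were merged. As a rough illustration only, a `mergekit-moe` configuration for a 4x7B mixture typically looks like the sketch below; every `source_model` and `positive_prompts` entry here is a placeholder, not the actual recipe for Maixtchup-4x7b:

```yaml
# Hypothetical mergekit-moe config for a 4x7B mixture (placeholders only).
base_model: mistralai/Mistral-7B-v0.1
gate_mode: cheap_embed   # router initialization strategy
dtype: bfloat16
experts:
  - source_model: mistralai/Mistral-7B-v0.1   # replace with expert checkpoint 1
    positive_prompts: ["general conversation"]
  - source_model: mistralai/Mistral-7B-v0.1   # replace with expert checkpoint 2
    positive_prompts: ["code"]
  - source_model: mistralai/Mistral-7B-v0.1   # replace with expert checkpoint 3
    positive_prompts: ["math"]
  - source_model: mistralai/Mistral-7B-v0.1   # replace with expert checkpoint 4
    positive_prompts: ["reasoning"]
```

A config like this is consumed by the `mergekit-moe` command, which writes the merged MoE checkpoint to an output directory.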

## Model Details

### Model Description

The method and code used to create this model are explained in this article: Maixtchup: Make Your Own Mixture of Experts with Mergekit

## Uses

This model is only pre-trained and has not been fine-tuned or aligned for chat. Before use, you may fine-tune it with adapters (e.g., LoRA) via PEFT.

## Model Card Contact

The Kaitchup