
# TinyMix-8x1b

This model is a Mixture-of-Experts (MoE) built from 8 copies of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T as its experts.
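A quick sanity check on the reported 6.43B parameter count: in a Mixtral-style MoE, only the MLP blocks are replicated per expert, while attention layers and embeddings stay shared. The sketch below estimates both the dense and 8-expert sizes from TinyLlama's published architecture numbers (22 layers, hidden size 2048, intermediate size 5632, 4 KV heads of dim 64, 32k vocab); the exact layout of the router here is an assumption.

```python
# Back-of-the-envelope parameter count for an 8-expert MoE built from
# TinyLlama-1.1B. Architecture numbers come from the TinyLlama config:
# 22 layers, hidden 2048, intermediate 5632, 4 KV heads (head_dim 64),
# vocab 32000. Assumption: only the MLP is replicated per expert, with a
# small per-layer linear router (Mixtral-style); norms are ignored.
HIDDEN, INTERMEDIATE, LAYERS, VOCAB = 2048, 5632, 22, 32000
KV_HEADS, HEAD_DIM, EXPERTS = 4, 64, 8

embeddings = 2 * VOCAB * HIDDEN                 # input embeddings + lm_head
attn = 2 * HIDDEN * HIDDEN + 2 * HIDDEN * KV_HEADS * HEAD_DIM  # q,o + k,v (GQA)
mlp = 3 * HIDDEN * INTERMEDIATE                 # gate, up, down projections
router = HIDDEN * EXPERTS                       # per-layer expert gate

dense_total = embeddings + LAYERS * (attn + mlp)
moe_total = embeddings + LAYERS * (attn + EXPERTS * mlp + router)

print(f"dense: {dense_total/1e9:.2f}B, moe: {moe_total/1e9:.2f}B")
# → dense: 1.10B, moe: 6.43B
```

The estimate lands on 6.43B, matching the size reported below, which is consistent with the shared-attention, replicated-MLP construction.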

The MoE routing is untrained, so this model will likely perform worse than the dense version. Training will begin soon.

Idea by eastwind, who applied the same approach to the chat version of the model.
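The card does not say how the merge was performed, but a common way to clone a dense checkpoint into an untrained MoE is mergekit's `mergekit-moe` tool. A hypothetical config for this kind of merge (model names from this card; everything else is an assumption, including random gating since the router is untrained) might look like:

```yaml
# Hypothetical mergekit-moe config: 8 identical TinyLlama experts,
# randomly initialized router (untrained gating).
base_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
gate_mode: random
dtype: bfloat16
experts:
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
```

With `gate_mode: random` the routers start from random weights, which matches an untrained MoE that is then fine-tuned to learn routing.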

Model size: 6.43B params (BF16, Safetensors)