This model is a Mixture of Experts (MoE) consisting of 8 copies of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T as its experts.

The MoE is untrained and will likely perform worse than the dense base model. Training will start very soon.

Credit for the idea goes to eastwind, who applied it to the chat version of the model.
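The card does not say how the experts were assembled, so the following is only a minimal sketch of one plausible way to build such a model with Hugging Face `transformers`: clone the dense TinyLlama MLP into each of 8 Mixtral-style experts, copy the shared weights across, and leave the routers at their random initialization (which is what makes the resulting MoE untrained). The choice of the Mixtral classes, all wiring details, and the output path are assumptions for illustration, not a description of how this checkpoint was actually produced.

```python
# Hypothetical sketch (not the actual build script): assemble an untrained
# 8-expert MoE from TinyLlama by reusing its dense weights and cloning its
# MLP into every Mixtral expert. Routers keep their random init.
import torch
from transformers import AutoModelForCausalLM, MixtralConfig, MixtralForCausalLM

dense = AutoModelForCausalLM.from_pretrained(
    "TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T",
    torch_dtype=torch.bfloat16,
)
cfg = dense.config

moe = MixtralForCausalLM(
    MixtralConfig(
        vocab_size=cfg.vocab_size,
        hidden_size=cfg.hidden_size,
        intermediate_size=cfg.intermediate_size,
        num_hidden_layers=cfg.num_hidden_layers,
        num_attention_heads=cfg.num_attention_heads,
        num_key_value_heads=cfg.num_key_value_heads,
        max_position_embeddings=cfg.max_position_embeddings,
        rms_norm_eps=cfg.rms_norm_eps,
        rope_theta=cfg.rope_theta,
        num_local_experts=8,   # 8 experts, as described above
        num_experts_per_tok=2,
        sliding_window=None,   # TinyLlama uses full attention
    )
)

# Copy the shared (non-MoE) weights straight across.
moe.model.embed_tokens.load_state_dict(dense.model.embed_tokens.state_dict())
moe.model.norm.load_state_dict(dense.model.norm.state_dict())
moe.lm_head.load_state_dict(dense.lm_head.state_dict())

for moe_layer, dense_layer in zip(moe.model.layers, dense.model.layers):
    moe_layer.self_attn.load_state_dict(dense_layer.self_attn.state_dict())
    moe_layer.input_layernorm.load_state_dict(dense_layer.input_layernorm.state_dict())
    moe_layer.post_attention_layernorm.load_state_dict(
        dense_layer.post_attention_layernorm.state_dict()
    )
    # Every expert starts as an identical copy of the dense MLP
    # (Mixtral's w1/w3/w2 correspond to Llama's gate/up/down projections).
    for expert in moe_layer.block_sparse_moe.experts:
        expert.w1.load_state_dict(dense_layer.mlp.gate_proj.state_dict())
        expert.w3.load_state_dict(dense_layer.mlp.up_proj.state_dict())
        expert.w2.load_state_dict(dense_layer.mlp.down_proj.state_dict())
    # moe_layer.block_sparse_moe.gate keeps its random initialization.

moe.save_pretrained("tinyllama-8x1.1b-moe")  # hypothetical output path
```

The saved model can then be loaded back with a standard `AutoModelForCausalLM.from_pretrained(...)` call. Mixtral is just one convenient target architecture for this kind of expert-cloning merge; the actual checkpoint may have been assembled differently.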