Flux-Base-Optimized

flux-base-optimized is the base model for finetuning the flux-7b series of models.
It is a hierarchical SLERP merge of the following models:
- mistralai/Mistral-7B-v0.1 (Apache 2.0)
- teknium/OpenHermes-2.5-Mistral-7B (Apache 2.0)
- Intel/neural-chat-7b-v3-3 (Apache 2.0)
- meta-math/MetaMath-Mistral-7B (Apache 2.0)
- openchat/openchat-3.5-0106 (previously openchat/openchat-3.5-1210) (Apache 2.0)
Here's how we did the hierarchical SLERP merge:
```
[flux-base-optimized]
         ↑
         |
    [stage-1]-+-[openchat]
         ↑
         |
    [stage-0]-+-[meta-math]
         ↑
         |
[openhermes]-+-[neural-chat]
```
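The idea behind each pairwise merge can be sketched as follows. This is a minimal illustration of SLERP (spherical linear interpolation) applied to flattened weight tensors, not the exact tooling used for this model; the interpolation factor `t=0.5` and the helper names (`slerp`, the `stage_*` variables) are assumptions for illustration.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight tensors.

    Unlike plain linear interpolation, SLERP follows the great-circle
    arc between the two (normalized) weight vectors, which tends to
    preserve the magnitude characteristics of the merged weights.
    """
    v0f = v0.flatten().astype(np.float64)
    v1f = v1.flatten().astype(np.float64)
    u0 = v0f / np.linalg.norm(v0f)
    u1 = v1f / np.linalg.norm(v1f)
    dot = np.clip(np.dot(u0, u1), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        out = (1.0 - t) * v0f + t * v1f
    else:
        s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
        s1 = np.sin(t * theta) / np.sin(theta)
        out = s0 * v0f + s1 * v1f
    return out.reshape(v0.shape).astype(v0.dtype)

# Hierarchical merge order from the diagram above, applied per tensor
# (t=0.5 is an assumed, illustrative interpolation factor):
#   stage_0 = slerp(0.5, openhermes, neural_chat)
#   stage_1 = slerp(0.5, stage_0,    meta_math)
#   final   = slerp(0.5, stage_1,    openchat)
```

In practice a tool such as mergekit walks every parameter tensor of the checkpoints and applies an interpolation like this layer by layer; the snippet shows the math for a single tensor.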