# ErebusNeuralSamir-7B-dare-ties

ErebusNeuralSamir-7B-dare-ties is a merge of the following models, created with mergekit:
- samir-fama/SamirGPT-v1
- mlabonne/NeuralHermes-2.5-Mistral-7B
- KoboldAI/Mistral-7B-Erebus-v3
## 🧩 Configuration
```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # No parameters necessary for base model
  - model: samir-fama/SamirGPT-v1
    parameters:
      density: 0.53
      weight: 0.3
  - model: mlabonne/NeuralHermes-2.5-Mistral-7B
    parameters:
      density: 0.53
      weight: 0.3
  - model: KoboldAI/Mistral-7B-Erebus-v3
    parameters:
      density: 0.53
      weight: 0.4
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
```
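To give an intuition for what the `density` and `weight` parameters above control, here is a toy sketch of the DARE-TIES idea on a single parameter tensor: DARE randomly drops a fraction `1 - density` of each fine-tune's delta from the base model and rescales the survivors, and a TIES-style sign election then keeps only the delta entries that agree with the dominant sign before adding them back to the base weights. This is an illustration of the concept, not mergekit's actual implementation.

```python
import numpy as np

def dare_ties_merge(base, finetuned, densities, weights, rng):
    """Toy DARE-TIES merge of one parameter tensor.

    base: base-model tensor; finetuned: list of fine-tuned tensors;
    densities/weights: per-model values as in the YAML config above.
    Illustrative only -- not mergekit's implementation.
    """
    weighted = []
    for ft, density, w in zip(finetuned, densities, weights):
        delta = ft - base
        # DARE: drop each delta entry with probability (1 - density),
        # rescaling survivors by 1/density to preserve the expectation.
        mask = rng.random(delta.shape) < density
        weighted.append(w * mask * delta / density)
    # TIES-style sign election: keep only entries whose sign agrees
    # with the sign of the summed weighted delta at that position.
    elected_sign = np.sign(sum(weighted))
    merged_delta = sum(
        np.where(np.sign(d) == elected_sign, d, 0.0) for d in weighted
    )
    return base + merged_delta
```

With `density: 1.0` no entries are dropped and the result reduces to a sign-filtered weighted sum of the deltas; at `density: 0.53`, as in the config, roughly half of each model's delta entries survive on any given run.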