# merge
This is an experimental merge of pre-trained language models created using mergekit.
## Merge Details
### Merge Method
This model was merged using the SLERP merge method.
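SLERP (spherical linear interpolation) blends two weight tensors along the great-circle arc between them rather than along a straight line, which preserves more of their directional structure than plain averaging. The following is a minimal sketch of the idea only, not mergekit's actual implementation; the function name and plain-list "tensors" are illustrative:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight vectors.

    t=0 returns v0, t=1 returns v1; in between, the result moves
    along the arc between the two (normalized) directions.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # angle between the two vectors, clamped for numerical safety
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))
    theta = math.acos(dot)
    if theta < eps:
        # nearly parallel vectors: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

For orthogonal unit vectors, the halfway point `slerp(0.5, [1, 0], [0, 1])` stays on the unit circle, which a linear average would not.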
### Models Merged
The following models were included in the merge:

* OpenPipe/mistral-ft-optimized-1227
* DiscoResearch/DiscoLM_German_7b_v1
### Why these two models?
Because, to my knowledge, these are the two best models for German language generation.

DiscoLM German 7B is, as of this writing (01/21/2024), by far the best German model: it makes far fewer grammatical errors, and its German generally sounds good. However, it is fine-tuned on Mistral V0.2 or even V0.1.

Mistral FT Optimized 1227 handles German much better than Mistral 7B V0.2 and other German fine-tunes, which make grammar errors in almost every sentence. Still, even this model is a good step behind DiscoLM German 7B and produces less well-formed German sentences.

The underlying motive was therefore to combine these two models into an even better German model, especially for German roleplay.
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: OpenPipe/mistral-ft-optimized-1227
        layer_range: [0, 32]
      - model: DiscoResearch/DiscoLM_German_7b_v1
        layer_range: [0, 32]
merge_method: slerp
base_model: OpenPipe/mistral-ft-optimized-1227
parameters:
  t:
    - value: [0.5, 0.9]
dtype: bfloat16
```
These settings are taken from the model oshizo/japanese-e5-mistral-7b_slerp.
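The `t: value: [0.5, 0.9]` entry is a gradient rather than a single interpolation factor: the anchor values are spread across the 32 layers, so (assuming `t=0` keeps the base model and `t=1` the other model) early layers lean toward mistral-ft-optimized-1227 and later layers toward DiscoLM German 7B. A rough sketch of how such a gradient could be expanded into one `t` per layer; this is an assumption about the expansion scheme, not mergekit's code:

```python
def expand_gradient(anchors, num_layers):
    """Linearly interpolate gradient anchor values across layers.

    With anchors [0.5, 0.9] and 32 layers, layer 0 gets t=0.5,
    layer 31 gets t=0.9, and layers in between ramp linearly.
    """
    n = len(anchors)
    ts = []
    for i in range(num_layers):
        # position of this layer along the anchor list, in [0, n-1]
        pos = i / (num_layers - 1) * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        ts.append(anchors[lo] * (1 - frac) + anchors[hi] * frac)
    return ts

ts = expand_gradient([0.5, 0.9], 32)
```

Here `ts[0]` is 0.5 and `ts[31]` is 0.9, so no layer is taken purely from either parent, which matches the intent of blending both models throughout.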