# 2PRYMMAL-Yi1.5-6B-SLERP
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the SLERP merge method.
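SLERP interpolates along the great-circle arc between two weight vectors instead of the straight line used by plain linear averaging, which preserves the norm of the blended tensors more faithfully. The snippet below is a minimal NumPy sketch of that idea under simplifying assumptions (flatten, interpolate, reshape); it is illustrative only and is not mergekit's actual implementation.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors (illustrative sketch)."""
    v0_flat = v0.ravel()
    v1_flat = v1.ravel()
    # Normalize copies to measure the angle between the two weight vectors.
    v0_n = v0_flat / (np.linalg.norm(v0_flat) + eps)
    v1_n = v1_flat / (np.linalg.norm(v1_flat) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    omega = np.arccos(dot)
    # Nearly parallel vectors: fall back to plain linear interpolation.
    if abs(np.sin(omega)) < eps:
        return ((1.0 - t) * v0_flat + t * v1_flat).reshape(v0.shape)
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return (s0 * v0_flat + s1 * v1_flat).reshape(v0.shape)
```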
### Models Merged

The following models were included in the merge:

* [01-ai/Yi-1.5-6B-Chat](https://huggingface.co/01-ai/Yi-1.5-6B-Chat)
* [01-ai/Yi-1.5-6B](https://huggingface.co/01-ai/Yi-1.5-6B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: 01-ai/Yi-1.5-6B-Chat
        layer_range: [0, 32]
      - model: 01-ai/Yi-1.5-6B
        layer_range: [0, 32]
merge_method: slerp
base_model: 01-ai/Yi-1.5-6B-Chat
parameters:
  t:
    - filter: self_attn
      value: [0, 0.25, 0.5, 0.75, 1]
    - filter: mlp
      value: [1, 0.75, 0.5, 0.25, 0]
    - value: 0.5
dtype: bfloat16
```
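The `t` values define layer-wise interpolation schedules: self-attention tensors ramp from 0 to 1 across the layer stack, MLP tensors ramp the opposite way, and all other tensors use a constant factor of 0.5. The merge can be reproduced by saving this configuration to a file and running mergekit's `mergekit-yaml` command on it. Once merged, the model loads like any other causal LM; the snippet below is a hedged example where the repository id `PRYMMAL-Yi1.5-6B-SLERP` is an assumed placeholder and should be replaced with the actual Hub path of this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; substitute the published Hub path of the merged model.
model_id = "PRYMMAL-Yi1.5-6B-SLERP"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "Explain spherical linear interpolation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```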
## License

This model is released under the Apache 2.0 License. A copy of the license is available at [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0).