
# T3Q-Merge-Mistral7B

T3Q-Merge-Mistral7B is a merge of the following models using mergekit:

* liminerity/M7-7b
* yam-peleg/Experiment26-7B

Model Developers: Chihoon Lee (chlee10), T3Q

  
```yaml
slices:
  - sources:
      - model: liminerity/M7-7b
        layer_range: [0, 32]
      - model: yam-peleg/Experiment26-7B
        layer_range: [0, 32]

merge_method: slerp
base_model: liminerity/M7-7b

parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors

dtype: bfloat16
```
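
To reproduce the merge, the configuration above can be saved to a file (assumed here to be `config.yaml`) and passed to mergekit. The sketch below uses mergekit's Python entry points; the output path and options are illustrative assumptions, not taken from this card.

```python
# Sketch: reproduce the merge with mergekit.
# Paths and options are assumptions, not stated in this card.
# Equivalent CLI: mergekit-yaml config.yaml ./T3Q-Merge-Mistral7B
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (assumed saved as config.yaml).
with open("config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./T3Q-Merge-Mistral7B",    # assumed output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```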
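
A minimal inference sketch with the Hugging Face Transformers library. The repository id below is an assumption (this card does not state it); replace it with the actual model id.

```python
# Minimal inference sketch; the repo id is an assumption, not stated in this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "chlee10/T3Q-Merge-Mistral7B"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "Explain what model merging is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```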