---
library_name: transformers
license: apache-2.0
base_model:
  - nbeerbower/Flammen-Mahou-mistral-7B-v2
  - nbeerbower/Mahou-mistral-slerp-7B
tags:
  - mergekit
  - merge
---


# Mahou-1.3-M1-mistral-7B

Mahou is designed to provide short messages in a conversational context. It is capable of casual conversation and character roleplay.

## Chat Format

This model has been trained to use the ChatML format. Note the additional tokens in `tokenizer_config.json`.

```
<|im_start|>system
{{system}}<|im_end|>
<|im_start|>{{char}}
{{message}}<|im_end|>
<|im_start|>{{user}}
{{message}}<|im_end|>
```
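For reference, a minimal prompting sketch with the Hugging Face `transformers` library is below. The repo id is assumed from the card title, and the character name (`Aya`), user name (`Kai`), system prompt, and sampling settings are placeholders rather than values from this card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id, taken from the card title.
model_id = "nbeerbower/Mahou-1.3-M1-mistral-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Build the ChatML prompt by hand, substituting concrete names for {{char}} and {{user}}.
prompt = (
    "<|im_start|>system\n"
    "You are Aya, a student at a magician academy.<|im_end|>\n"
    "<|im_start|>Kai\n"
    "hey, how was class today?<|im_end|>\n"
    "<|im_start|>Aya\n"  # leave the character's turn open for the model to complete
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=80, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

Since the model targets short, single-message replies, pairing this with the newline stopping string described under the settings below keeps each response to one line.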

## Roleplay Format

- Speech without quotes.
- Actions in *asterisks*.
- Example: `*leans against wall cooly* so like, i just casted a super strong spell at magician academy today, not gonna lie, felt badass.`

## SillyTavern Settings

  1. Use ChatML for the Context Template.
  2. Enable Instruct Mode.
  3. Use the Mahou preset.
  4. Recommended: add newline as a stopping string: `["\n"]` (see the sketch after this list for an equivalent outside SillyTavern).
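Outside SillyTavern, the same newline cutoff can be approximated with a custom stopping criterion in `transformers`. This is an illustrative sketch, not part of the original card; it reuses the `tokenizer`, `model`, and `inputs` from the Chat Format example above.

```python
from transformers import StoppingCriteria, StoppingCriteriaList

class StopOnNewline(StoppingCriteria):
    """Stop generation once the newly generated text contains a newline."""

    def __init__(self, tokenizer, prompt_length):
        self.tokenizer = tokenizer
        self.prompt_length = prompt_length

    def __call__(self, input_ids, scores, **kwargs):
        # Decode only the tokens generated after the prompt and check for a newline.
        new_text = self.tokenizer.decode(input_ids[0][self.prompt_length:])
        return "\n" in new_text

stopping = StoppingCriteriaList([StopOnNewline(tokenizer, inputs["input_ids"].shape[-1])])
outputs = model.generate(**inputs, max_new_tokens=80, stopping_criteria=stopping)
```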

## Method

The following YAML configuration was used to produce this model using mergekit:

```yaml
models:
  - model: nbeerbower/Flammen-Mahou-mistral-7B-v2
    layer_range: [0, 32]
  - model: nbeerbower/Mahou-mistral-slerp-7B
    layer_range: [0, 32]
merge_method: slerp
base_model: nbeerbower/Mahou-mistral-slerp-7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
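For intuition, the lists under `t` are layer-wise gradients: mergekit interpolates between the listed anchor values across the model's depth, with separate schedules for attention (`self_attn`) and MLP weights and `0.5` everywhere else. The sketch below shows the spherical linear interpolation (SLERP) the merge method is named after; it is an illustrative NumPy version, not mergekit's actual implementation.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the arc
    between the two directions rather than a straight line.
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    if 1.0 - abs(dot) < eps:
        # Nearly (anti-)parallel directions: fall back to linear interpolation.
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1

# Toy example: blend two "weight" vectors with t=0.5, the default factor above.
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
print(slerp(0.5, a, b))
```

Compared with a plain weighted average, SLERP keeps intermediate points on the arc between the two weight directions instead of cutting across it, which is the usual motivation for using it when merging model weights.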