Meta-Llama-3-2x8B-Instruct-MoE-64k-ctx / mergekit_moe_config.yml
base_model: /content/llama-3-8b-instruct
gate_mode: random
experts:
  - source_model: /content/llama-3-8b-instruct
    positive_prompts: [""]
  - source_model: /content/llama-3-8b-instruct
    positive_prompts: [""]
tokenizer_source: model:/content/llama-3-8b-instruct
dtype: bfloat16
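
The config above builds a two-expert mixture from two copies of the same Llama-3-8B-Instruct checkpoint, with randomly initialized router gates (gate_mode: random), which is why the positive_prompts lists can stay empty. Below is a minimal Python sketch for sanity-checking such a config before handing it to mergekit; the file name and field names come from the config above, while the validation logic itself is illustrative and not part of mergekit.

# Illustrative sanity check for the MoE config above (not part of mergekit).
import yaml  # pip install pyyaml

with open("mergekit_moe_config.yml") as f:
    cfg = yaml.safe_load(f)

# Fields the config above relies on.
for key in ("base_model", "gate_mode", "experts", "dtype"):
    assert key in cfg, f"missing field: {key}"

# Every expert needs a source_model; with gate_mode: random the router is
# initialized randomly, so empty positive_prompts are expected rather than an error.
for i, expert in enumerate(cfg["experts"]):
    assert "source_model" in expert, f"expert {i} has no source_model"

print(f"{len(cfg['experts'])} experts, gate_mode={cfg['gate_mode']}, dtype={cfg['dtype']}")

Assuming mergekit is installed, the config is then typically passed to the mergekit-moe command along with an output directory; note that with random gating the resulting 2x8B model generally needs further fine-tuning before the routing is useful.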