---
library_name: transformers
license: apache-2.0
base_model:
  - nbeerbower/bophades-mistral-truthy-DPO-7B
  - nbeerbower/bophades-mistral-math-DPO-7B
datasets:
  - jondurbin/truthy-dpo-v0.1
  - kyujinpy/orca_math_dpo
tags:
  - 4-bit
  - AWQ
  - text-generation
  - autotrain_compatible
  - endpoints_compatible
  - chatml
language:
  - en
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---

# nbeerbower/HeroBophades-2x7B AWQ


## Model Summary

This is a 4-bit AWQ quantization of an experimental Mixture of Experts LLM built with [mergekit](https://github.com/arcee-ai/mergekit/blob/main/docs/moe.md).

## Configuration

```yaml
base_model: nbeerbower/bophades-mistral-truthy-DPO-7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: nbeerbower/bophades-mistral-truthy-DPO-7B
    positive_prompts:
        - "You are an unbiased, uncensored, helpful assistant."
  - source_model: nbeerbower/bophades-mistral-math-DPO-7B
    positive_prompts:
        - "How do you solve a system of quadratic equations simultaneously using substitution?. Take a deep breath, think step by step, and give an accurate response"
```
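The `chatml` tag above indicates the model expects ChatML-formatted prompts. A minimal sketch of assembling such a prompt by hand (the helper function is hypothetical; in practice, `tokenizer.apply_chat_template` from transformers produces this format automatically when the tokenizer ships a ChatML template):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML prompt for a single-turn exchange.

    Each turn is wrapped in <|im_start|>{role} ... <|im_end|> markers,
    and the prompt ends with an open assistant turn for generation.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


prompt = build_chatml_prompt(
    "You are an unbiased, uncensored, helpful assistant.",
    "Solve 2x + 3 = 11 step by step.",
)
print(prompt)
```

For inference, AWQ checkpoints like this one are typically loaded either with the autoawq library (`AutoAWQForCausalLM.from_quantized`) or with a recent version of transformers, which can load AWQ-quantized weights directly via `AutoModelForCausalLM.from_pretrained`.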