---
base_model:
  - 152334H/miqu-1-70b-sf
language:
  - en
  - de
  - fr
  - es
  - it
library_name: transformers
tags:
  - mergekit
  - merge
---

# miqu-1-120b-GGUF


This is a 120b frankenmerge of miqu-1-70b, created by interleaving layers of miqu-1-70b-sf with itself using mergekit.

Inspired by Venus-120b-v1.2, MegaDolphin-120b, and goliath-120b.

Thanks for the support, CopilotKit - the open-source platform for building in-app AI Copilots into any product, with any LLM. Check out their GitHub.

Thanks for the EXL2 and GGUF quants, Lone Striker!

## Prompt template: Mistral

```
<s>[INST] {prompt} [/INST]
```
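As a usage illustration (not from the original card), here is a minimal sketch of applying this template with llama-cpp-python; the GGUF filename and sampling settings are hypothetical placeholders:

```python
# Minimal sketch, assuming llama-cpp-python is installed and a quantized
# GGUF file from this repo has been downloaded locally. The filename and
# sampling settings below are illustrative placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="miqu-1-120b.Q4_K_M.gguf",  # hypothetical local filename
    n_ctx=32764,  # matches the model's max context listed below
)

# Wrap the user prompt in the Mistral instruct template shown above.
prompt = "<s>[INST] Explain what a frankenmerge is in one paragraph. [/INST]"
output = llm(prompt, max_tokens=256, temperature=0.7)
print(output["choices"][0]["text"])
```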

See also: 🐺🐦‍⬛ LLM Prompt Format Comparison/Test: Mixtral 8x7B Instruct with 17 different instruct templates : LocalLLaMA

## Model Details

- Max Context: 32764 tokens (kept the weird number from the original/base model)
- Layers: 140

## Merge Details

### Merge Method

This model was merged using the passthrough merge method, which stacks the selected layer ranges directly rather than averaging or interpolating weights.

### Models Merged

The following models were included in the merge:

- 152334H/miqu-1-70b-sf

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 20]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [10, 30]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [20, 40]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [30, 50]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [40, 60]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [50, 70]
    model: 152334H/miqu-1-70b-sf
- sources:
  - layer_range: [60, 80]
    model: 152334H/miqu-1-70b-sf
```
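As a sanity check (mine, not the original author's), the 140-layer total under Model Details follows directly from this configuration: seven slices of 20 layers each, each overlapping its neighbor by 10 layers, so the base model's middle layers appear twice in the merged stack:

```python
# Sanity check: the seven 20-layer slices from the YAML config above
# sum to the 140 layers listed under Model Details.
slices = [(0, 20), (10, 30), (20, 40), (30, 50), (40, 60), (50, 70), (60, 80)]

total = sum(end - start for start, end in slices)
print(total)  # 140

# Layers 0-9 and 70-79 of the base model appear once; layers 10-69 each
# appear twice, giving 20 * 1 + 60 * 2 = 140 layers in the merged stack.
```

If you want to reproduce a merge like this yourself, mergekit provides a `mergekit-yaml` entry point that takes a config file and an output directory.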

## Credits & Special Thanks

### Support

- My Ko-fi page if you'd like to tip me to say thanks or request specific models to be tested or merged with priority. Also consider supporting your favorite model creators, quantizers, or frontend/backend devs if you can afford to do so. They deserve it!

DISCLAIMER: THIS IS BASED ON A LEAKED ASSET AND HAS NO LICENSE ASSOCIATED WITH IT. USE AT YOUR OWN RISK.