---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- nothingiisreal/MN-12B-Celeste-V1.9
- elinas/Chronos-Gold-12B-1.0
---
![](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)
# QuantFactory/MN-12B-Chronos-Gold-Celeste-v1-GGUF
This is a quantized version of [ThomasComics/MN-12B-Chronos-Gold-Celeste-v1](https://huggingface.co/ThomasComics/MN-12B-Chronos-Gold-Celeste-v1), created with llama.cpp.
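
As a quick start, the GGUF files can be loaded with `llama-cpp-python`. The sketch below is a minimal, assumed usage example: the quant filename, context size, and sampling settings are placeholders, not values from this card.

```python
# Minimal sketch: running one of the GGUF quants with llama-cpp-python.
# The model_path below is an assumption -- substitute the quant file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="MN-12B-Chronos-Gold-Celeste-v1.Q4_K_M.gguf",  # hypothetical quant filename
    n_ctx=8192,  # context window; adjust to available memory
)

out = llm(
    "Write a short scene set in a rain-soaked city.",
    max_tokens=256,
    temperature=0.8,
)
print(out["choices"][0]["text"])
```
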
# Original Model Card
# MN-12B-Chronos-Gold-Celeste-v1
MN-12B-Chronos-Gold-Celeste-v1 is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [nothingiisreal/MN-12B-Celeste-V1.9](https://huggingface.co/nothingiisreal/MN-12B-Celeste-V1.9)
* [elinas/Chronos-Gold-12B-1.0](https://huggingface.co/elinas/Chronos-Gold-12B-1.0)
## 🧩 Configuration
```yaml
base_model: elinas/Chronos-Gold-12B-1.0
parameters:
  int8_mask: true
  rescale: true
  normalize: false
merge_method: della
dtype: bfloat16
models:
  - model: nothingiisreal/MN-12B-Celeste-V1.9
    parameters:
      density: [0.4, 0.5, 0.6, 0.4, 0.6, 0.5, 0.4]
      epsilon: [0.15, 0.15, 0.25, 0.15, 0.15]
      lambda: 0.85
      weight: [0.6, 0.5, 0.4, 0.6, 0.4, 0.5, 0.6]
  - model: elinas/Chronos-Gold-12B-1.0
    parameters:
      density: [0.45, 0.55, 0.45, 0.55, 0.45]
      epsilon: [0.1, 0.1, 0.25, 0.1, 0.1]
      lambda: 0.85
      weight: [0.55, 0.45, 0.55, 0.45, 0.55]
```
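
To reproduce the merge locally, the configuration above can be passed to mergekit's `mergekit-yaml` entry point. A minimal sketch, assuming mergekit is installed and the YAML is saved as `della-config.yaml` (the config filename and output directory are placeholders):

```python
# Minimal sketch: invoking mergekit on the configuration shown above.
# Assumes `pip install mergekit` and that the YAML block was saved to della-config.yaml.
import subprocess

subprocess.run(
    [
        "mergekit-yaml",                      # mergekit's config-driven merge command
        "della-config.yaml",                  # the DELLA configuration from this card
        "./MN-12B-Chronos-Gold-Celeste-v1",   # hypothetical output directory for merged weights
    ],
    check=True,
)
```
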