---
base_model:
- bunnycore/LLama-3.1-Hyper-Stock
- bunnycore/LLama-3.1-8B-HyperNova-abliteration
- bunnycore/LLama-3.1-8b-Ultra-Max-Pro
- bunnycore/Llama-3.1-8B-OmniMatrix
- leafspark/Llama-3.1-8B-MultiReflection-Instruct
library_name: transformers
tags:
- mergekit
- merge
---
[![QuantFactory Banner](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)](https://hf.co/QuantFactory)
# QuantFactory/LLama-3.1-8B-Matrix-GGUF
This is a quantized version of [bunnycore/LLama-3.1-8B-Matrix](https://huggingface.co/bunnycore/LLama-3.1-8B-Matrix), created using llama.cpp.
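A minimal usage sketch, not part of the original card: running one of the GGUF quants with [llama-cpp-python](https://github.com/abetlen/llama-cpp-python). The `.gguf` filename below is a placeholder; check the repository's file list for the quant you actually want.

```python
# Minimal sketch: download a GGUF quant from the Hub and run a completion.
# Assumes `pip install llama-cpp-python huggingface_hub`.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="QuantFactory/LLama-3.1-8B-Matrix-GGUF",
    filename="LLama-3.1-8B-Matrix.Q4_K_M.gguf",  # hypothetical filename; check the repo
)

llm = Llama(model_path=model_path, n_ctx=4096)
out = llm("Q: What is a model merge? A:", max_tokens=64)
print(out["choices"][0]["text"])
```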
# Original Model Card
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [bunnycore/LLama-3.1-Hyper-Stock](https://huggingface.co/bunnycore/LLama-3.1-Hyper-Stock) as the base.
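For reference (summarizing the linked paper, not part of the original card): Model Stock takes a layer-wise average of the fine-tuned checkpoints and interpolates it toward the base weights, with a ratio $t$ derived from the angle $\theta$ between the fine-tuned weights:

$$
w_{\text{merged}} = t \, w_{\text{avg}} + (1 - t)\, w_{\text{base}},
\qquad
t = \frac{N \cos\theta}{1 + (N - 1)\cos\theta}
$$

where $N$ is the number of fine-tuned models (here $N = 4$) and $w_{\text{avg}}$ is their per-layer weight average.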
### Models Merged
The following models were included in the merge:
* [bunnycore/LLama-3.1-8B-HyperNova-abliteration](https://huggingface.co/bunnycore/LLama-3.1-8B-HyperNova-abliteration)
* [bunnycore/LLama-3.1-8b-Ultra-Max-Pro](https://huggingface.co/bunnycore/LLama-3.1-8b-Ultra-Max-Pro)
* [bunnycore/Llama-3.1-8B-OmniMatrix](https://huggingface.co/bunnycore/Llama-3.1-8B-OmniMatrix)
* [leafspark/Llama-3.1-8B-MultiReflection-Instruct](https://huggingface.co/leafspark/Llama-3.1-8B-MultiReflection-Instruct)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: leafspark/Llama-3.1-8B-MultiReflection-Instruct
- model: bunnycore/Llama-3.1-8B-OmniMatrix
- model: bunnycore/LLama-3.1-8B-HyperNova-abliteration
- model: bunnycore/LLama-3.1-8b-Ultra-Max-Pro
merge_method: model_stock
base_model: bunnycore/LLama-3.1-Hyper-Stock
dtype: bfloat16
```
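A usage sketch, not part of the original card: loading the full-precision merged model with transformers (the repo id comes from the card above; `device_map="auto"` assumes accelerate is installed).

```python
# Minimal sketch: load and prompt the merged model with transformers.
# Assumes `pip install transformers accelerate` and enough memory for an
# 8B model in bfloat16 (roughly 16 GB of weights).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bunnycore/LLama-3.1-8B-Matrix"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",
)

prompt = "Briefly explain what a model merge is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```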