---
base_model: hibana2077/Pioneer-2x7B
inference: false
library_name: transformers
merged_models:
- HuggingFaceH4/mistral-7b-grok
- OpenPipe/mistral-ft-optimized-1218
pipeline_tag: text-generation
quantized_by: Suparious
tags:
- mergekit
- merge
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
---
# hibana2077/Pioneer-2x7B AWQ

- Model creator: [hibana2077](https://huggingface.co/hibana2077)
- Original model: [Pioneer-2x7B](https://huggingface.co/hibana2077/Pioneer-2x7B)

## Model Summary

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

This model was merged using the SLERP merge method.

The following models were included in the merge:
* [HuggingFaceH4/mistral-7b-grok](https://huggingface.co/HuggingFaceH4/mistral-7b-grok)
* [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218)
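
To illustrate the SLERP merge method mentioned above: rather than averaging weights linearly, SLERP interpolates along the arc between two parameter tensors, preserving their norm geometry. The sketch below is a minimal, self-contained illustration of the math, not mergekit's actual implementation; the function name `slerp` and the small example vectors are illustrative only.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t is the interpolation factor in [0, 1]; v0 and v1 are
    same-shaped arrays (e.g. flattened model weights).
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Angle between the two (normalized) weight vectors.
    dot = np.clip(
        np.dot(v0f / np.linalg.norm(v0f), v1f / np.linalg.norm(v1f)),
        -1.0, 1.0,
    )
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    # Weighted combination along the great-circle arc.
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

# Illustrative check: the midpoint of two orthogonal unit vectors
# stays on the unit sphere, unlike a plain average.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```

In an actual merge, mergekit applies this per-tensor across the two source checkpoints, typically with a layer-dependent interpolation schedule rather than a single global `t`.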