---
license: cc-by-nc-4.0
---
Don't mind these for the moment; they're just tests, and I still need to finetune them for RP.

WARNING: This model specifically needs an EOS token that I completely forgot to add to the JSON files, and I still need to check which ones were the right ones throughout the mix. Please don't use it as-is if you really want to review it.
```
slices:
  - sources:
      - model: "/content/drive/MyDrive/CC-v1.1-7B-bf16"
        layer_range: [0, 24]
  - sources:
      - model: "/content/drive/MyDrive/Zephyr-7B"
        layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
================================================
slices:
  - sources:
      - model: "/content/drive/MyDrive/Mistral-11B-CC-Zephyr"
        layer_range: [0, 48]
      - model: Undi95/Mistral-11B-OpenOrcaPlatypus
        layer_range: [0, 48]
merge_method: slerp
base_model: "/content/drive/MyDrive/Mistral-11B-CC-Zephyr"
parameters:
  t:
    - value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
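For intuition on what `merge_method: slerp` with `t: 0.5` does, here is a minimal NumPy sketch of spherical linear interpolation between two flattened weight tensors. This is only an illustration of the idea; mergekit's actual implementation handles per-tensor shapes, dtype casting, and degenerate cases differently.

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors.

    t=0 returns a, t=1 returns b, t=0.5 blends both equally along the arc
    between them (the value used in the config above).
    """
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1 - t) * a + t * b
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * a \
         + (np.sin(t * theta) / sin_theta) * b

# Toy example: two orthogonal "weight" vectors merged at t = 0.5
w_a = np.array([1.0, 0.0])
w_b = np.array([0.0, 1.0])
merged = slerp(0.5, w_a, w_b)  # halfway along the arc between w_a and w_b
```

Compared to a plain average, slerp preserves the norm-direction geometry of the two weight sets, which is why it is a popular choice for merging similar checkpoints.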
hf-causal-experimental (pretrained=/content/drive/MyDrive/Mistral-11B-Test), limit: None, provide_description: False, num_fewshot: 0, batch_size: 4
| Task |Version| Metric |Value | |Stderr|
|-------------|------:|--------|-----:|---|-----:|
|arc_challenge| 0|acc |0.5623|± |0.0145|
| | |acc_norm|0.5794|± |0.0144|
|arc_easy | 0|acc |0.8354|± |0.0076|
| | |acc_norm|0.8165|± |0.0079|
|hellaswag | 0|acc |0.6389|± |0.0048|
| | |acc_norm|0.8236|± |0.0038|
|piqa | 0|acc |0.8139|± |0.0091|
| | |acc_norm|0.8264|± |0.0088|
|truthfulqa_mc| 1|mc1 |0.3978|± |0.0171|
| | |mc2 |0.5607|± |0.0155|
|winogrande | 0|acc |0.7451|± |0.0122|
![image/png](https://cdn-uploads.huggingface.co/production/uploads/63ab1241ad514ca8d1430003/8f-rAHIfN1ZuW4HqkzYz-.png)