---
language:
- en
library_name: transformers
tags:
- gpt
- llm
- stablelm
inference: true
license: cc-by-sa-4.0
---
This model is a task_arithmetic merge of pansophic/rocket-3B, jondurbin/airoboros-3b-3p0, and Aryanne/Astrohermes-3B, as shown in the YAML below (also available as Astrorocketboros.yml).

I used ayoubkirouane/StableLM-3B as the base model.

I'm not sure whether all of the .json files are correct, but the model seems to work at the moment.
```yaml
merge_method: task_arithmetic
base_model: ayoubkirouane/StableLM-3B
models:
  - model: ayoubkirouane/StableLM-3B
  - model: pansophic/rocket-3B
    parameters:
      weight: 1.0
  - model: Aryanne/Astrohermes-3B
    parameters:
      weight: 0.22
  - model: jondurbin/airoboros-3b-3p0
    parameters:
      weight: 0.1
dtype: float16
```
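If you want to reproduce the merge yourself, a minimal sketch is shown below. It simply calls mergekit's `mergekit-yaml` command-line entry point from Python; the output directory name is a hypothetical placeholder, and it assumes Astrorocketboros.yml is in your working directory.

```python
# Minimal sketch: reproduce the merge by invoking mergekit's CLI.
# Assumes mergekit is installed (pip install mergekit) and Astrorocketboros.yml
# is present in the current directory. "./merged-model" is a placeholder path.
import subprocess

subprocess.run(
    ["mergekit-yaml", "Astrorocketboros.yml", "./merged-model"],
    check=True,
)
```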
I recommend using the ChatML prompt format (a ChatML usage sketch in Python follows the example below). Alpaca also seems to work, but you need to write something before the instruction, for example:
```
You are an Assistant
### Instruction:
write a poem.
### Response:
Amidst the tranquil hues of twilight's hue,
As shadows stretch and dance with whispered dew,
The trees weave tales of ages past untold,
```
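For ChatML, the sketch below builds the prompt by hand with `transformers` so the format is explicit. The repo id is a hypothetical placeholder (substitute this repository's id), and `trust_remote_code=True` is an assumption that may only be needed for StableLM-based models on older transformers versions.

```python
# Minimal sketch of ChatML prompting with transformers.
# "your-username/your-merged-model" is a hypothetical placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/your-merged-model"  # replace with this model's repo id
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.float16, trust_remote_code=True
).to(device)

# Build the ChatML prompt by hand so the expected format is explicit.
prompt = (
    "<|im_start|>system\nYou are an Assistant<|im_end|>\n"
    "<|im_start|>user\nwrite a poem.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```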
GGUF Quants: not yet.