🌟 Buying me coffee is a direct way to show support for this project.
Mixnueza-6x32M-MoE is a Mixture of Experts (MoE) model made with the following models using LazyMergekit:
```python
from transformers import pipeline

# Load the model with the 🤗 Transformers text-generation pipeline
generate = pipeline("text-generation", "Isotonic/Mixnueza-6x32M-MoE")

messages = [
    {
        "role": "system",
        "content": "You are a helpful assistant who answers the user's questions with details and curiosity.",
    },
    {
        "role": "user",
        "content": "What are some potential applications for quantum computing?",
    },
]

# Format the conversation with the model's chat template, leaving the assistant turn open
prompt = generate.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

output = generate(
    prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.65,
    top_k=35,
    top_p=0.55,
    repetition_penalty=1.176,
)

print(output[0]["generated_text"])
```
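
The merge itself happens outside of this usage snippet. As a rough illustration of the LazyMergekit step mentioned above, the sketch below writes a minimal mergekit MoE configuration and notes the command that builds the merged model. The base model, expert names, routing prompts, and output path are placeholders for illustration only, not the actual models behind Mixnueza-6x32M-MoE.

```python
# Illustrative sketch only: the base model, expert names, and prompts below are
# placeholders, not the actual models merged into Mixnueza-6x32M-MoE.
import pathlib

moe_config = """\
base_model: placeholder/base-32M
gate_mode: hidden        # route tokens by hidden-state similarity to the positive prompts
dtype: bfloat16
experts:
  - source_model: placeholder/expert-chat-32M
    positive_prompts:
      - "chat"
      - "assistant"
  - source_model: placeholder/expert-instruct-32M
    positive_prompts:
      - "follow the instructions"
"""

# Write the config, then build the merged model with mergekit's MoE entry point:
#   mergekit-moe moe-config.yaml ./merged-model
pathlib.Path("moe-config.yaml").write_text(moe_config)
```

LazyMergekit wraps this same flow in a Colab notebook, so in practice the configuration is usually filled in there rather than written by hand.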