Model Card for Pixie Zehir Nano


Introducing Pixie Zehir Nano.

Excelling in writing.

A fine-tune of Llama 3.2 3B on HQ DATA™ from Pixie Zehir.

Model Details

  • Developed by: Maani x BLNKBLK
  • Language(s) (NLP): English, German, French, Italian, Spanish, Portuguese, Korean, Japanese, Hindi, Thai
  • License: Apache 2.0
  • Finetuned from model: meta-llama/Llama-3.2-3B-Instruct

Agreements

This model is created for research purposes; it can and will hallucinate. Use with caution.

Usage

# Install the required version first: pip install transformers==4.45.0
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="Maani/PixieZehirNano",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Write a haiku."},
]
# Render the chat messages into the model's prompt format
# (adds the special header/EOT tokens and the assistant turn opener).
prompt = pipe.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
res = pipe(
    prompt,
    max_new_tokens=256,
    temperature=0.7,
    do_sample=True,
)
print(res[0]["generated_text"])
# Sample output:
Cutting Knowledge Date: December 2023
Today Date: 30 Sep 2024

<|eot_id|><|start_header_id|>user<|end_header_id|>
Write a haiku.<|eot_id|>

<|start_header_id|>assistant<|end_header_id|>
Crescent moon's soft glow, A path unwinds beneath, Dreams and reality blur
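The special tokens in the sample above come from the Llama 3.2 chat template that apply_chat_template renders. As a rough sketch (the helper below and its default dates are illustrative reconstructions from the sample output, not the tokenizer's actual template code), the prompt string has this shape:

```python
# Sketch: reconstruct the prompt layout apply_chat_template produces for
# Llama 3.2, based on the sample output above. The system preamble
# (knowledge-cutoff and date lines) is filled in by the real template.
def build_llama32_prompt(user_message,
                         knowledge_date="December 2023",
                         today_date="30 Sep 2024"):
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"Cutting Knowledge Date: {knowledge_date}\n"
        f"Today Date: {today_date}\n\n"
        "<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_message}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama32_prompt("Write a haiku.")
print(prompt)
```

Because add_generation_prompt=True appends the opening assistant header, the model's generation continues directly as the assistant's reply.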

Thanks to mradermacher, you can find GGUF-quantized versions of the earlier 1.8B Zehir Nano at: mradermacher/PixieZehirNano-GGUF
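The usage example samples with do_sample=True and temperature=0.7, which sharpens the token distribution relative to the default temperature of 1.0. A minimal illustration of what temperature scaling does to logits (a standalone sketch, not code from this model card):

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide logits by the temperature before softmax: T < 1 sharpens
    # the distribution (favoring the top token), T > 1 flattens it.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]           # toy logits for three tokens
p_sharp = softmax_with_temperature(logits, 0.7)  # as in the example above
p_plain = softmax_with_temperature(logits, 1.0)
# At T=0.7 the most likely token receives more probability mass.
print(p_sharp, p_plain)
```

Lower temperatures make the haiku output more deterministic; higher values increase diversity at the cost of coherence.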

Model size: 3.21B params (BF16, Safetensors)
