---
language:
  - en
  - de
  - fr
  - it
  - pt
  - hi
  - es
  - th
  - ja
  - ko
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
---

# Model Card for PixieZehirNano


Introducing Pixie Zehir Nano.

Excelling in writing.

A fine-tune of Llama 3.2 3B on HQ DATA™ from Pixie Zehir.

## Model Details

- **Developed by:** Maani x BLNKBLK
- **Language(s) (NLP):** English, German, French, Italian, Spanish, Portuguese, Korean, Japanese, Hindi, Thai
- **License:** Apache 2.0
- **Finetuned from model:** meta-llama/Llama-3.2-3B-Instruct

## Agreements

This model was created for research purposes; it can and will hallucinate, so use it with caution.

## Usage

```shell
pip install transformers==4.45.0
```

```python
import torch
from transformers import pipeline

# Load the model with bfloat16 weights, placing layers on devices automatically.
pipe = pipeline(
    "text-generation",
    model="Maani/PixieZehirNano",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Write a haiku."},
]

# Render the chat messages into the model's prompt format.
prompt = pipe.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)

res = pipe(
    prompt,
    max_new_tokens=256,
    temperature=0.7,
    do_sample=True,
)
print(res[0]["generated_text"])
```
Sample output:

```
Cutting Knowledge Date: December 2023
Today Date: 30 Sep 2024

<|eot_id|><|start_header_id|>user<|end_header_id|>
Write a haiku.<|eot_id|>

<|start_header_id|>assistant<|end_header_id|>
Crescent moon's soft glow, A path unwinds beneath, Dreams and reality blur
```
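The special tokens in the sample output come from the Llama 3 chat template that `apply_chat_template` renders. As a minimal, dependency-free sketch (the helper `build_llama3_prompt` is hypothetical, not part of the model's API), the prompt string is assembled roughly like this:

```python
# Hypothetical sketch of how a Llama 3-style chat prompt is assembled.
# The real formatting is done by pipe.tokenizer.apply_chat_template(...).
def build_llama3_prompt(messages, add_generation_prompt=True):
    parts = ["<|begin_of_text|>"]
    for m in messages:
        # Each turn is wrapped in header tokens and terminated with <|eot_id|>.
        parts.append(
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n{m['content']}<|eot_id|>"
        )
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

print(build_llama3_prompt([{"role": "user", "content": "Write a haiku."}]))
```

This is only an illustration of the prompt structure; in practice, always use the tokenizer's own template so the formatting matches the model's training data.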

Thanks to mradermacher, you can find GGUF-quantized versions of the earlier 1.8B Zehir Nano at mradermacher/PixieZehirNano-GGUF.