
Sera Llama v0.1

This is a finetune of Llama 3.1-8B on a custom tool-call dataset, intended to serve as the agent for a personal assistant and network admin. It generates structured outputs containing a tool call based on the user's input, without the need for a lengthy system message.

How to use

Use with transformers

# pip install -U transformers accelerate bitsandbytes

import transformers

tokenizer = transformers.AutoTokenizer.from_pretrained("Sera-Network/sera-llama-3.1-8b-0.1")

# Load with 4-bit quantization so the model fits on smaller GPUs
model = transformers.AutoModelForCausalLM.from_pretrained(
    "Sera-Network/sera-llama-3.1-8b-0.1",
    device_map="auto",
    quantization_config=transformers.BitsAndBytesConfig(load_in_4bit=True),
)
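
Optionally, you can pass a more explicit 4-bit configuration. The settings below (NF4 quantization, bfloat16 compute, double quantization) are a suggested variant and not something this model requires; adjust them to your hardware.

# Optional: a more explicit 4-bit config (illustrative; adjust to your hardware)
import torch

quant_config = transformers.BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",              # NF4 usually preserves quality better than fp4
    bnb_4bit_compute_dtype=torch.bfloat16,  # use torch.float16 if bf16 is unsupported
    bnb_4bit_use_double_quant=True,         # small extra memory saving
)
model = transformers.AutoModelForCausalLM.from_pretrained(
    "Sera-Network/sera-llama-3.1-8b-0.1",
    device_map="auto",
    quantization_config=quant_config,
)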

# Helper to generate a response for a single user message
def generate(user_input: str):
    input_ids = tokenizer.apply_chat_template(
        [{"role": "user", "content": user_input}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(input_ids, do_sample=False, max_new_tokens=128)
    # Keep only the newly generated tokens, dropping the prompt
    preds = output[:, input_ids.shape[1]:]
    return tokenizer.decode(preds[0], skip_special_tokens=True)

# Warm up the model
generate("What's the capital of France?")

Example output

generate("What's the capital of Swizerland and Germany?")
# The capital of Switzerland is Bern. The capital of Germany is Berlin.

generate("Set up a host for the domain symbiont.me")
# [{"name": "add_host", "parameters": {"hostname": "symbiont.me"}}]

generate("Send an email to my friend Andrej wishing him a happy birthday.")
# [{"name": "send_email", "parameters": {"subject": "Happy Birthday", "body": "Dear Andrej, happy birthday! Best regards, [Your Name]"}}]

generate("Schedule a call with my manager tomorrow 7 am to discuss my promotion.")
# [{"name": "schedule_call", "parameters": {"date": "2024-07-27", "time": "07:00:00", "topic": "promotion"}}]