---
license: mit
datasets:
  - Replete-AI/code_bagel
---

# Phi-nut-Butter-Codebagel-v1


## Model Details

- **Model Name:** Phi-nut-Butter-Codebagel-v1
- **Quantization:** 4-bit GPTQ

## Quantization Details

This is a 4-bit GPTQ quantization of [thesven/Phi-nut-Butter-Codebagel-v1](https://huggingface.co/thesven/Phi-nut-Butter-Codebagel-v1). For more details on the base model, please see its model card.
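
For reference, the sketch below shows how a 4-bit GPTQ quantization like this one can be produced with `transformers`' `GPTQConfig` (which requires the `optimum` and `auto-gptq` packages). It is illustrative only, not the exact recipe used for this repository; in particular, the calibration dataset and output directory are assumptions.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, GPTQConfig

base_model = "thesven/Phi-nut-Butter-Codebagel-v1"
tokenizer = AutoTokenizer.from_pretrained(base_model, use_fast=True)

# 4-bit GPTQ, as in this repository; "c4" is an assumed calibration dataset,
# not necessarily the one used for this model.
gptq_config = GPTQConfig(bits=4, dataset="c4", tokenizer=tokenizer)

# Passing quantization_config runs GPTQ calibration and quantizes the weights at load time.
quantized = AutoModelForCausalLM.from_pretrained(
    base_model,
    quantization_config=gptq_config,
    device_map="auto",
)

# The quantized weights can then be saved as a standalone repository.
quantized.save_pretrained("Phi-nut-Butter-Codebagel-v1-GPTQ")
tokenizer.save_pretrained("Phi-nut-Butter-Codebagel-v1-GPTQ")
```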

## Intended Use

This model is designed to improve instruction-following capabilities, particularly for code-related tasks.

## Getting Started

### Instruct Template

```
<|system|>
{system_message} <|end|>
<|user|>
{prompt} <|end|>
<|assistant|>
```
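
If the tokenizer for this repository ships a chat template that accepts a system message (an assumption; otherwise format the prompt string manually as shown above), the same prompt can be built with `apply_chat_template`:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("thesven/Phi-nut-Butter-Codebagel-v1-GPTQ")

messages = [
    {"role": "system", "content": "You are an expert developer. Please help me with any coding questions."},
    {"role": "user", "content": "In typescript how would I use a function that looks like this <T>(config:T):T"},
]

# Renders the messages into the <|system|>/<|user|>/<|assistant|> format and
# appends the assistant tag so generation continues from there.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```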

### Transformers

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name_or_path = "thesven/Phi-nut-Butter-Codebagel-v1-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name_or_path,
    device_map="auto",
    trust_remote_code=False,
    revision="main",
)
# Use the end-of-sequence token for padding, since the model has no dedicated pad token.
model.config.pad_token_id = model.config.eos_token_id

# Build the prompt in the instruct format shown above.
prompt_template = '''
<|system|>
You are an expert developer. Please help me with any coding questions.<|end|>
<|user|>
In typescript how would I use a function that looks like this <T>(config:T):T<|end|>
<|assistant|>
'''

input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(
    inputs=input_ids,
    temperature=0.1,
    do_sample=True,
    top_p=0.95,
    top_k=40,
    max_new_tokens=256,
)

# Decode only the newly generated tokens, skipping the prompt.
generated_text = tokenizer.decode(output[0, len(input_ids[0]):], skip_special_tokens=True)
print(generated_text)
```
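
For interactive use, generation can optionally be streamed token by token with `transformers`' `TextStreamer`. The snippet below is a variation on the example above and reuses the `model`, `tokenizer`, and `input_ids` defined there.

```python
from transformers import TextStreamer

# Prints tokens to stdout as they are generated; skip_prompt avoids echoing the input prompt.
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

model.generate(
    inputs=input_ids,
    streamer=streamer,
    temperature=0.1,
    do_sample=True,
    top_p=0.95,
    top_k=40,
    max_new_tokens=256,
)
```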