
Introduction

CodeKobzar13B is a generative model trained on Ukrainian Wikipedia data and Ukrainian language rules. It has knowledge of Ukrainian history, language, literature, and culture.

Model Information

This model is based on vicuna-13b-v1.5.

Model Usage

Use the following prompt template:
USER: {input} ASSISTANT:
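
For example, with the question used in the Inference snippet below ("Яке місто в Україні називають найромантичнішим?", "Which city in Ukraine is called the most romantic?"), the full prompt passed to the model is:

USER: Яке місто в Україні називають найромантичнішим? ASSISTANT: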

We recommend the following generation settings:

Temperature: 0.8
Top-p: 0.95
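
These values can be passed directly to model.generate, as in the Inference example below, or collected into a transformers GenerationConfig. A minimal sketch, assuming the model, tokenizer, and input_ids are already prepared as in the Inference section:

from transformers import GenerationConfig

# Recommended sampling settings for CodeKobzar13B
generation_config = GenerationConfig(
    do_sample=True,      # sample from the distribution instead of greedy decoding
    temperature=0.8,     # recommended temperature
    top_p=0.95,          # recommended nucleus (top-p) threshold
    max_new_tokens=150,  # generation length used in the Inference example
)

outputs = model.generate(input_ids=input_ids, generation_config=generation_config)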

Inference

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_path = "ponoma16/CodeKobzar13B"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    low_cpu_mem_usage=True,
    torch_dtype=torch.float16,
    load_in_8bit=True,    # 8-bit quantized loading; requires bitsandbytes
    device_map="auto",    # automatic device placement; requires accelerate
)
model.eval()

prompt = "Яке місто в Україні називають найромантичнішим?"  # "Which city in Ukraine is called the most romantic?"

PROMPT_TEMPLATE = """USER: {prompt} ASSISTANT: """

input_ids = tokenizer(
    PROMPT_TEMPLATE.format(prompt=prompt),  # insert the question into the prompt template
    return_tensors="pt",
    truncation=True,
).input_ids.cuda()
outputs = model.generate(
    input_ids=input_ids,
    do_sample=True,
    top_p=0.95,
    max_new_tokens=150,
    temperature=0.8,  # recommended temperature
)
prediction = tokenizer.batch_decode(outputs.cpu().numpy(), skip_special_tokens=True)[0]
print(prediction)
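
Note that load_in_8bit=True requires the bitsandbytes package and device_map='auto' requires accelerate (both installable with pip), in addition to torch and transformers. Also, since generate returns the prompt tokens together with the newly generated ones, the decoded prediction begins with the prompt itself; strip everything up to and including "ASSISTANT:" if you want only the model's answer.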

Contact

If you have any inquiries, please feel free to raise an issue or reach out to us via email at: [email protected], [email protected]. We're here to assist you!
