---
base_model: mistralai/Mistral-7B-Instruct-v0.3
datasets:
- nroggendorff/eap
language:
- en
license: mit
tags:
- trl
- sft
- art
- code
- adam
- mistral
model-index:
- name: eap
  results: []
pipeline_tag: text-generation
---
# Edgar Allan Poe LLM
EAP is a language model fine-tuned on the nroggendorff/eap dataset using Supervised Fine-Tuning (SFT) with the Transformer Reinforcement Learning (TRL) library. It is based on Mistral-7B-Instruct-v0.3.
## Features
- Fine-tuned with SFT via the TRL library (see the training sketch below)
- Supports the English language
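
For reference, here is a minimal sketch of what the SFT setup may have looked like using TRL's `SFTTrainer`. The actual training script and hyperparameters are not published, so the dataset split, output directory, and arguments below are assumptions:

```python
# Hypothetical SFT sketch -- the real training configuration is not published.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# The dataset named in the model card metadata.
dataset = load_dataset("nroggendorff/eap", split="train")

trainer = SFTTrainer(
    model="mistralai/Mistral-7B-Instruct-v0.3",  # base model from the metadata
    train_dataset=dataset,
    args=SFTConfig(output_dir="mistral-eap"),    # output name is illustrative
)
trainer.train()
```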
## Usage
To use the model, load it with the Hugging Face Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
import torch

# 4-bit NF4 quantization so the 7B model fits on a single consumer GPU
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model_id = "nroggendorff/mistral-eap"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place the quantized weights on the available GPU(s)
)

prompt = "[INST] Write a poem about tomatoes in the style of Poe. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)

generated_text = tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]
print(generated_text)
```
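
If the fine-tune preserves the base Mistral instruct chat template (an assumption, since the card does not say), the `[INST]` wrapping can also be produced with `tokenizer.apply_chat_template` instead of hand-writing the tags:

```python
# Assumes the tokenizer still carries the base model's chat template.
messages = [
    {"role": "user", "content": "Write a poem about tomatoes in the style of Poe."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```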
## License
This project is licensed under the MIT License.