Uploaded model
- Developed by: acbdkk
- License: apache-2.0
- Fine-tuned from model: unsloth/llama-3-8b-bnb-4bit
This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
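For reference, a minimal sketch of the Unsloth + TRL fine-tuning setup this implies. The dataset, LoRA rank, and training arguments below are illustrative assumptions rather than the values used for this model, and some SFTTrainer argument names differ between TRL versions.

from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import Dataset

# Load the 4-bit base model and attach LoRA adapters (hyperparameters are illustrative).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "unsloth/llama-3-8b-bnb-4bit",
    max_seq_length = 2048,
    load_in_4bit = True,
)
model = FastLanguageModel.get_peft_model(
    model,
    r = 16,
    lora_alpha = 16,
    target_modules = ["q_proj", "k_proj", "v_proj", "o_proj",
                      "gate_proj", "up_proj", "down_proj"],
)

# Toy stand-in dataset with a single "text" column; the actual training data is not stated in this card.
dataset = Dataset.from_dict({"text": ["### Instruction:\nSay hello.\n\n### Response:\nHello!"]})

trainer = SFTTrainer(
    model = model,
    tokenizer = tokenizer,
    train_dataset = dataset,
    dataset_text_field = "text",
    max_seq_length = 2048,
    args = TrainingArguments(
        per_device_train_batch_size = 2,
        gradient_accumulation_steps = 4,
        max_steps = 60,
        learning_rate = 2e-4,
        output_dir = "outputs",
    ),
)
trainer.train()
model.save_pretrained("lora_model")  # saves only the LoRA adapter weights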
How to use:
from unsloth import FastLanguageModel
# Downloads this repo's LoRA adapters and loads them on top of the 4-bit base model.
model, tokenizer = FastLanguageModel.from_pretrained(model_name = "acbdkk/lora_model")
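Once loaded, the model can be used for generation as usual; the prompt and generation settings below are illustrative.

# Switch to Unsloth's faster inference mode and generate.
FastLanguageModel.for_inference(model)
inputs = tokenizer("Write a haiku about the sea.", return_tensors = "pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens = 64)
print(tokenizer.decode(outputs[0], skip_special_tokens = True))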
Model tree for acbdkk/lora_model
- Base model: meta-llama/Meta-Llama-3-8B
- Quantized (4-bit): unsloth/llama-3-8b-bnb-4bit