LLAMA 3 8B capable of outputting Traditional Chinese
✨ Recommend using LMStudio for this model
I tried running it with Ollama, but it got quite delulu, so for now I'm sticking with LMStudio :)

The performance isn't actually that great, but it can answer some basic questions. Sometimes it just acts really dumb though :(
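Once the model is loaded in LMStudio, its local server mode exposes an OpenAI-compatible HTTP API, so you can query it from a script. A minimal sketch using only the Python standard library; the port (1234 is LMStudio's default), endpoint path, and model identifier are assumptions that depend on your local setup:

```python
import json
import urllib.request

# Assumed default: LMStudio's local server listens on port 1234 and
# serves an OpenAI-compatible /v1/chat/completions endpoint.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt, model="suko/Meta-Llama-3-8B-CHT"):
    """Build (but do not send) an OpenAI-style chat-completions request."""
    payload = {
        "model": model,  # the name may differ in your local LMStudio install
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("請用繁體中文介紹你自己")
# To actually send it (requires the LMStudio server to be running):
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

The request is built separately from being sent, so you can inspect the JSON payload before pointing it at a running server.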
LLAMA 3.1 can actually output Chinese pretty well, so this repo can be ignored.
Model tree for suko/Meta-Llama-3-8B-CHT
- Base model: meta-llama/Meta-Llama-3-8B
- Quantized via: unsloth/llama-3-8b-bnb-4bit