Available quantizations:
- 2.2bpw (significant quality loss; only for testing within 24 GB VRAM)
- 4.0bpw
- 6.0bpw
- 8.0bpw
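As a rough sanity check on which quantization fits your GPU, weight size scales linearly with bits per weight. The sketch below (an illustration only; it ignores KV-cache and activation overhead, which need extra headroom) estimates the weight footprint for a 70B-parameter model at each bpw level:

```python
def estimate_weight_gb(n_params: float, bpw: float) -> float:
    # Quantized weight size in decimal GB: params * bits-per-weight / 8 bits-per-byte
    return n_params * bpw / 8 / 1e9

N_PARAMS = 70e9  # Llama-3-Swallow-70B parameter count

for bpw in (2.2, 4.0, 6.0, 8.0):
    print(f"{bpw}bpw ~ {estimate_weight_gb(N_PARAMS, bpw):.2f} GB of weights")
```

At 2.2bpw the weights alone come to about 19.25 GB, which is why that variant is the only one that can be squeezed onto a single 24 GB card; 4.0bpw and above require multi-GPU or larger-VRAM setups.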
Llama-3-Swallow-70B-v0.1-exl2
- Model creator: tokyotech-llm
- Original model: Llama-3-Swallow-70B-v0.1
License
META LLAMA 3 COMMUNITY LICENSE
Citations
@misc{llama3swallow,
    title={Llama 3 Swallow},
    url={https://swallow-llm.github.io/llama3-swallow.en.html},
    author={Swallow LLM},
    year={2024}
}

@article{llama3modelcard,
    title={Llama 3 Model Card},
    author={AI@Meta},
    year={2024},
    url={https://github.com/meta-llama/llama3/blob/main/MODEL_CARD.md}
}