
Llama-3-Swallow-70B-v0.1-exl2

ExLlamaV2 (exl2) quantizations of Llama-3-Swallow-70B-v0.1, offered at the following bit rates (a loading sketch follows the list):

2.2bpw (high quality loss; intended only for testing within 24 GB of VRAM)
4.0bpw
6.0bpw
8.0bpw
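
These weights load with the exllamav2 library. The snippet below is a minimal sketch, assuming the chosen bpw variant has already been downloaded to a local directory; the path, sampling settings, and prompt are illustrative and not part of this repository.

```python
# Minimal ExLlamaV2 loading sketch. The model_dir path is an assumption:
# point it at wherever you downloaded the bpw variant that fits your GPU(s).
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/models/Llama-3-Swallow-70B-v0.1-exl2-4.0bpw"  # assumed local path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # lazy cache so autosplit can place layers
model.load_autosplit(cache)               # spread the model across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()    # illustrative sampling settings
settings.temperature = 0.7
settings.top_p = 0.9

print(generator.generate_simple("Tell me about Mount Fuji.", settings, 128))
```

Lower bit rates trade output quality for VRAM: the 2.2bpw variant exists mainly so the 70B model can be tried on a single 24 GB card.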

License

META LLAMA 3 COMMUNITY LICENSE

Citations

@misc{llama3swallow,
    title={Llama 3 Swallow},
    author={Swallow LLM},
    year={2024},
    url={https://swallow-llm.github.io/llama3-swallow.en.html}
}

@article{llama3modelcard,
    title={Llama 3 Model Card},
    author={AI@Meta},
    year={2024},
    url={https://github.com/meta-llama/llama3/blob/main/MODEL_CARD.md}
}