---
license: apache-2.0
---
# T5-XXL Encoder
This repo contains copies of the T5-XXL encoder in several precision and quantization formats. The models in this repo are intended for use in [InvokeAI](https://github.com/invoke-ai/InvokeAI).
Contents:
- `bfloat16/` - T5-XXL encoder cast to bfloat16. Copied from [here](https://huggingface.co/black-forest-labs/FLUX.1-schnell/tree/741f7c3ce8b383c54771c7003378a50191e9efe9/text_encoder_2).
- `bnb_llm_int8/` - T5-XXL encoder quantized using bitsandbytes LLM.int8() quantization.
- `optimum_quanto_qfloat8/` - T5-XXL encoder quantized using [optimum-quanto](https://github.com/huggingface/optimum-quanto) qfloat8 quantization.
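
As a rough guide, the sketch below shows one way to load the `bfloat16/` copy with `transformers` and, optionally, re-quantize it locally to qfloat8 with optimum-quanto or to LLM.int8() with bitsandbytes. The repo id used here is a placeholder assumption, not part of this card; substitute this repo's actual id, and note that InvokeAI itself handles loading differently.

```python
import torch
from transformers import BitsAndBytesConfig, T5EncoderModel

# Placeholder repo id for illustration only; replace with this repo's actual id.
REPO_ID = "your-org/t5-xxl-encoder"

# Load the bfloat16 copy from the `bfloat16/` subfolder.
encoder = T5EncoderModel.from_pretrained(
    REPO_ID,
    subfolder="bfloat16",
    torch_dtype=torch.bfloat16,
)

# Option A: quantize weights to qfloat8 with optimum-quanto
# (same scheme as `optimum_quanto_qfloat8/`, but applied locally).
from optimum.quanto import freeze, qfloat8, quantize

quantize(encoder, weights=qfloat8)
freeze(encoder)

# Option B: load with bitsandbytes LLM.int8() instead
# (same scheme as `bnb_llm_int8/`, but applied at load time).
encoder_int8 = T5EncoderModel.from_pretrained(
    REPO_ID,
    subfolder="bfloat16",
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
)
```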