---
license: apache-2.0
tags:
- merge
- not-for-all-audiences
---

## **BigMaid-20B-v1.0**

[exllamav2](https://github.com/turboderp/exllamav2) quant for [TeeZee/BigMaid-20B-v1.0](https://huggingface.co/TeeZee/BigMaid-20B-v1.0)

Should run on 12 GB VRAM cards in webui with the context length set to 4096, the ExLlamav2_HF loader, and cache_8bit=True.

All comments are greatly appreciated. Download, test, and if you appreciate my work, consider buying me my fuel: Buy Me A Coffee
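A launch command along these lines should apply those settings in text-generation-webui; this is a sketch, and the model directory name (`BigMaid-20B-v1.0-exl2`) is a placeholder for wherever you downloaded the quant.

```shell
# Hypothetical launch sketch for text-generation-webui:
# ExLlamav2_HF loader, 4096-token context, 8-bit cache.
# Assumes the quant sits in models/BigMaid-20B-v1.0-exl2.
python server.py \
  --model BigMaid-20B-v1.0-exl2 \
  --loader ExLlamav2_HF \
  --max_seq_len 4096 \
  --cache_8bit
```

The 8-bit cache roughly halves the KV-cache VRAM footprint, which is what lets the 20B quant fit alongside a 4096-token context on a 12 GB card.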