
Chronomaid-Storytelling-13b

An exllamav2 quant of NyxKrage/Chronomaid-Storytelling-13b.

Runs smoothly on a single 3090 in webui with the context length set to 4096, the ExLlamav2_HF loader, and cache_8bit=True.
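As a sketch, those settings could be passed to text-generation-webui from the command line roughly like this (flag names and the local model directory name are assumptions; check your webui version's supported flags):

```shell
# Hypothetical launch command for text-generation-webui:
# load the quant with the ExLlamav2_HF loader, 4096-token context,
# and the 8-bit cache to fit comfortably in 24 GB of VRAM.
python server.py \
  --model TeeZee_chronomaid-storytelling-13B-bpw8.0-h8-exl2 \
  --loader ExLlamav2_HF \
  --max_seq_len 4096 \
  --cache_8bit
```

The 8-bit cache roughly halves the KV-cache memory footprint, which is what leaves headroom for the full 4096-token context on a 24 GB card.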

All comments are greatly appreciated. Download it, test it, and if you appreciate my work, consider buying me my fuel: Buy Me A Coffee

