Misted-Toppy-11B
Thanks to mradermacher for the quants! You can find them here: mradermacher/Misted-Toppy-11B-GGUF
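If you want to try the quants locally, here's a rough sketch using llama-cpp-python. The quant filename pattern is an assumption on my part, so check the GGUF repo for the files that actually exist.

```python
# Minimal sketch (untested): pulling a quant from the GGUF repo with llama-cpp-python.
# The filename pattern below is an assumption -- use whichever quant the repo actually has.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="mradermacher/Misted-Toppy-11B-GGUF",
    filename="*Q4_K_M.gguf",  # assumed quant level
    n_ctx=4096,
)

out = llm("Write a one-line greeting.", max_tokens=32)
print(out["choices"][0]["text"])
```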
My first model ever. Roast me.
It's a frankenmerge of Walmart-the-bag/Misted-v2-7B and Undi95/Toppy-M-7B. It seems to work alright, though I've only tested it for about two seconds. A bit dumb, but decent for RP.
It's also censored AF. Bruh
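If you'd rather load the full-precision merge with transformers, here's a rough sketch. The repo id is a placeholder, not the real path to this model, so swap in the actual one.

```python
# Rough sketch, assuming the merge loads like any other causal LM on the Hub.
# "your-namespace/Misted-Toppy-11B" is a placeholder repo id, not the real path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/Misted-Toppy-11B"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

inputs = tokenizer("Hello there!", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```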
Anyway, have a nice day!