Quant of brucethemoose's Yi-34B-200K-DARE-merge-v5.

Fits into 24 GB of VRAM with 16k context on Windows.

pippa_cleaned was used as the calibration dataset, with a calibration length of 8192 tokens.
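As a minimal loading sketch: exl2 quants like this one are typically run with the ExLlamaV2 library. The model directory path below is hypothetical, and `max_seq_len = 16384` reflects the 16k-context / 24 GB fit noted above; adjust both for your setup.

```python
# Hedged sketch: load an exl2 quant with ExLlamaV2 and generate a short completion.
# The model_dir path is hypothetical -- point it at your local download of the quant.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "Yi-34B-200K-DARE-merge-v5-exl2"  # hypothetical local path
config.prepare()
config.max_seq_len = 16384  # 16k context, matching the 24 GB fit described above

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available VRAM automatically

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8

print(generator.generate_simple("Once upon a time,", settings, num_tokens=64))
```

This requires the quantized weights on disk, so it is not runnable as-is; it only illustrates the usual ExLlamaV2 loading flow for an exl2 quant.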