# Mistral-4B

A ~4B-parameter model produced by extracting a subset of layers from fhai50032/RolePlayLake-7B using mergekit's `passthrough` merge method.

## 🧩 Configuration
```yaml
dtype: bfloat16
merge_method: passthrough
slices:
  - sources:
      - layer_range: [0, 8]
        model: fhai50032/RolePlayLake-7B
  - sources:
      - layer_range: [11, 12]
        model: fhai50032/RolePlayLake-7B
  - sources:
      - layer_range: [15, 16]
        model: fhai50032/RolePlayLake-7B
  - sources:
      - layer_range: [19, 20]
        model: fhai50032/RolePlayLake-7B
  - sources:
      - layer_range: [24, 25]
        model: fhai50032/RolePlayLake-7B
  - sources:
      - layer_range: [28, 32]
        model: fhai50032/RolePlayLake-7B
```
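Mergekit treats each `layer_range` as a half-open interval `[start, end)`, so the slices above keep 16 of the donor model's 32 transformer layers, which is where the roughly-4B parameter count comes from. A quick sketch of the arithmetic:

```python
# Layer ranges copied from the merge config above; mergekit reads them
# as half-open intervals [start, end).
slices = [(0, 8), (11, 12), (15, 16), (19, 20), (24, 25), (28, 32)]

kept = sum(end - start for start, end in slices)
print(f"layers kept: {kept} of 32")  # 16 of 32 -> roughly half the 7B parameters
```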