---
datasets:
- ehartford/WizardLM_alpaca_evol_instruct_70k_unfiltered
language:
- en
---
# WizardLM finetuned on the MPT-7B model
Trained for 3 epochs on 1 x A100 80GB.

Training run (Weights & Biases): https://wandb.ai/wing-lian/mpt-wizard-7b/runs/2agnd9fz
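
## Usage

A minimal loading sketch, assuming the checkpoint is hosted on the Hugging Face Hub; the repository id below is a placeholder, not the actual repo name. MPT-based checkpoints ship custom modeling code, so `trust_remote_code=True` is required:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id (assumption) - replace with the actual repository name.
model_id = "your-username/mpt-wizard-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# MPT uses custom modeling code, hence trust_remote_code=True.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    trust_remote_code=True,
)

prompt = "Explain the difference between a list and a tuple in Python."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```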