---
datasets:
- ehartford/dolphin
license: apache-2.0
---
**Base Model:** iamplus/mpt-30b-v2

**Tool:** MosaicML's llm-foundry (https://github.com/mosaicml/llm-foundry)

**Dataset:** Entire flan1m-GPT4 dataset

**Config yaml with Model Params:** https://huggingface.co/iamplus/mpt-30b-v3/blob/main/mpt-30b_orca.yaml

***Description:*** **mosaicml/mpt-30b** -> Fine-tuning on (Entire flan3m-GPT3.5 dataset for 1 epoch) -> **iamplus/mpt-30b-v2** -> Fine-tuning on (Entire flan1m-GPT4 dataset for 1 epoch) -> **iamplus/mpt-30b-v3**
**Prompt Format:**
```
<system>: [system prompt]
<human>: [question]
<bot>:
```
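A minimal sketch of assembling a prompt in this format, assuming the `<system>:` / `<human>:` / `<bot>:` markers are used verbatim as shown above (the helper name `build_prompt` is hypothetical; verify the exact token spelling and newline placement against the model's tokenizer before use):

```python
def build_prompt(question: str, system_prompt: str = "") -> str:
    """Format a single-turn prompt in the <system>/<human>/<bot> style
    shown in the model card. Assumes one newline between segments."""
    parts = []
    if system_prompt:
        parts.append(f"<system>: {system_prompt}")
    parts.append(f"<human>: {question}")
    # The model is expected to continue the text after "<bot>:".
    parts.append("<bot>:")
    return "\n".join(parts)

prompt = build_prompt(
    "What is the capital of France?",
    system_prompt="You are a helpful assistant.",
)
print(prompt)
```

The resulting string can then be passed to the model's tokenizer and `generate` call as usual.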