---
license: apache-2.0
datasets:
- databricks/databricks-dolly-15k
language:
- en
metrics:
- rouge
base_model:
- openai-community/gpt2-xl
pipeline_tag: text-generation
---
# teacher-gpt2-1.5B
[paper](https://arxiv.org/abs/2306.08543) | [code](https://github.com/microsoft/LMOps/tree/main/minillm)
**teacher-gpt2-1.5B** is a gpt2-xlarge (1.5B) model supervised fine-tuned on [databricks-dolly-15k](https://huggingface.co/datasets/aisquared/databricks-dolly-15k).
It serves as the teacher model for the MiniLLM series.
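## Usage
A minimal loading and generation sketch with Hugging Face Transformers. The repository id `MiniLLM/teacher-gpt2-1.5B` and the Dolly-style prompt format are assumptions and may differ from the actual release; adjust them to match the checkpoint you are using.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub repo id; substitute the actual path of this model card.
model_name = "MiniLLM/teacher-gpt2-1.5B"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Instruction-style prompt (the exact template used during fine-tuning may differ).
prompt = "Instruction: Explain knowledge distillation in one sentence.\nResponse:"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short completion from the fine-tuned teacher model.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```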
## Citation
```
@inproceedings{minillm,
title={MiniLLM: Knowledge Distillation of Large Language Models},
author={Gu, Yuxian and Dong, Li and Wei, Furu and Huang, Minlie},
booktitle={Proceedings of ICLR},
year={2024}
}
```