## MechDistilGPT2
This model is fine-tuned on 200k sentences scraped from mechanical/automotive PDF books.
The base model is DistilGPT2 (https://huggingface.co/gpt2), the smallest version of GPT-2.
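A minimal sketch of loading such a model for text generation with the Transformers `pipeline` API. The fine-tuned checkpoint's Hub id is not given in this card, so `distilgpt2` (the base model) stands in below; replace it with the actual repo id:

```python
from transformers import pipeline

# "distilgpt2" is a stand-in for the fine-tuned checkpoint id (assumption).
generator = pipeline("text-generation", model="distilgpt2")

result = generator("The crankshaft converts", max_new_tokens=20)
print(result[0]["generated_text"])
```

The pipeline returns a list of dicts whose `generated_text` field includes the prompt followed by the model's continuation.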
## Fine-Tuning
- Training arguments: defaults
- Epochs: 3
- Perplexity: 48
- Training set: 200k sentences
- Validation set: 40k sentences
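For reference, perplexity is the exponential of the mean per-token cross-entropy loss, so the reported perplexity of 48 corresponds to a validation loss of about 3.87 nats per token:

```python
import math

# Perplexity = exp(mean cross-entropy loss per token),
# so loss = ln(perplexity).
perplexity = 48
loss = math.log(perplexity)
print(round(loss, 2))  # ≈ 3.87
```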
## Framework versions
- Transformers 4.7.0.dev0
- Pytorch 1.8.1+cu111
- Datasets 1.6.2
- Tokenizers 0.10.2