## GPT-2 for Skript

## Complete your Skript automatically via a finetuned GPT-2 model
Training loss of `0.57` after about 2 epochs (in total).
The dataset contains 1.2 million lines of Skript.
Inference Colab: https://colab.research.google.com/drive/1ujtLt7MOk7Nsag3q-BYK62Kpoe4Lr4PE |
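
For a rough idea of how inference could look outside the Colab, here is a minimal sketch using the Hugging Face `transformers` library. The checkpoint path, prompt, and generation settings below are placeholder assumptions, not the exact setup used in the notebook:

```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# Placeholder path: point this at the finetuned GPT-2 checkpoint from the Colab.
MODEL_PATH = "path/to/finetuned-gpt2-skript"

tokenizer = GPT2TokenizerFast.from_pretrained(MODEL_PATH)
model = GPT2LMHeadModel.from_pretrained(MODEL_PATH)
model.eval()

# A partial Skript snippet for the model to complete.
prompt = "on join:\n    send"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,          # length of the generated completion
    do_sample=True,             # sample instead of greedy decoding
    top_p=0.95,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Sampling parameters (`top_p`, `temperature`) are example values; the Colab notebook remains the reference workflow for inference.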