---
license: mit
datasets:
- wikitext
language:
- en
metrics:
- accuracy
library_name: transformers
tags:
- general
---
Quantized GPT-2 model.

Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models. GPT-2 was pre-trained on WebText, a dataset of 8 million web pages. (Its predecessor, GPT-1, was pre-trained on BookCorpus, a dataset of over 7,000 unpublished fiction books from various genres.)
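Below is a minimal usage sketch with the Transformers library. The repo ID `your-username/gpt2-quantized` is a placeholder, not this model's actual identifier; replace it with the repository ID shown at the top of this model page.

```python
# Minimal sketch: load the quantized GPT-2 checkpoint and generate text.
# NOTE: "your-username/gpt2-quantized" is a placeholder repo ID, assumed
# for illustration — substitute this model's actual repository ID.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/gpt2-quantized"  # hypothetical repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize a prompt and sample a short continuation.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```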