Add note that this is the smallest version of the model

#18
Files changed (1)
  1. README.md +4 -0
README.md CHANGED
@@ -34,6 +34,10 @@ This way, the model learns an inner representation of the English language that
  useful for downstream tasks. The model is best at what it was pretrained for however, which is generating texts from a
  prompt.
 
+ This is the **smallest** version of GPT-2, with 124M parameters.
+
+ **Related Models:** [GPT-Large](https://huggingface.co/gpt2-large), [GPT-Medium](https://huggingface.co/gpt2-medium) and [GPT-XL](https://huggingface.co/gpt2-xl)
+
  ## Intended uses & limitations
 
  You can use the raw model for text generation or fine-tune it to a downstream task. See the
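
For context on the note being added, here is a minimal sketch of running text generation with this smallest 124M-parameter checkpoint via the 🤗 Transformers `pipeline` API. It assumes the checkpoint is published under the `gpt2` model ID (the base repo implied by the related `gpt2-medium`/`gpt2-large`/`gpt2-xl` links), which is not stated explicitly in this diff.

```python
# Sketch: generate text with the smallest GPT-2 checkpoint (124M parameters).
# Assumes the model ID is `gpt2`; swap in gpt2-medium / gpt2-large / gpt2-xl
# for the larger versions linked in the added note.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the sampled continuations reproducible

# Produce a few continuations from a prompt, the task the model was pretrained for.
outputs = generator("Hello, I'm a language model,", max_length=30, num_return_sequences=3)
for out in outputs:
    print(out["generated_text"])
```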