
GPT2-Codeparrot

Generative Pre-trained Transformer 2 (GPT-2) is a large language model from OpenAI, first introduced in the paper "Language Models are Unsupervised Multitask Learners". It is a decoder-only Transformer trained with a causal language modeling (CLM) objective: the model learns to predict the next word in a sequence given the previous words. GPT-2 models are known for their ability to generate realistic and coherent text, making them useful for a variety of natural language processing tasks such as text generation, translation, and question answering.
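As a sketch of that objective, the snippet below asks a GPT-2 model for its most likely next token given a prefix. It uses the public gpt2 checkpoint purely for illustration; any GPT-2-style causal LM behaves the same way.

```python
# Minimal sketch of the causal LM objective: predict the next token from the previous ones.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The quick brown fox", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits      # shape: (batch, sequence_length, vocab_size)

next_token_id = logits[0, -1].argmax()   # most probable next token
print(tokenizer.decode(next_token_id))
```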

Model description

This model is a base GPT-2 architecture with roughly 124M parameters. It was trained on the huggingface-course/codeparrot-ds-valid dataset, a small corpus of Python source code from the CodeParrot project used in the Hugging Face course, not the WebText corpus used to train the original GPT-2. Due to the limited training data, this model may not perform as well as other pre-trained GPT-2 models available on Hugging Face.
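The dataset itself can be inspected with the datasets library. This is a hedged sketch: the "validation" split and the "content" field follow the layout used in the Hugging Face course and may differ in other versions of the data.

```python
# Peek at the training corpus (split and field names assumed from the HF course layout).
from datasets import load_dataset

ds = load_dataset("huggingface-course/codeparrot-ds-valid", split="validation")
print(ds)                        # number of examples and column names
print(ds[0]["content"][:200])    # first 200 characters of one Python source file
```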

Intended uses & limitations

This model is intended for personal learning and exploration of the GPT-2 architecture. Due to its limited training data, it may not be suitable for real-world applications.
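For that kind of exploration, the model can be loaded through the text-generation pipeline and prompted with Python code, since that is what it was trained on. This is a minimal usage sketch; the generation settings are illustrative only.

```python
# Hedged usage sketch: complete a short Python snippet with this checkpoint.
from transformers import pipeline

generator = pipeline("text-generation", model="kailasps/GPT2-codeparrot")

prompt = "import numpy as np\n\ndef mean(values):"
completion = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(completion[0]["generated_text"])
```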

Training and evaluation data

The model was trained on the huggingface-course/codeparrot-ds-valid dataset described above.

Training procedure

This model was trained using the Transformers library with the following specifications:

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0005
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 256
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 1
  • mixed_precision_training: Native AMP
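For reference, the list above can be expressed as a transformers.TrainingArguments configuration. This is a hedged reconstruction, not the exact training script: the output directory name is illustrative, and Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the library's default optimizer setting.

```python
# Hedged reconstruction of the hyperparameters listed above using TrainingArguments.
# Argument names follow Transformers 4.42; "gpt2-codeparrot" is an illustrative output dir.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="gpt2-codeparrot",
    learning_rate=5e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=8,   # 32 * 8 = 256 total train batch size
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    num_train_epochs=1,
    seed=42,
    fp16=True,                       # native AMP mixed-precision training
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer.
)
```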

Framework versions

  • Transformers 4.42.4
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1