
rockymtn_gpt2 GPT-2 Model

This is a GPT-2 model fine-tuned on novels and books covering a variety of topics. It generates a free-form text continuation of an input prompt.

Model Details

  • Model Name: rockymtn_gpt2
  • Base Model: Custom
  • Fine-tuned on: 90 books of varying topics
  • Training Details: Training is ongoing; further fine-tuning is planned.

Usage

You can use this model with the transformers library:

from transformers import GPT2Tokenizer, GPT2LMHeadModel

# Load the tokenizer and fine-tuned weights from the Hugging Face Hub
tokenizer = GPT2Tokenizer.from_pretrained("kmcowan/rockymtn_gpt2")
model = GPT2LMHeadModel.from_pretrained("kmcowan/rockymtn_gpt2")

# Encode a prompt, generate a continuation of up to 50 tokens, and decode it
input_text = "The future of AI is"
input_ids = tokenizer.encode(input_text, return_tensors='pt')
outputs = model.generate(input_ids, max_length=50)
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)

print(generated_text)
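
By default, model.generate uses greedy decoding, which can produce short, repetitive continuations. The snippet below is a minimal sketch of sampling-based generation; the temperature, top_p, and max_length values are illustrative assumptions, not settings tuned for this model.

from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("kmcowan/rockymtn_gpt2")
model = GPT2LMHeadModel.from_pretrained("kmcowan/rockymtn_gpt2")

input_ids = tokenizer.encode("The future of AI is", return_tensors='pt')

# Sample instead of decoding greedily; these hyperparameters are illustrative, not tuned.
outputs = model.generate(
    input_ids,
    max_length=100,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
    no_repeat_ngram_size=3,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS to avoid a warning
)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))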

Model Format

  • Model size: 124M parameters
  • Tensor type: F32
  • Weights: Safetensors