---
license: apache-2.0
---

# rockymtn_gpt2: A Fine-Tuned GPT-2 Model

This is a GPT-2 model fine-tuned on novels and books covering a range of topics. It generates a continuation of the input prompt.

## Model Details

- **Model Name:** rockymtn_gpt2
- **Base Model:** Custom
- **Fine-tuned on:** 90 books of varying topics
- **Training Details:** Additional training is planned.

## Usage

You can use this model with the `transformers` library:

```python
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("kmcowan/rockymtn_gpt2")
model = GPT2LMHeadModel.from_pretrained("kmcowan/rockymtn_gpt2")

# Encode the prompt and generate a continuation of up to 50 tokens
input_text = "The future of AI is"
input_ids = tokenizer.encode(input_text, return_tensors="pt")
outputs = model.generate(
    input_ids,
    max_length=50,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS to silence the warning
)

# Decode the generated token IDs back into text
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
```