---
datasets:
- tatsu-lab/alpaca
language:
- en
metrics:
- accuracy
base_model: openai-community/gpt2
pipeline_tag: text-generation
library_name: transformers
---

This model is a fine-tuned version of [openai-community/gpt2](https://huggingface.co/openai-community/gpt2) on the [tatsu-lab/alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca) dataset. It achieves the following results on the evaluation set:
- Loss: 1.826895
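Since the model is tuned on Alpaca-style instruction data, it can be queried with the standard `transformers` text-generation pipeline. The sketch below is a minimal example; the repository id passed to `pipeline` is a placeholder (this card does not state the repo name) and the prompt template is the common Alpaca format, which is an assumption about how the data was formatted during fine-tuning.

```python
def build_prompt(instruction: str) -> str:
    """Format an instruction in the Alpaca prompt style (assumed training format)."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:\n"
    )

if __name__ == "__main__":
    # Requires `pip install transformers torch`.
    from transformers import pipeline

    # Placeholder repo id -- replace with this model's actual repository name.
    generator = pipeline("text-generation", model="your-username/gpt2-alpaca")
    out = generator(
        build_prompt("Explain gravity in one sentence."),
        max_new_tokens=64,
        do_sample=True,
    )
    print(out[0]["generated_text"])
```

Note that the base GPT-2 context window is 1024 tokens, so long instructions plus generated tokens must stay under that limit.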