---
language: bn
license: mit
---

# Bengali GPT-2

Bengali GPT-2 demo, built as part of the Hugging Face JAX/Flax community event. It also features a model fine-tuned on Bengali song lyrics.

## Model Description

The OpenAI GPT-2 model was proposed in the paper Language Models are Unsupervised Multitask Learners. The original GPT-2 is a causal (unidirectional) transformer pretrained with a language modeling objective on a very large corpus of roughly 40 GB of text. This model has the same configuration but has been pretrained on the Bengali subset of the mC4 (multilingual C4) dataset. The training code is fully open-sourced [here](https://huggingface.co/flax-community/gpt2-bengali/tree/main).
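As a quick sanity check that the checkpoint follows the standard GPT-2 architecture, the model and tokenizer can be loaded with the Auto classes and the config inspected (a minimal sketch; the values in the comment assume the base GPT-2 size):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt2-bengali")
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt2-bengali")

# Expected to mirror base GPT-2 (12 layers, 12 heads, 768-dim embeddings);
# the tokenizer, unlike GPT-2's, is trained on Bengali text.
print(model.config.n_layer, model.config.n_head, model.config.n_embd)
```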

## Training Details

- Overall result: eval loss 1.45, eval perplexity 3.141
- Data: Bengali subset of mC4 (mC4-bn)
- Training steps: 250k
- Model: [🤗 flax-community/gpt2-bengali](https://huggingface.co/flax-community/gpt2-bengali)
- Demo: https://huggingface.co/spaces/flax-community/Gpt2-bengali

## Usage

The model can be used in several ways. For example, the `pipeline` API can be used directly to generate text:

```python
from transformers import pipeline

gpt2_bengali = pipeline(
    "text-generation",
    model="flax-community/gpt2-bengali",
    tokenizer="flax-community/gpt2-bengali",
)
```
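The pipeline can then be called on a Bengali prompt; the prompt and generation settings below are only illustrative:

```python
# Generate a continuation of a Bengali prompt (settings are illustrative).
outputs = gpt2_bengali("আমি বাংলায়", max_length=30, num_return_sequences=1)
print(outputs[0]["generated_text"])
```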

Similarly, the model fine-tuned on Bangla song lyrics can be used as follows:

```python
from transformers import pipeline

singer = pipeline(
    "text-generation",
    model="khalidsaifullaah/bengali-lyricist-gpt2",
    tokenizer="khalidsaifullaah/bengali-lyricist-gpt2",
)
```

To use the model on other tasks, it needs to be fine-tuned on a custom dataset. Details can be found in the Hugging Face documentation. A minimal fine-tuning sketch is shown below.
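The sketch uses the PyTorch `Trainer` API; the dataset file, column name, and hyperparameters are placeholders, not the settings used for the original Flax training run:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "flax-community/gpt2-bengali"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Replace "train.txt" with your own Bengali text file; a "text" column is assumed.
dataset = load_dataset("text", data_files={"train": "train.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gpt2-bengali-finetuned",
        per_device_train_batch_size=4,
        num_train_epochs=1,
    ),
    train_dataset=tokenized["train"],
    # mlm=False gives the causal language modeling objective used by GPT-2.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```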

## Contributors

- Khalid Saifullah
- Tasmiah Tahsin Mayeesha
- Ritobrata Ghosh
- Ibrahim Musa
- M Saiful Bari