
GPT2 in Swahili

This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.
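Because the checkpoint was trained with Flax, the weights can also be loaded directly into a Flax model without conversion. A minimal sketch, assuming the repository ships Flax weights (standard for JAX/Flax Community Week checkpoints):

from transformers import FlaxAutoModelForCausalLM

# Load the JAX/Flax weights directly, skipping any PyTorch conversion
flax_model = FlaxAutoModelForCausalLM.from_pretrained("flax-community/gpt2-swahili")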

How to use

from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("flax-community/gpt2-swahili")
model = AutoModelForCausalLM.from_pretrained("flax-community/gpt2-swahili")

# Report the parameter count in millions
print(round(model.num_parameters() / 1_000_000), "Million Parameters")

124 Million Parameters
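Once the model and tokenizer are loaded, text can be generated as usual; the prompt and sampling settings below are illustrative, not taken from the original card:

# Generate a short Swahili continuation (prompt and settings are illustrative)
inputs = tokenizer("Habari ya leo ni", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_k=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))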

Training Data:

This model was trained on the Swahili Safi dataset.
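To inspect the training corpus with the datasets library, something like the sketch below should work; the dataset identifier flax-community/swahili-safi is an assumption here, so verify the exact name on the Hugging Face Hub:

from datasets import load_dataset

# "flax-community/swahili-safi" is an assumed Hub identifier; confirm before use
dataset = load_dataset("flax-community/swahili-safi")
print(dataset["train"][0])  # peek at the first training example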

More Details:

For more details and a demo, please check the HF Swahili Space.
