
This is a MicroBERT model for Wolof.

  • Its suffix is -mx, which means that it was pretrained using supervision from masked language modeling and XPOS tagging.
  • The unlabeled Wolof data was taken from a February 2022 dump of Wolof Wikipedia, totaling 517,237 tokens.
  • The UD treebank UD_Wolof-WTB, v2.9, totaling 44,258 tokens, was used for labeled data.

Please see the repository and the paper for more details.
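Below is a minimal quick-start sketch for loading the model with the Hugging Face transformers library and probing its masked language modeling head. The model ID and the example Wolof sentence are placeholders for illustration; substitute this repository's actual Hub ID.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Placeholder Hub ID -- replace with this repository's actual model ID.
model_id = "user/microbert-wolof-mx"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Quick check of the MLM head via the fill-mask pipeline.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Illustrative Wolof sentence with one token masked.
print(fill(f"Dakar mooy {tokenizer.mask_token} Senegaal."))
```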
