
AIOX Lab and SI2M Lab INSEA have joined forces to offer researchers, industry practitioners, and the NLP (Natural Language Processing) community the first open-source intelligent system that understands the Moroccan dialect "Darija".

DarijaBERT is the first BERT model for the Moroccan Arabic dialect "Darija". It uses the same architecture as BERT-base, but without the Next Sentence Prediction (NSP) objective. The model was trained on ~3 million sequences of Darija text, representing 691MB of data, or a total of ~100M tokens.

The model was trained on a dataset issued from three different sources:

  • Stories written in Darija scraped from a dedicated website
  • YouTube comments from 40 different Moroccan channels
  • Tweets crawled based on a list of Darija keywords

More details about DarijaBERT are available in the dedicated GitHub repository.

Loading the model

The model can be loaded directly with the Hugging Face transformers library:

from transformers import AutoTokenizer, AutoModel

# Download the tokenizer and model weights from the Hugging Face Hub
DarijaBERT_tokenizer = AutoTokenizer.from_pretrained("SI2M-Lab/DarijaBERT")
DarijaBert_model = AutoModel.from_pretrained("SI2M-Lab/DarijaBERT")
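Since DarijaBERT was trained with the masked-language-modeling objective, it can also be queried through the fill-mask pipeline. The sketch below is illustrative and not from the model card; the example sentence is an assumption, and BERT-style models expect the [MASK] placeholder token.

```python
# Sketch: masked-word prediction with DarijaBERT via the fill-mask pipeline.
# The input sentence is a hypothetical example, not from the model card.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="SI2M-Lab/DarijaBERT")

# BERT-style models use [MASK] as the placeholder to be predicted.
predictions = fill_mask("العاصمة د المغرب هي [MASK]")

# Each prediction is a dict with the candidate token and its score.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each entry in `predictions` contains the filled-in token (`token_str`), its probability (`score`), and the completed sequence.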

Citation

If you use our models for your scientific publication, or if you find the resources in this repository useful, please cite our paper as follows (to be updated):

@article{gaanoun2023darijabert,
  title={DarijaBERT: A Step Forward in NLP for the Written Moroccan Dialect},
  author={Gaanoun, Kamel and Naira, Abdou Mohamed and Allak, Anass and Benelallam, Imade},
  year={2023}
}

Acknowledgments

We gratefully acknowledge Google’s TensorFlow Research Cloud (TRC) program for providing us with free Cloud TPUs.

Model size: 209M parameters (Safetensors; tensor types I64 and F32)
