
M2M100 418M

M2M100 is a multilingual encoder-decoder transformer model trained for many-to-many multilingual translation. Originally introduced by researchers at Facebook, the model achieves strong performance across a wide range of translation directions. For more details on M2M100, see the paper and the associated repository.
To adapt M2M100 to English-to-Arabic translation, we fine-tuned the model on English-Arabic parallel text. The fine-tuning process trained the model for 1000K steps with a batch size of 8.
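
For reference, here is a minimal inference sketch using the standard M2M100 API from the transformers library (the example sentence and variable names are illustrative, not part of the original card):

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model_id = "khalidalt/m2m100_418M-finetuned-en-to-ar"
model = M2M100ForConditionalGeneration.from_pretrained(model_id)
tokenizer = M2M100Tokenizer.from_pretrained(model_id)

# M2M100 requires the source language to be set on the tokenizer, and the
# target language to be forced as the first token the decoder generates.
tokenizer.src_lang = "en"
inputs = tokenizer(
    "Machine translation has improved rapidly in recent years.",  # illustrative input
    return_tensors="pt",
)
generated = model.generate(**inputs, forced_bos_token_id=tokenizer.get_lang_id("ar"))
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

The forced_bos_token_id argument is what selects Arabic as the output language; without it, the model may simply continue in the source language.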
