Translation
Transformers
PyTorch
nllb-moe
feature-extraction
pinzhenchen committed on
Commit
4ba95a1
1 Parent(s): 83c96e4

Change the model in the text description to match the code and model card.


"facebook/nllb-200-distilled-600M" changed to "facebook/nllb-moe-54b"

Files changed (1)
  1. README.md +1 -1
README.md CHANGED

```diff
@@ -235,7 +235,7 @@ Safiyyah Saleem, Holger Schwenk, and Jeff Wang.
 The avalable checkpoints requires around 350GB of storage. Make sure to use `accelerate` if you do not have enough RAM on your machine.
 
 While generating the target text set the `forced_bos_token_id` to the target language id. The following
-example shows how to translate English to French using the *facebook/nllb-200-distilled-600M* model.
+example shows how to translate English to French using the *facebook/nllb-moe-54b* model.
 
 Note that we're using the BCP-47 code for French `fra_Latn`. See [here](https://github.com/facebookresearch/flores/blob/main/flores200/README.md#languages-in-flores-200)
 for the list of all BCP-47 in the Flores 200 dataset.
```
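For context, the usage the changed line refers to can be sketched as follows. This is a minimal sketch based on the Transformers NLLB API, assuming the `facebook/nllb-moe-54b` checkpoint is available locally (it needs roughly 350GB of storage, as the README notes); the same code works with a smaller checkpoint such as `facebook/nllb-200-distilled-600M`.

```python
# Sketch: English-to-French translation with an NLLB checkpoint.
# The facebook/nllb-moe-54b checkpoint is very large; smaller NLLB
# checkpoints use the identical API.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "facebook/nllb-moe-54b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

inputs = tokenizer("Hello, how are you?", return_tensors="pt")

# forced_bos_token_id selects the target language; "fra_Latn" is the
# BCP-47 code for French in the Flores 200 language list.
translated_tokens = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),
)
print(tokenizer.batch_decode(translated_tokens, skip_special_tokens=True)[0])
```

Note that the language code is looked up with `convert_tokens_to_ids`, since NLLB tokenizers register the language codes as special tokens; older examples used the since-removed `lang_code_to_id` mapping.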