sbmaruf committed
Commit 3d14a58
Parents: e854b6d, b4d5d3e

Merge branch 'main' of https://huggingface.co/flax-community/bengali-t5-base into main

Files changed (1):
  1. README.md (+3, -2)
README.md CHANGED
@@ -22,10 +22,11 @@ The model is trained on around ~11B tokens (64 size batch, 512 tokens, 350k step
 ## load model
 
 ```
-config = T5Config.from_pretrained("flax-community/bengali-t5-base")
-model = FlaxT5ForConditionalGeneration.from_pretrained("flax-community/bengali-t5-base", config=config)
+>>> config = T5Config.from_pretrained("flax-community/bengali-t5-base")
+>>> model = FlaxT5ForConditionalGeneration.from_pretrained("flax-community/bengali-t5-base", config=config)
 ```
 
+The model is trained on a `de-noising` objective using the script [here](https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py). Currently, this model doesn't have any generation capability. If you want this model to have generation capability, please finetune it on the `prefix-LM` objective mentioned in the [paper](https://arxiv.org/abs/1910.10683).
 Please note that we haven't finetuned the model on any downstream task. If you are finetuning the model on any downstream task, please let us know about it. Shoot us an email (sbmaruf at gmail dot com).
 
 ## Proposal
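
Put together, the snippet this commit updates loads roughly as follows. This is a minimal sketch, not part of the card itself: it assumes the repo also hosts the tokenizer saved by `run_t5_mlm_flax.py`, and the `AutoTokenizer` call and the Bengali sample sentence are illustrative placeholders.

```
# A minimal sketch; assumes the repo also hosts the tokenizer saved by
# run_t5_mlm_flax.py. The Bengali sentence is a placeholder.
import numpy as np
from transformers import AutoTokenizer, FlaxT5ForConditionalGeneration, T5Config

config = T5Config.from_pretrained("flax-community/bengali-t5-base")
model = FlaxT5ForConditionalGeneration.from_pretrained(
    "flax-community/bengali-t5-base", config=config
)
tokenizer = AutoTokenizer.from_pretrained("flax-community/bengali-t5-base")

# The checkpoint was pre-trained on span corruption (de-noising) only, so we
# run a single forward pass over a sentinel-masked input instead of calling
# generate().
inputs = tokenizer("আমি বাংলায় <extra_id_0> গাই", return_tensors="np")
decoder_input_ids = np.array([[config.decoder_start_token_id]], dtype=np.int32)
outputs = model(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    decoder_input_ids=decoder_input_ids,
)
print(outputs.logits.shape)  # (1, 1, vocab_size): scores for the first target token
```

Since the checkpoint has only seen the span-corruption objective, a plain forward pass over sentinel-masked text is the appropriate smoke test; `generate()` output won't be useful until the `prefix-LM` finetuning mentioned above is done.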