cecilemacaire committed dc9c459 (Parent: 330102a)

Update README.md
The model was evaluated with BLEU by comparing the reference pictogram translation against the model hypothesis.

```bash
fairseq-generate exp_commonvoice/data-bin/commonvoice.tokenized.fr-frp \
    --path exp_commonvoice/checkpoints/nmt_fr_frp_commonvoice/checkpoint.best_bleu_86.0600.pt \
    --batch-size 128 --beam 5 --remove-bpe > gen_cv.out
```

The output file prints the following information:

```txt
S-2724	la planète terre
T-2724	le planète_terre
H-2724	-0.08702446520328522	le planète_terre
D-2724	-0.08702446520328522	le planète_terre
P-2724	-0.1058 -0.0340 -0.1213
Generate test with beam=5: BLEU4 = 82.60, 92.5/85.5/79.5/74.1 (BP=1.000, ratio=1.027, syslen=138507, reflen=134811)
```
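The hypotheses (`H` lines) and references (`T` lines) can be pulled out of `gen_cv.out` with the usual fairseq `grep`/`cut` recipe, for example to re-score them with an external tool. A minimal sketch, not part of the original pipeline: the `printf` line simply recreates a tiny sample of the tab-separated format shown above, and the file names `hyp.txt`/`ref.txt` are placeholders.

```bash
# Recreate a small sample of the fairseq-generate output format (tab-separated)
printf 'S-2724\tla planète terre\nT-2724\tle planète_terre\nH-2724\t-0.087\tle planète_terre\n' > gen_cv.out

# H lines hold "id<TAB>score<TAB>hypothesis"; T lines hold "id<TAB>reference".
# cut splits on tabs by default, so -f3 / -f2 keep only the text field.
grep ^H gen_cv.out | cut -f3 > hyp.txt
grep ^T gen_cv.out | cut -f2 > ref.txt
```

The resulting plain-text files can then be fed to a scorer such as sacreBLEU.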
### Results

Comparison to other translation models: