---
license: apache-2.0
base_model: Helsinki-NLP/opus-mt-lg-en
tags:
- generated_from_trainer
metrics:
- bleu
model-index:
- name: Helsinki_lg_inf_en
  results: []
---


[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/hnamuwaya-makerere-university-business-school/Helsinki_lg_inf_en/runs/9s0x0mb9)
# Helsinki_lg_inf_en

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-lg-en](https://huggingface.co/Helsinki-NLP/opus-mt-lg-en) on an unspecified dataset.
It achieves the following results on the evaluation set (a BLEU scoring sketch follows the list):
- Loss: 0.1764
- Bleu: 22.8149
- Gen Len: 17.7776
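
The BLEU value above presumably comes from the usual SacreBLEU metric used with the Seq2Seq `Trainer`; below is a minimal scoring sketch with the `evaluate` library, where the example strings are placeholders rather than outputs of this model:

```python
import evaluate

# SacreBLEU via the `evaluate` library; assumed to match the metric behind the numbers above.
bleu = evaluate.load("sacrebleu")

predictions = ["the children are going to school"]         # decoded model outputs (placeholders)
references = [["the children are going to school today"]]  # one list of reference strings per prediction

result = bleu.compute(predictions=predictions, references=references)
print(round(result["score"], 4))  # corpus-level BLEU on the same 0-100 scale as reported above
```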

## Model description

This is a MarianMT sequence-to-sequence model fine-tuned from [Helsinki-NLP/opus-mt-lg-en](https://huggingface.co/Helsinki-NLP/opus-mt-lg-en) for Luganda-to-English translation; further details have not been provided.

## Intended uses & limitations

More information needed
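
No usage details are documented; the snippet below is a minimal inference sketch. The repository id `hnamuwaya/Helsinki_lg_inf_en` is a placeholder for wherever this checkpoint is hosted, and the example sentence is illustrative only.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repository id -- replace with the actual Hub path of this checkpoint.
model_id = "hnamuwaya/Helsinki_lg_inf_en"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate a Luganda sentence into English (beam search, as is typical for MarianMT).
inputs = tokenizer("Oli otya?", return_tensors="pt")
output_ids = model.generate(**inputs, num_beams=4, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```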

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
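
A sketch of how these values map onto `Seq2SeqTrainingArguments`; the `output_dir`, evaluation cadence, and `predict_with_generate` settings are assumptions not stated in the card:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="Helsinki_lg_inf_en",   # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                    # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",             # assumption: per-epoch evaluation, matching the results table
    predict_with_generate=True,        # assumption: generation needed to compute BLEU / Gen Len
)
```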

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu    | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| No log        | 1.0   | 153  | 0.5662          | 0.6031  | 20.9541 |
| No log        | 2.0   | 306  | 0.5118          | 0.8533  | 20.1433 |
| No log        | 3.0   | 459  | 0.4754          | 1.1179  | 19.9124 |
| 0.6777        | 4.0   | 612  | 0.4452          | 1.4213  | 20.2326 |
| 0.6777        | 5.0   | 765  | 0.4181          | 1.7245  | 19.2424 |
| 0.6777        | 6.0   | 918  | 0.3940          | 2.0655  | 19.5872 |
| 0.463         | 7.0   | 1071 | 0.3722          | 2.6043  | 19.2969 |
| 0.463         | 8.0   | 1224 | 0.3512          | 3.4014  | 18.864  |
| 0.463         | 9.0   | 1377 | 0.3323          | 4.0558  | 19.0541 |
| 0.3973        | 10.0  | 1530 | 0.3150          | 4.9264  | 18.878  |
| 0.3973        | 11.0  | 1683 | 0.2989          | 6.1751  | 18.1102 |
| 0.3973        | 12.0  | 1836 | 0.2845          | 6.909   | 18.405  |
| 0.3973        | 13.0  | 1989 | 0.2708          | 8.2081  | 18.1388 |
| 0.3476        | 14.0  | 2142 | 0.2589          | 9.0267  | 18.1527 |
| 0.3476        | 15.0  | 2295 | 0.2477          | 9.8007  | 18.1826 |
| 0.3476        | 16.0  | 2448 | 0.2374          | 11.2825 | 17.9705 |
| 0.309         | 17.0  | 2601 | 0.2282          | 12.38   | 17.9427 |
| 0.309         | 18.0  | 2754 | 0.2200          | 13.1971 | 18.2629 |
| 0.309         | 19.0  | 2907 | 0.2127          | 14.6993 | 18.0356 |
| 0.278         | 20.0  | 3060 | 0.2058          | 15.8696 | 17.7944 |
| 0.278         | 21.0  | 3213 | 0.2001          | 17.2214 | 17.656  |
| 0.278         | 22.0  | 3366 | 0.1951          | 18.3989 | 17.6769 |
| 0.2597        | 23.0  | 3519 | 0.1906          | 19.6026 | 17.7543 |
| 0.2597        | 24.0  | 3672 | 0.1869          | 20.6405 | 17.817  |
| 0.2597        | 25.0  | 3825 | 0.1835          | 20.7913 | 17.7273 |
| 0.2597        | 26.0  | 3978 | 0.1809          | 21.5904 | 17.7518 |
| 0.2452        | 27.0  | 4131 | 0.1789          | 21.9249 | 17.69   |
| 0.2452        | 28.0  | 4284 | 0.1775          | 22.3964 | 17.6953 |
| 0.2452        | 29.0  | 4437 | 0.1767          | 22.6803 | 17.7547 |
| 0.2379        | 30.0  | 4590 | 0.1764          | 22.8149 | 17.7776 |


### Framework versions

- Transformers 4.42.3
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1