Alpaca LoRA MT
This model is a LoRA fine-tune of decapoda-research/llama-30b-hf on the HiTZ/alpaca_mt dataset, using the ['en', 'pt', 'es', 'ca', 'eu', 'gl', 'at'] language configurations. Validation loss during training is reported in the Training results table below, reaching 0.9088 after 2700 steps.
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
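The fine-tuning data can be inspected with the datasets library. A minimal sketch, assuming HiTZ/alpaca_mt exposes one configuration per language code listed above (the configuration names are an assumption; check the dataset card for the exact names):

```python
# Sketch: load one language configuration of the HiTZ/alpaca_mt dataset.
# The config name "eu" (Basque) is assumed from the language list above;
# verify the exact configuration names on the dataset card.
from datasets import load_dataset

dataset = load_dataset("HiTZ/alpaca_mt", "eu")
print(dataset)               # splits and record counts
print(dataset["train"][0])   # a single instruction-tuning record
```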
Training hyperparameters

More information needed

Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
1.1695 | 0.04 | 100 | 1.1716 |
1.1211 | 0.07 | 200 | 1.0964 |
1.0591 | 0.11 | 300 | 1.0590 |
1.0234 | 0.14 | 400 | 1.0341 |
1.0345 | 0.18 | 500 | 1.0165 |
0.9932 | 0.22 | 600 | 1.0024 |
0.9948 | 0.25 | 700 | 0.9895 |
1.01 | 0.29 | 800 | 0.9794 |
0.9488 | 0.32 | 900 | 0.9708 |
0.9518 | 0.36 | 1000 | 0.9627 |
0.9463 | 0.4 | 1100 | 0.9557 |
0.956 | 0.43 | 1200 | 0.9498 |
0.9521 | 0.47 | 1300 | 0.9437 |
0.9345 | 0.51 | 1400 | 0.9385 |
0.9469 | 0.54 | 1500 | 0.9337 |
0.9466 | 0.58 | 1600 | 0.9297 |
0.9403 | 0.61 | 1700 | 0.9257 |
0.9179 | 0.65 | 1800 | 0.9219 |
0.9468 | 0.69 | 1900 | 0.9190 |
0.9173 | 0.72 | 2000 | 0.9163 |
0.9172 | 0.76 | 2100 | 0.9142 |
0.9351 | 0.79 | 2200 | 0.9124 |
0.9238 | 0.83 | 2300 | 0.9110 |
0.9057 | 0.87 | 2400 | 0.9099 |
0.9309 | 0.9 | 2500 | 0.9093 |
0.8893 | 0.94 | 2600 | 0.9090 |
0.9095 | 0.97 | 2700 | 0.9088 |
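Validation loss falls steadily from 1.1716 at step 100 to 0.9088 at step 2700, i.e. over just under one epoch of training.

Since the checkpoint is a LoRA adapter, inference requires loading it on top of the base model. Below is a minimal sketch using transformers and peft; the adapter repo ID is a placeholder, not confirmed by this card, so substitute the model's actual Hub ID:

```python
# Sketch: run the LoRA adapter on top of the base LLaMA-30B model.
# "HiTZ/alpaca-lora-mt-30b" is a placeholder adapter ID; replace it
# with this model's actual Hugging Face Hub repo ID.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

base_id = "decapoda-research/llama-30b-hf"
adapter_id = "HiTZ/alpaca-lora-mt-30b"  # placeholder, see note above

tokenizer = LlamaTokenizer.from_pretrained(base_id)
model = LlamaForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()

# Plain instruction prompt; the full Alpaca prompt template is omitted for brevity.
prompt = "Translate the following sentence into Galician: The library opens at nine."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```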