pijarcandra22/BartIndo2Bali

This model is a fine-tuned version of facebook/bart-base on an unknown dataset. At the final training epoch it achieves the following results:

  • Train Loss: 0.1151
  • Validation Loss: 2.6202
  • Epoch: 99

Model description

More information needed

Intended uses & limitations

More information needed
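
Although the card does not yet document intended uses, the model name suggests Indonesian-to-Balinese text generation with a BART sequence-to-sequence checkpoint. Below is a minimal inference sketch, assuming the repository hosts TensorFlow weights (the model was trained with TensorFlow 2.14) and that the Transformers auto classes can resolve it:

```python
# Minimal sketch (assumptions: TF weights are available and the task is
# Indonesian -> Balinese translation, inferred from the model name).
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "pijarcandra22/BartIndo2Bali"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Selamat pagi"  # hypothetical Indonesian input
inputs = tokenizer(text, return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```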

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (the optimizer configuration is reproduced in the code sketch after this list):

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
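
For reference, the optimizer settings above map onto the AdamWeightDecay class that Transformers provides for TensorFlow training. A minimal sketch, using only the values listed in this card:

```python
# Sketch of the optimizer configuration listed above. Only the parameters
# stated in this card are set; everything else is left at its default.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```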

Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 4.3767 | 3.6194 | 0 |
| 3.5364 | 3.1996 | 1 |
| 3.1525 | 2.9458 | 2 |
| 2.8777 | 2.8118 | 3 |
| 2.6993 | 2.6979 | 4 |
| 2.5550 | 2.6071 | 5 |
| 2.4536 | 2.5362 | 6 |
| 2.3338 | 2.4572 | 7 |
| 2.2394 | 2.3878 | 8 |
| 2.1466 | 2.3692 | 9 |
| 2.0795 | 2.3189 | 10 |
| 2.0061 | 2.2674 | 11 |
| 1.9321 | 2.2393 | 12 |
| 1.8837 | 2.2181 | 13 |
| 1.8224 | 2.2002 | 14 |
| 1.7626 | 2.1671 | 15 |
| 1.7251 | 2.1386 | 16 |
| 1.6624 | 2.1245 | 17 |
| 1.6191 | 2.1134 | 18 |
| 1.6177 | 2.1061 | 19 |
| 1.5524 | 2.0845 | 20 |
| 1.4965 | 2.0750 | 21 |
| 1.4618 | 2.0527 | 22 |
| 1.4188 | 2.0584 | 23 |
| 1.3774 | 2.0359 | 24 |
| 1.3469 | 2.0567 | 25 |
| 1.3113 | 2.0295 | 26 |
| 1.2791 | 2.0134 | 27 |
| 1.2436 | 2.0431 | 28 |
| 1.1915 | 2.0201 | 29 |
| 1.1815 | 2.0283 | 30 |
| 1.1314 | 2.0230 | 31 |
| 1.1071 | 2.0424 | 32 |
| 1.0781 | 2.0357 | 33 |
| 1.0429 | 2.0208 | 34 |
| 1.0134 | 2.0458 | 35 |
| 0.9799 | 2.0466 | 36 |
| 0.9567 | 2.0592 | 37 |
| 0.9261 | 2.0278 | 38 |
| 0.8931 | 2.0641 | 39 |
| 0.8742 | 2.0783 | 40 |
| 0.8397 | 2.0781 | 41 |
| 0.8228 | 2.1010 | 42 |
| 0.7819 | 2.1042 | 43 |
| 0.7667 | 2.1302 | 44 |
| 0.7508 | 2.1193 | 45 |
| 0.7136 | 2.1372 | 46 |
| 0.6849 | 2.1513 | 47 |
| 0.6625 | 2.1747 | 48 |
| 0.6451 | 2.1936 | 49 |
| 0.6114 | 2.1650 | 50 |
| 0.5907 | 2.2176 | 51 |
| 0.5781 | 2.2313 | 52 |
| 0.5594 | 2.2287 | 53 |
| 0.5361 | 2.2260 | 54 |
| 0.5168 | 2.2444 | 55 |
| 0.5022 | 2.2660 | 56 |
| 0.4826 | 2.2912 | 57 |
| 0.4607 | 2.2922 | 58 |
| 0.4442 | 2.2912 | 59 |
| 0.4262 | 2.3032 | 60 |
| 0.4050 | 2.3335 | 61 |
| 0.4005 | 2.3327 | 62 |
| 0.3826 | 2.3379 | 63 |
| 0.3658 | 2.3369 | 64 |
| 0.3442 | 2.3629 | 65 |
| 0.3384 | 2.3887 | 66 |
| 0.3287 | 2.3868 | 67 |
| 0.3140 | 2.3609 | 68 |
| 0.3078 | 2.4009 | 69 |
| 0.2953 | 2.4071 | 70 |
| 0.2855 | 2.4421 | 71 |
| 0.2715 | 2.4290 | 72 |
| 0.2647 | 2.4227 | 73 |
| 0.2483 | 2.4457 | 74 |
| 0.2402 | 2.4582 | 75 |
| 0.2355 | 2.4509 | 76 |
| 0.2272 | 2.4788 | 77 |
| 0.2198 | 2.4795 | 78 |
| 0.2077 | 2.4786 | 79 |
| 0.1989 | 2.5080 | 80 |
| 0.1992 | 2.4929 | 81 |
| 0.1905 | 2.5120 | 82 |
| 0.1880 | 2.5345 | 83 |
| 0.1773 | 2.5147 | 84 |
| 0.1734 | 2.5270 | 85 |
| 0.1663 | 2.5399 | 86 |
| 0.1618 | 2.5581 | 87 |
| 0.1576 | 2.5533 | 88 |
| 0.1550 | 2.5177 | 89 |
| 0.1475 | 2.5689 | 90 |
| 0.1453 | 2.5720 | 91 |
| 0.1398 | 2.5526 | 92 |
| 0.1357 | 2.5638 | 93 |
| 0.1325 | 2.5782 | 94 |
| 0.1293 | 2.6026 | 95 |
| 0.1263 | 2.6147 | 96 |
| 0.1257 | 2.6056 | 97 |
| 0.1149 | 2.6323 | 98 |
| 0.1151 | 2.6202 | 99 |
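
Validation loss reaches its minimum (about 2.01) around epoch 27 and climbs steadily afterwards while training loss keeps falling, so the final epoch reported above is well past the best-generalizing checkpoint. If retraining, a standard Keras early-stopping callback (not something documented in this card, just a common option) would retain that checkpoint automatically; a minimal sketch:

```python
# Hypothetical sketch: stop once validation loss stops improving and restore
# the best weights, which per the table above would be around epoch 27.
import tensorflow as tf

callbacks = [
    tf.keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=5, restore_best_weights=True
    ),
]
# model.fit(train_dataset, validation_data=val_dataset,
#           epochs=100, callbacks=callbacks)
```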

Framework versions

  • Transformers 4.35.2
  • TensorFlow 2.14.0
  • Datasets 2.15.0
  • Tokenizers 0.15.0