Labira/LabiraPJOK_1_500

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Train Loss: 0.0001
  • Validation Loss: 8.9416
  • Epoch: 492
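The card does not yet state the downstream task or the dataset, so the following is only a minimal loading sketch using the Transformers TensorFlow API. Loading the bare encoder (without a task head) and the Indonesian example sentence are assumptions for illustration, not something documented by this card.

```python
# Minimal sketch: load the checkpoint as a bare TF encoder.
# The actual task head is not documented here, so only hidden states are shown.
from transformers import AutoTokenizer, TFAutoModel

model_id = "Labira/LabiraPJOK_1_500"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModel.from_pretrained(model_id)

inputs = tokenizer("Contoh kalimat dalam bahasa Indonesia.", return_tensors="tf")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)
```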

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Keras sketch re-expressing this configuration follows the list):

  • optimizer: Adam (beta_1: 0.9, beta_2: 0.999, epsilon: 1e-08, amsgrad: False, weight_decay: None, clipnorm: None, global_clipnorm: None, clipvalue: None, use_ema: False, ema_momentum: 0.99, ema_overwrite_frequency: None, jit_compile: True, is_legacy_optimizer: False)
  • learning_rate: PolynomialDecay schedule (initial_learning_rate: 2e-05, decay_steps: 1500, end_learning_rate: 0.0, power: 1.0, cycle: False)
  • training_precision: float32
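The optimizer configuration above can be re-expressed with the Keras API as a short sketch; this only restates the listed values (TensorFlow / Keras, per the framework versions below) and is not the original training script.

```python
# Sketch of the listed optimizer: Adam with a linear (power=1.0) PolynomialDecay
# schedule from 2e-05 down to 0.0 over 1500 steps.
import tensorflow as tf

lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=1500,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```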

Training results

Train Loss Validation Loss Epoch
0.0054 8.3302 0
0.0108 7.8442 1
0.0114 7.0958 2
0.0284 6.6490 3
0.0179 7.3034 4
0.0044 8.1785 5
0.0070 8.4039 6
0.0038 8.2728 7
0.0028 8.1154 8
0.0140 8.1207 9
0.0160 8.1384 10
0.0029 8.2978 11
0.0112 8.6940 12
0.0100 8.7433 13
0.0062 8.6486 14
0.0059 8.4821 15
0.0055 8.4559 16
0.0039 8.5136 17
0.0044 8.2783 18
0.0016 8.0974 19
0.0094 7.9739 20
0.0020 8.2513 21
0.0008 8.4637 22
0.0039 8.2813 23
0.0017 8.2027 24
0.0018 8.2722 25
0.0015 8.3875 26
0.0013 8.4975 27
0.0013 8.6171 28
0.0009 8.7272 29
0.0010 8.8335 30
0.0007 8.9168 31
0.0007 8.9992 32
0.0006 9.0661 33
0.0007 9.1103 34
0.0004 9.1424 35
0.0008 9.1573 36
0.0006 9.1666 37
0.0008 9.1732 38
0.0004 9.1781 39
0.0006 9.1867 40
0.0005 9.1986 41
0.0005 9.2203 42
0.0005 9.2512 43
0.0006 9.2889 44
0.0005 9.3360 45
0.0007 9.3759 46
0.0004 9.4144 47
0.0006 9.4461 48
0.0004 9.4718 49
0.0005 9.5113 50
0.0004 9.5425 51
0.0003 9.5667 52
0.0015 9.5468 53
0.0003 9.4515 54
0.0005 9.3881 55
0.0006 9.3797 56
0.0006 9.3887 57
0.0003 9.4038 58
0.0004 9.4206 59
0.0003 9.4417 60
0.0003 9.4627 61
0.0003 9.4775 62
0.0004 9.4930 63
0.0009 9.5593 64
0.0003 9.6068 65
0.0003 9.6416 66
0.0003 9.6715 67
0.0003 9.6956 68
0.0004 9.7146 69
0.0010 9.7344 70
0.0002 9.7946 71
0.0003 9.7965 72
0.0034 9.7113 73
0.0004 9.5730 74
0.0005 9.4858 75
0.0009 9.5826 76
0.0006 9.6923 77
0.0005 9.8243 78
0.0005 9.9368 79
0.0007 10.0514 80
0.0006 10.1386 81
0.0010 10.1427 82
0.0005 9.9261 83
0.0011 9.8122 84
0.0003 9.8724 85
0.0081 9.5494 86
0.0151 8.3043 87
0.0425 9.1449 88
0.0076 8.8560 89
0.0113 8.2403 90
0.0446 7.5457 91
0.0264 7.4204 92
0.1545 8.0820 93
0.3878 8.2238 94
0.4155 6.1718 95
0.0410 5.0625 96
0.0768 4.8214 97
0.0514 4.8477 98
0.0150 5.2002 99
0.0328 5.6224 100
0.0260 5.9887 101
0.0040 6.2793 102
0.0076 6.3696 103
0.0013 6.3642 104
0.0075 6.4379 105
0.0015 6.6379 106
0.0010 6.7736 107
0.0023 6.8582 108
0.0056 6.8884 109
0.0011 6.9125 110
0.0014 6.9437 111
0.0014 6.9807 112
0.0010 7.0239 113
0.0006 7.0602 114
0.0006 7.0919 115
0.0005 7.1213 116
0.0008 7.1457 117
0.0006 7.1679 118
0.0009 7.1871 119
0.0288 7.3166 120
0.0007 7.1397 121
0.0033 6.9025 122
0.0020 6.8509 123
0.0068 6.9533 124
0.0066 7.2446 125
0.0035 7.5351 126
0.0019 7.7354 127
0.0021 7.8376 128
0.0007 7.9071 129
0.0012 7.9566 130
0.0009 8.0014 131
0.0013 8.0186 132
0.0015 8.0123 133
0.0009 7.9870 134
0.0008 7.9685 135
0.0005 7.9599 136
0.0005 7.9553 137
0.0005 7.9574 138
0.0005 7.9631 139
0.0010 7.9780 140
0.0006 7.9910 141
0.0006 8.0078 142
0.0004 8.0283 143
0.0006 8.0500 144
0.0005 8.0704 145
0.0008 8.0899 146
0.0003 8.1078 147
0.0003 8.1243 148
0.0005 8.1384 149
0.0005 8.1534 150
0.0003 8.1678 151
0.0003 8.1827 152
0.0002 8.1955 153
0.0004 8.2093 154
0.0003 8.2218 155
0.0003 8.2338 156
0.0003 8.2454 157
0.0003 8.2566 158
0.0004 8.2696 159
0.0006 8.2696 160
0.0003 8.2700 161
0.0003 8.2745 162
0.0004 8.2834 163
0.0004 8.2918 164
0.0003 8.3035 165
0.0004 8.3182 166
0.0005 8.3357 167
0.0003 8.3499 168
0.0002 8.3616 169
0.0005 8.3759 170
0.0003 8.3901 171
0.0002 8.4020 172
0.0004 8.4105 173
0.0004 8.4120 174
0.0005 8.4166 175
0.0003 8.4209 176
0.0003 8.4287 177
0.0011 8.4219 178
0.0005 8.3854 179
0.0003 8.3589 180
0.0003 8.3630 181
0.0002 8.3680 182
0.0003 8.3735 183
0.0003 8.3812 184
0.0003 8.3882 185
0.0003 8.3937 186
0.0002 8.3989 187
0.0003 8.4022 188
0.0003 8.4048 189
0.0003 8.4102 190
0.0004 8.4197 191
0.0003 8.4328 192
0.0004 8.4468 193
0.0002 8.4609 194
0.0011 8.4712 195
0.0003 8.4735 196
0.0002 8.4789 197
0.0007 8.4928 198
0.0002 8.5066 199
0.0003 8.5205 200
0.0003 8.5368 201
0.0003 8.5531 202
0.0002 8.5676 203
0.0002 8.5815 204
0.0003 8.5989 205
0.0003 8.6161 206
0.0001 8.6305 207
0.0003 8.6473 208
0.0003 8.6626 209
0.0003 8.6764 210
0.0002 8.6899 211
0.0002 8.7019 212
0.0002 8.7119 213
0.0002 8.7212 214
0.0002 8.7302 215
0.0004 8.7417 216
0.0002 8.7514 217
0.0002 8.7593 218
0.0003 8.7690 219
0.0002 8.7771 220
0.0002 8.7845 221
0.0001 8.7917 222
0.0002 8.7980 223
0.0002 8.8040 224
0.0003 8.8093 225
0.0003 8.8150 226
0.0002 8.8209 227
0.0003 8.8271 228
0.0002 8.8329 229
0.0002 8.8378 230
0.0002 8.8429 231
0.0003 8.8493 232
0.0003 8.8575 233
0.0004 8.8548 234
0.0003 8.8510 235
0.0001 8.8501 236
0.0003 8.8473 237
0.0003 8.8561 238
0.0003 8.8667 239
0.0002 8.8767 240
0.0001 8.8847 241
0.0002 8.8915 242
0.0002 8.8980 243
0.0002 8.9038 244
0.0002 8.9119 245
0.0002 8.9202 246
0.0002 8.9279 247
0.0002 8.9345 248
0.0001 8.9400 249
0.0001 8.9441 250
0.0002 8.9473 251
0.0001 8.9504 252
0.0002 8.9537 253
0.0001 8.9574 254
0.0002 8.9610 255
0.0002 8.9650 256
0.0002 8.9691 257
0.0001 8.9724 258
0.0002 8.9753 259
0.0002 8.9769 260
0.0001 8.9793 261
0.0002 8.9827 262
0.0003 8.9841 263
0.0001 8.9854 264
0.0001 8.9873 265
0.0010 8.9858 266
0.0006 8.9368 267
0.0002 8.9002 268
0.0003 8.8744 269
0.0002 8.8568 270
0.0001 8.8456 271
0.0002 8.8382 272
0.0001 8.8340 273
0.0001 8.8317 274
0.0002 8.8310 275
0.0001 8.8316 276
0.0002 8.8335 277
0.0002 8.8372 278
0.0003 8.8456 279
0.0008 8.9045 280
0.0002 8.9467 281
0.0002 8.9773 282
0.0002 9.0012 283
0.0001 9.0195 284
0.0002 9.0337 285
0.0002 9.0464 286
0.0001 9.0564 287
0.0003 9.0657 288
0.0001 9.0737 289
0.0002 9.0798 290
0.0001 9.0844 291
0.0002 9.0884 292
0.0002 9.0919 293
0.0006 9.0998 294
0.0001 9.1140 295
0.0002 9.1256 296
0.0001 9.1349 297
0.0001 9.1424 298
0.0001 9.1486 299
0.0001 9.1534 300
0.0001 9.1576 301
0.0001 9.1614 302
0.0001 9.1649 303
0.0001 9.1676 304
0.0001 9.1701 305
0.0001 9.1720 306
0.0002 9.1744 307
0.0002 9.1769 308
0.0002 9.1787 309
0.0002 9.1789 310
0.0001 9.1801 311
0.0001 9.1813 312
0.0001 9.1832 313
0.0001 9.1858 314
0.0001 9.1880 315
0.0001 9.1901 316
0.0001 9.1921 317
0.0001 9.1948 318
0.0001 9.1989 319
0.0001 9.2026 320
0.0002 9.2092 321
0.0001 9.2147 322
0.0001 9.2194 323
0.0002 9.2234 324
0.0001 9.2268 325
0.0001 9.2295 326
0.0002 9.2321 327
0.0001 9.2347 328
0.0002 9.2369 329
0.0002 9.2350 330
0.0001 9.2327 331
0.0001 9.2315 332
0.0002 9.2312 333
0.0001 9.2322 334
0.0002 9.2349 335
0.0001 9.2378 336
0.0003 9.2406 337
0.0001 9.2434 338
0.0001 9.2461 339
0.0001 9.2483 340
0.0001 9.2502 341
0.0001 9.2519 342
0.0002 9.2568 343
0.0001 9.2606 344
0.0001 9.2637 345
0.0001 9.2661 346
0.0001 9.2682 347
0.0001 9.2703 348
0.0001 9.2723 349
0.0001 9.2742 350
0.0001 9.2758 351
0.0001 9.2770 352
0.0001 9.2782 353
0.0002 9.2796 354
0.0002 9.2826 355
0.0001 9.2851 356
0.0001 9.2871 357
0.0002 9.2901 358
0.0002 9.2941 359
0.0001 9.2968 360
0.0002 9.2944 361
0.0001 9.2931 362
0.0002 9.2926 363
0.0001 9.2924 364
0.0001 9.2928 365
0.0001 9.2933 366
0.0001 9.2945 367
0.0001 9.2985 368
0.0001 9.3021 369
0.0002 9.3048 370
0.0001 9.3067 371
0.0001 9.3083 372
0.0001 9.3100 373
0.0001 9.3117 374
0.0001 9.3135 375
0.0001 9.3154 376
0.0001 9.3169 377
0.0001 9.3184 378
0.0001 9.3200 379
0.0001 9.3215 380
0.0001 9.3230 381
0.0001 9.3238 382
0.0001 9.3252 383
0.0001 9.3261 384
0.0002 9.3277 385
0.0002 9.3294 386
0.0001 9.3312 387
0.0001 9.3329 388
0.0001 9.3345 389
0.0001 9.3359 390
0.0001 9.3372 391
0.0001 9.3384 392
0.0001 9.3400 393
0.0001 9.3413 394
0.0001 9.3425 395
0.0002 9.3422 396
0.0002 9.3412 397
0.0001 9.3394 398
0.0001 9.3381 399
0.0001 9.3372 400
0.0001 9.3368 401
0.0002 9.3364 402
0.0001 9.3356 403
0.0001 9.3350 404
0.0001 9.3347 405
0.0001 9.3348 406
0.0001 9.3349 407
0.0001 9.3353 408
0.0002 9.3362 409
0.0001 9.3371 410
0.0001 9.3379 411
0.0001 9.3389 412
0.0001 9.3399 413
0.0001 9.3408 414
0.0001 9.3416 415
0.0001 9.3423 416
0.0001 9.3430 417
0.0001 9.3440 418
0.0001 9.3450 419
0.0001 9.3459 420
0.0001 9.3470 421
0.0001 9.3480 422
0.0001 9.3491 423
0.0001 9.3502 424
0.0001 9.3509 425
0.0001 9.3517 426
0.0001 9.3526 427
0.0001 9.3537 428
0.0001 9.3550 429
0.0001 9.3560 430
0.0001 9.3571 431
0.0001 9.3579 432
0.0001 9.3585 433
0.0001 9.3593 434
0.0003 9.3602 435
0.0002 9.3611 436
0.0001 9.3619 437
0.0001 9.3629 438
0.0001 9.3641 439
0.0001 9.3650 440
0.0001 9.3659 441
0.0001 9.3666 442
0.0001 9.3673 443
0.0001 9.3679 444
0.0004 9.3693 445
0.0001 9.3704 446
0.0002 9.3719 447
0.0001 9.3737 448
0.0001 9.3753 449
0.0001 9.3764 450
0.0001 9.3772 451
0.0001 9.3780 452
0.0001 9.3787 453
0.0001 9.3792 454
0.0001 9.3798 455
0.0080 9.3369 456
0.0001 9.2235 457
0.0001 9.1362 458
0.0001 9.0715 459
0.0001 9.0246 460
0.0003 8.9910 461
0.0001 8.9672 462
0.0001 8.9506 463
0.0001 8.9390 464
0.0001 8.9310 465
0.0001 8.9257 466
0.0001 8.9222 467
0.0001 8.9199 468
0.0001 8.9187 469
0.0002 8.9186 470
0.0001 8.9191 471
0.0001 8.9196 472
0.0001 8.9201 473
0.0004 8.9214 474
0.0001 8.9236 475
0.0002 8.9255 476
0.0002 8.9280 477
0.0001 8.9305 478
0.0001 8.9323 479
0.0001 8.9337 480
0.0001 8.9347 481
0.0001 8.9356 482
0.0002 8.9365 483
0.0001 8.9374 484
0.0001 8.9380 485
0.0001 8.9386 486
0.0002 8.9392 487
0.0001 8.9397 488
0.0001 8.9402 489
0.0002 8.9406 490
0.0002 8.9411 491
0.0001 8.9416 492

Framework versions

  • Transformers 4.44.2
  • TensorFlow 2.17.0
  • Datasets 3.0.1
  • Tokenizers 0.19.1