t5-base-sede-txt2sql

This model is a fine-tuned version of google/t5-v1_1-base on the sede dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1577
  • BLEU Score: 0.5923
  • Parsable Queries Accuracy: 0.0
  • Partial Match F1: 0.0
  • Partial Match F1 (No Values): 0.0
  • Partial Match EM: 0.0
  • Partial Match EM (No Values): 0.0
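
The snippet below is a minimal inference sketch using the transformers library. The checkpoint id `chainyo/t5-base-sede-txt2sql` is taken from this card; the example question and the decoding settings are illustrative assumptions, since the exact input format and generation configuration behind the reported scores are not documented here.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "chainyo/t5-base-sede-txt2sql"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical SEDE-style question; the prompt format used during
# fine-tuning is an assumption, not documented on this card.
question = "How many posts were created in 2020?"
inputs = tokenizer(question, return_tensors="pt")

# Illustrative decoding settings, not the ones used for the reported BLEU.
output_ids = model.generate(**inputs, max_length=256, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Given the 0.0 parsable-queries accuracy reported above, generated SQL should be validated (e.g. with a SQL parser) before being executed.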

Model description

A text-to-SQL model: google/t5-v1_1-base fine-tuned to generate SQL queries from natural-language questions using the sede dataset. Beyond what this card records automatically (hyperparameters and evaluation metrics), details such as the input format and preprocessing are not documented.

Intended uses & limitations

The model is intended for generating SQL queries from natural-language questions in the style of the sede dataset. Note the limitation visible in the evaluation results: although the BLEU score reaches 0.5923, parsable-queries accuracy and all partial-match metrics are 0.0 on the evaluation set, meaning none of the generated queries parsed under the evaluation's SQL parser. Outputs should therefore be checked for validity before use.

Training and evaluation data

The model was trained and evaluated on the sede dataset (SEDE: Stack Exchange Data Explorer), which pairs real user-written natural-language questions with the SQL queries those users ran against the Stack Exchange schema.
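
The dataset is available on the Hugging Face Hub; the sketch below shows one way to inspect it. The hub id `sede` is taken from the dataset link on this card; the split and field names are not documented here, so they are printed rather than hard-coded.

```python
from datasets import load_dataset

# "sede" is the hub id linked from this card.
dataset = load_dataset("sede")
print(dataset)              # available splits and their sizes
print(dataset["train"][0])  # a single training example
```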

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
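
These settings map one-to-one onto transformers training arguments. The sketch below reconstructs them with `Seq2SeqTrainingArguments`; `output_dir`, the evaluation strategy, and anything not listed above are assumptions rather than the recorded configuration.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-sede-txt2sql",  # hypothetical output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumption: the table below reports per-epoch evals
)
```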

Training results

Parsable Queries Accuracy, Partial Match F1, Partial Match F1 (No Values), Partial Match EM, and Partial Match EM (No Values) remained 0.0 at every evaluation step and are omitted from the table below for readability. "No log" means no training loss had been recorded yet at that point (the Trainer logs the loss every 500 steps by default).

| Training Loss | Epoch | Step | Validation Loss | BLEU Score |
|:-------------:|:-----:|:----:|:---------------:|:----------:|
| No log  | 1.0   | 95   | 13.2410 | 0.0069 |
| No log  | 2.0   | 190  | 7.6317  | 0.0134 |
| No log  | 3.0   | 285  | 6.0919  | 0.0058 |
| No log  | 4.0   | 380  | 5.4922  | 0.0021 |
| No log  | 5.0   | 475  | 4.7151  | 0.0009 |
| 12.0698 | 6.0   | 570  | 4.1412  | 0.0003 |
| 12.0698 | 7.0   | 665  | 3.6398  | 0.0003 |
| 12.0698 | 8.0   | 760  | 3.2643  | 0.0009 |
| 12.0698 | 9.0   | 855  | 3.0544  | 0.0013 |
| 12.0698 | 10.0  | 950  | 2.8015  | 0.0043 |
| 4.696   | 11.0  | 1045 | 2.5552  | 0.0789 |
| 4.696   | 12.0  | 1140 | 2.3535  | 0.1036 |
| 4.696   | 13.0  | 1235 | 2.2132  | 0.0050 |
| 4.696   | 14.0  | 1330 | 2.1084  | 0.1333 |
| 4.696   | 15.0  | 1425 | 2.0117  | 0.2972 |
| 3.1348  | 16.0  | 1520 | 1.9333  | 0.2481 |
| 3.1348  | 17.0  | 1615 | 1.8395  | 0.4149 |
| 3.1348  | 18.0  | 1710 | 1.7661  | 0.5439 |
| 3.1348  | 19.0  | 1805 | 1.7101  | 0.6001 |
| 3.1348  | 20.0  | 1900 | 1.6562  | 0.6219 |
| 3.1348  | 21.0  | 1995 | 1.6073  | 0.5865 |
| 2.4276  | 22.0  | 2090 | 1.5773  | 0.5683 |
| 2.4276  | 23.0  | 2185 | 1.5478  | 0.5408 |
| 2.4276  | 24.0  | 2280 | 1.5190  | 0.5749 |
| 2.4276  | 25.0  | 2375 | 1.4927  | 0.5818 |
| 2.4276  | 26.0  | 2470 | 1.4671  | 0.5673 |
| 2.076   | 27.0  | 2565 | 1.4499  | 0.5616 |
| 2.076   | 28.0  | 2660 | 1.4275  | 0.6041 |
| 2.076   | 29.0  | 2755 | 1.4096  | 0.5764 |
| 2.076   | 30.0  | 2850 | 1.3983  | 0.5862 |
| 2.076   | 31.0  | 2945 | 1.3812  | 0.5982 |
| 1.8828  | 32.0  | 3040 | 1.3679  | 0.5927 |
| 1.8828  | 33.0  | 3135 | 1.3548  | 0.5916 |
| 1.8828  | 34.0  | 3230 | 1.3461  | 0.5769 |
| 1.8828  | 35.0  | 3325 | 1.3353  | 0.5871 |
| 1.8828  | 36.0  | 3420 | 1.3293  | 0.5687 |
| 1.7602  | 37.0  | 3515 | 1.3195  | 0.5689 |
| 1.7602  | 38.0  | 3610 | 1.3109  | 0.5949 |
| 1.7602  | 39.0  | 3705 | 1.3049  | 0.5619 |
| 1.7602  | 40.0  | 3800 | 1.2953  | 0.5872 |
| 1.7602  | 41.0  | 3895 | 1.2907  | 0.6014 |
| 1.7602  | 42.0  | 3990 | 1.2831  | 0.5917 |
| 1.6652  | 43.0  | 4085 | 1.2757  | 0.5718 |
| 1.6652  | 44.0  | 4180 | 1.2692  | 0.5707 |
| 1.6652  | 45.0  | 4275 | 1.2642  | 0.5758 |
| 1.6652  | 46.0  | 4370 | 1.2619  | 0.6012 |
| 1.6652  | 47.0  | 4465 | 1.2527  | 0.5749 |
| 1.6009  | 48.0  | 4560 | 1.2496  | 0.5722 |
| 1.6009  | 49.0  | 4655 | 1.2447  | 0.5633 |
| 1.6009  | 50.0  | 4750 | 1.2411  | 0.5615 |
| 1.6009  | 51.0  | 4845 | 1.2356  | 0.5691 |
| 1.6009  | 52.0  | 4940 | 1.2322  | 0.5636 |
| 1.5481  | 53.0  | 5035 | 1.2285  | 0.5724 |
| 1.5481  | 54.0  | 5130 | 1.2255  | 0.5771 |
| 1.5481  | 55.0  | 5225 | 1.2201  | 0.5827 |
| 1.5481  | 56.0  | 5320 | 1.2181  | 0.5928 |
| 1.5481  | 57.0  | 5415 | 1.2152  | 0.5599 |
| 1.5082  | 58.0  | 5510 | 1.2123  | 0.5779 |
| 1.5082  | 59.0  | 5605 | 1.2083  | 0.5609 |
| 1.5082  | 60.0  | 5700 | 1.2070  | 0.5654 |
| 1.5082  | 61.0  | 5795 | 1.2036  | 0.5566 |
| 1.5082  | 62.0  | 5890 | 1.2011  | 0.5569 |
| 1.5082  | 63.0  | 5985 | 1.1993  | 0.5567 |
| 1.4799  | 64.0  | 6080 | 1.1958  | 0.5619 |
| 1.4799  | 65.0  | 6175 | 1.1950  | 0.5691 |
| 1.4799  | 66.0  | 6270 | 1.1914  | 0.5572 |
| 1.4799  | 67.0  | 6365 | 1.1879  | 0.5635 |
| 1.4799  | 68.0  | 6460 | 1.1866  | 0.5654 |
| 1.4475  | 69.0  | 6555 | 1.1850  | 0.5575 |
| 1.4475  | 70.0  | 6650 | 1.1833  | 0.5507 |
| 1.4475  | 71.0  | 6745 | 1.1820  | 0.5493 |
| 1.4475  | 72.0  | 6840 | 1.1786  | 0.5525 |
| 1.4475  | 73.0  | 6935 | 1.1789  | 0.5615 |
| 1.4233  | 74.0  | 7030 | 1.1770  | 0.5603 |
| 1.4233  | 75.0  | 7125 | 1.1749  | 0.5699 |
| 1.4233  | 76.0  | 7220 | 1.1754  | 0.5730 |
| 1.4233  | 77.0  | 7315 | 1.1735  | 0.5798 |
| 1.4233  | 78.0  | 7410 | 1.1716  | 0.5771 |
| 1.4101  | 79.0  | 7505 | 1.1699  | 0.5800 |
| 1.4101  | 80.0  | 7600 | 1.1675  | 0.5736 |
| 1.4101  | 81.0  | 7695 | 1.1661  | 0.5845 |
| 1.4101  | 82.0  | 7790 | 1.1659  | 0.5974 |
| 1.4101  | 83.0  | 7885 | 1.1664  | 0.5825 |
| 1.4101  | 84.0  | 7980 | 1.1647  | 0.5871 |
| 1.3965  | 85.0  | 8075 | 1.1639  | 0.5772 |
| 1.3965  | 86.0  | 8170 | 1.1628  | 0.5826 |
| 1.3965  | 87.0  | 8265 | 1.1615  | 0.5960 |
| 1.3965  | 88.0  | 8360 | 1.1616  | 0.5908 |
| 1.3965  | 89.0  | 8455 | 1.1613  | 0.5775 |
| 1.3835  | 90.0  | 8550 | 1.1604  | 0.5917 |
| 1.3835  | 91.0  | 8645 | 1.1597  | 0.5732 |
| 1.3835  | 92.0  | 8740 | 1.1594  | 0.5767 |
| 1.3835  | 93.0  | 8835 | 1.1584  | 0.5719 |
| 1.3835  | 94.0  | 8930 | 1.1581  | 0.5700 |
| 1.3766  | 95.0  | 9025 | 1.1583  | 0.5845 |
| 1.3766  | 96.0  | 9120 | 1.1578  | 0.5808 |
| 1.3766  | 97.0  | 9215 | 1.1578  | 0.5889 |
| 1.3766  | 98.0  | 9310 | 1.1577  | 0.5851 |
| 1.3766  | 99.0  | 9405 | 1.1578  | 0.5923 |
| 1.3726  | 100.0 | 9500 | 1.1577  | 0.5923 |

Framework versions

  • Transformers 4.18.0
  • PyTorch 1.11.0+cu113
  • Datasets 2.1.0
  • Tokenizers 0.12.1