# training_bert
This model is a fine-tuned version of BERT base uncased on a dataset composed of job postings collected from several job platforms and thousands of resumes. It achieves the following results on the evaluation set:
- Loss: 4.0495
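Since the reported loss is a masked-language-modeling cross-entropy, the corresponding evaluation perplexity can be recovered as `exp(loss)`:

```python
import math

eval_loss = 4.0495          # evaluation loss reported above
perplexity = math.exp(eval_loss)  # roughly 57, i.e. the model's effective
                                  # branching factor when filling in masks
```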
## Model description
Pretraining was performed on the BERT base uncased architecture.
## Intended uses & limitations
This model can be used to generate contextual embeddings for the textual data handled by Applicant Tracking Systems, such as resumes, job postings, and cover letters. These embeddings can then be used for downstream NLP tasks such as classification, Named Entity Recognition, and so on.
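One common way to turn the model's token-level outputs into a single document embedding is mean pooling over non-padding tokens. A minimal sketch (the pooling recipe and the placeholder repo id are assumptions, not part of this card):

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    """Average the token embeddings, ignoring padded positions."""
    mask = attention_mask.unsqueeze(-1).float()        # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(dim=1)     # (batch, hidden)
    counts = mask.sum(dim=1).clamp(min=1e-9)           # avoid divide-by-zero
    return summed / counts

# Typical usage with this checkpoint (repo id below is a placeholder):
#   from transformers import AutoModel, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("<this-model-repo>")
#   model = AutoModel.from_pretrained("<this-model-repo>")
#   batch = tok(["Senior data engineer, 5 years of Spark"], return_tensors="pt")
#   with torch.no_grad():
#       out = model(**batch)
#   embedding = mean_pool(out.last_hidden_state, batch["attention_mask"])
```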
## Training and evaluation data
The training corpus was built from about 40,000 resumes and 2,000 job postings scraped from different job portals; this is a preliminary dataset for experimentation. The corpus size is about 2.35 GB of text. Similarly, the evaluation data contains a small set of resumes and job postings, about 12 MB of text.
## Training procedure
The masked language model was pretrained with the Trainer API from Hugging Face Transformers. Pretraining took about 6 hours 40 minutes.
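The masked-language-modeling objective corrupts a fraction of input tokens and trains the model to recover them. A minimal sketch of the masking step, assuming the standard BERT recipe (15% of positions selected; of those, 80% replaced by `[MASK]`, 10% by a random token, 10% left unchanged), not the exact Trainer internals:

```python
import random

def mask_tokens(token_ids, mask_id, vocab_size, mlm_prob=0.15, rng=None):
    """Standard BERT-style MLM corruption of a token-id sequence."""
    rng = rng or random.Random()
    inputs, labels = list(token_ids), [-100] * len(token_ids)  # -100 = ignored by the loss
    for i, tok in enumerate(token_ids):
        if rng.random() < mlm_prob:
            labels[i] = tok            # model must predict the original token here
            r = rng.random()
            if r < 0.8:
                inputs[i] = mask_id    # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(vocab_size)  # 10%: random token
            # else 10%: keep the original token
    return inputs, labels
```

In practice the Trainer delegates this to `DataCollatorForLanguageModeling`, which applies the same recipe in batched, tensorized form.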
## Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
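A minimal PyTorch sketch of this optimizer and scheduler configuration. The total step count (~22,600) is inferred from the training log below, and warmup is assumed to be zero; the tiny `Linear` module stands in for the actual BERT model:

```python
import torch

lr, total_steps = 5e-05, 22600          # total_steps estimated from the log
model = torch.nn.Linear(8, 8)           # stand-in for the BERT model
optimizer = torch.optim.Adam(model.parameters(), lr=lr,
                             betas=(0.9, 0.999), eps=1e-08)
# lr_scheduler_type "linear": decay the learning rate linearly to zero.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda step: max(0.0, 1.0 - step / total_steps))
```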
## Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
6.7862 | 0.11 | 500 | 6.9461 |
5.9428 | 0.22 | 1000 | 6.4640 |
5.5463 | 0.33 | 1500 | 6.2736 |
5.1871 | 0.44 | 2000 | 5.8517 |
4.896 | 0.55 | 2500 | 5.6070 |
4.6557 | 0.66 | 3000 | 5.4669 |
4.4832 | 0.77 | 3500 | 5.3318 |
4.3368 | 0.88 | 4000 | 5.2414 |
4.1887 | 0.99 | 4500 | 5.0666 |
4.053 | 1.1 | 5000 | 4.9532 |
3.9653 | 1.21 | 5500 | 4.8288 |
3.8865 | 1.33 | 6000 | 4.6741 |
3.8294 | 1.44 | 6500 | 4.7943 |
3.7565 | 1.55 | 7000 | 4.7336 |
3.673 | 1.66 | 7500 | 4.4760 |
3.6447 | 1.77 | 8000 | 4.5856 |
3.5808 | 1.88 | 8500 | 4.6133 |
3.5329 | 1.99 | 9000 | 4.4766 |
3.4916 | 2.1 | 9500 | 4.5085 |
3.4392 | 2.21 | 10000 | 4.5306 |
3.4333 | 2.32 | 10500 | 4.5433 |
3.3905 | 2.43 | 11000 | 4.1829 |
3.3701 | 2.54 | 11500 | 4.2976 |
3.3345 | 2.65 | 12000 | 4.2817 |
3.2815 | 2.76 | 12500 | 4.3146 |
3.2689 | 2.87 | 13000 | 4.2634 |
3.2401 | 2.98 | 13500 | 4.0907 |
3.2068 | 3.09 | 14000 | 4.1130 |
3.2097 | 3.2 | 14500 | 4.2001 |
3.1627 | 3.31 | 15000 | 4.0852 |
3.1647 | 3.42 | 15500 | 4.0383 |
3.1294 | 3.53 | 16000 | 3.9377 |
3.1166 | 3.64 | 16500 | 4.0733 |
3.1028 | 3.75 | 17000 | 3.8429 |
3.0903 | 3.86 | 17500 | 4.1127 |
3.0877 | 3.98 | 18000 | 3.8605 |
3.0407 | 4.09 | 18500 | 3.8482 |
3.0452 | 4.2 | 19000 | 4.0345 |
3.0496 | 4.31 | 19500 | 3.8602 |
3.0229 | 4.42 | 20000 | 4.2268 |
3.0157 | 4.53 | 20500 | 3.8028 |
3.0037 | 4.64 | 21000 | 3.8668 |
2.9992 | 4.75 | 21500 | 3.9542 |
3.016 | 4.86 | 22000 | 3.9090 |
2.9804 | 4.97 | 22500 | 4.0495 |
## Framework versions
- Transformers 4.25.1
- Pytorch 1.8.0+cu111
- Datasets 2.7.1
- Tokenizers 0.13.2