---
tags:
- generated_from_trainer
model-index:
- name: legal-french-roberta-base
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# legal-french-roberta-base
This model was pretrained from scratch; the training corpus is not documented in this card (the model name suggests a RoBERTa-base model for French legal text).
It achieves the following results on the evaluation set:
- Loss: 0.4293
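
A minimal usage sketch, assuming this checkpoint is a RoBERTa-style masked-language model published on the Hub (the repository id below is a placeholder, not taken from this card):

```python
# Sketch only: assumes a fill-mask (masked-LM) checkpoint; replace the
# placeholder repo id with the actual Hub id of this model.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "<your-namespace>/legal-french-roberta-base"  # hypothetical id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("Le contrat est résilié de plein <mask>."))
```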
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: tpu
- num_devices: 8
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- total_eval_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- training_steps: 1000000
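
For reference, a minimal sketch of how these hyperparameters might be expressed as `transformers.TrainingArguments`; the original training script is not included in this card. The effective batch sizes follow from 16 per device × 8 TPU cores × 4 gradient-accumulation steps = 512 for training and 16 × 8 = 128 for evaluation.

```python
# Sketch only: maps the listed hyperparameters onto TrainingArguments.
# The actual training script, dataset, and model config are not documented here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="legal-french-roberta-base",
    learning_rate=1e-4,
    per_device_train_batch_size=16,  # x 8 TPU cores x 4 accumulation steps = 512
    per_device_eval_batch_size=16,   # x 8 TPU cores = 128
    gradient_accumulation_steps=4,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,
    max_steps=1_000_000,
    tpu_num_cores=8,                 # distributed_type: tpu, num_devices: 8
)
```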
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-------:|:---------------:|
| 0.8649 | 0.05 | 50000 | 0.7819 |
| 0.7852 | 0.1 | 100000 | 0.6027 |
| 0.5898 | 1.02 | 150000 | 0.5842 |
| 0.6136 | 1.07 | 200000 | 0.5343 |
| 0.6135 | 1.12 | 250000 | 0.5461 |
| 0.5804 | 2.03 | 300000 | 0.5295 |
| 0.5602 | 2.08 | 350000 | 0.5120 |
| 0.5446 | 2.13 | 400000 | 0.4904 |
| 0.5414 | 3.05 | 450000 | 0.4853 |
| 0.5765 | 3.1 | 500000 | 0.4788 |
| 0.6903 | 4.01 | 550000 | 0.4597 |
| 0.6149 | 4.06 | 600000 | 0.4556 |
| 0.5649 | 4.11 | 650000 | 0.4543 |
| 0.6449 | 5.03 | 700000 | 0.4489 |
| 0.6425 | 5.08 | 750000 | 0.4386 |
| 0.6263 | 5.13 | 800000 | 0.4344 |
| 0.6035 | 6.05 | 850000 | 0.4317 |
| 0.607 | 6.1 | 900000 | 0.4332 |
| 0.5899 | 7.01 | 950000 | 0.4321 |
| 0.5751 | 7.06 | 1000000 | 0.4293 |
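
If the reported validation loss is the usual mean token-level cross-entropy from masked-language-model pretraining (an assumption; the training objective is not stated in this card), the final value corresponds to a perplexity of roughly 1.54:

```python
# Assumption: the 0.4293 validation loss is mean cross-entropy in nats.
import math

final_val_loss = 0.4293
print(f"perplexity ~ {math.exp(final_val_loss):.2f}")  # ~ 1.54
```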
### Framework versions
- Transformers 4.20.1
- Pytorch 1.12.0+cu102
- Datasets 2.8.0
- Tokenizers 0.12.1