# LegalLMs

One of 37 models in the **LegalLMs** collection: XLM-RoBERTa models with continued pretraining on the MultiLegalPile.
This model is an XLM-RoBERTa model with continued pretraining on the MultiLegalPile. Its results on the evaluation set over the course of training are shown in the training results table below.
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
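Since these are XLM-RoBERTa masked language models, a quick way to exercise a checkpoint is the `transformers` fill-mask pipeline. The sketch below is minimal and the hub ID is a hypothetical placeholder; substitute the actual checkpoint name from the collection.

```python
from transformers import pipeline

MODEL_ID = "your-org/legal-xlm-roberta-base"  # hypothetical placeholder hub ID

# XLM-RoBERTa models are masked language models, so fill-mask is the
# natural zero-shot interface for a quick sanity check.
fill_mask = pipeline("fill-mask", model=MODEL_ID)

# XLM-RoBERTa uses "<mask>" as its mask token.
for prediction in fill_mask("The court dismissed the <mask>."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.4f}")
```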
## Training procedure

### Training hyperparameters

More information needed

### Training results
| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|---|---|---|---|---|
| 2.1008 | 0.05 | 50000 | 0.6533 | 2.0523 |
| 1.5248 | 0.1 | 100000 | 0.7661 | 1.1575 |
| 1.3152 | 0.15 | 150000 | 0.7674 | 1.1281 |
| 1.1239 | 0.2 | 200000 | 0.7971 | 0.9458 |
| 0.9472 | 0.25 | 250000 | 0.7876 | 0.9979 |
| 0.961 | 0.3 | 300000 | 0.8075 | 0.8798 |
| 1.0179 | 0.35 | 350000 | 0.8018 | 0.9102 |
| 1.037 | 0.4 | 400000 | 0.8195 | 0.8107 |
| 1.1206 | 0.45 | 450000 | 0.8152 | 0.8323 |
| 1.0865 | 0.5 | 500000 | 0.8242 | 0.7829 |
| 0.9616 | 0.55 | 550000 | 0.8224 | 0.7895 |
| 0.7727 | 0.6 | 600000 | 0.8285 | 0.7585 |
| 0.9871 | 1.04 | 650000 | 0.8320 | 0.7391 |
| 1.0679 | 1.09 | 700000 | 0.8311 | 0.7436 |
| 0.9203 | 1.14 | 750000 | 0.8355 | 0.7187 |
| 0.9626 | 1.19 | 800000 | 0.8353 | 0.7242 |
| 0.7263 | 1.24 | 850000 | 0.7094 | 0.8378 |
| 0.8578 | 1.29 | 900000 | 0.7140 | 0.8368 |
| 0.7693 | 1.34 | 950000 | 0.7091 | 0.8377 |
| 1.0488 | 1.39 | 1000000 | 0.7080 | 0.8387 |
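For readers unfamiliar with the setup, a continued-pretraining run of this shape can be sketched with the `transformers` `Trainer` and a masked-language-modeling collator. This is a minimal sketch, not the authors' script: the corpus file, preprocessing, and every hyperparameter value are assumptions (the card does not list them); only the 1,000,000-step budget and 50,000-step evaluation interval are taken from the table above.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Continue pretraining from the public XLM-RoBERTa checkpoint.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

# Placeholder corpus file; in this card's setting the data would come from
# the MultiLegalPile. File name and preprocessing are assumptions.
corpus = load_dataset("text", data_files={"train": "legal_corpus.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = corpus.map(tokenize, batched=True, remove_columns=["text"])
splits = tokenized.train_test_split(test_size=0.001)

# Standard RoBERTa-style MLM objective: 15% of tokens are masked
# dynamically in each batch.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

# Illustrative values only, except max_steps and eval_steps, which match
# the step counts in the results table above.
args = TrainingArguments(
    output_dir="legal-xlm-roberta",
    per_device_train_batch_size=8,
    learning_rate=1e-4,
    max_steps=1_000_000,
    evaluation_strategy="steps",
    eval_steps=50_000,
    save_steps=50_000,
    logging_steps=1_000,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=splits["train"],
    eval_dataset=splits["test"],
    data_collator=collator,
)
trainer.train()
```

Dynamic masking via `DataCollatorForLanguageModeling` means each pass over the data sees different masked positions, which is the standard RoBERTa-style pretraining objective.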