---
base_model: Qwen/Qwen2-7B
library_name: peft
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: legal_qwen
  results: []
---

# legal_qwen

This model is a [PEFT](https://github.com/huggingface/peft) adapter fine-tuned from [Qwen/Qwen2-7B](https://huggingface.co/Qwen/Qwen2-7B) for legal named-entity extraction; the training dataset is not recorded in the card metadata.
It achieves the following results on the evaluation set:
- Loss: 0.5566
- Law Precision: 0.4434
- Law Recall: 0.6351
- Law F1: 0.5222
- Law Number: 74
- Violated by Precision: 0.3663
- Violated by Recall: 0.5139
- Violated by F1: 0.4277
- Violated by Number: 72
- Violated on Precision: 0.1507
- Violated on Recall: 0.2245
- Violated on F1: 0.1803
- Violated on Number: 49
- Violation Precision: 0.3132
- Violation Recall: 0.4430
- Violation F1: 0.3669
- Violation Number: 596
- Overall Precision: 0.3197
- Overall Recall: 0.4539
- Overall F1: 0.3751
- Overall Accuracy: 0.9062

## Model description

`legal_qwen` is a PEFT adapter on top of Qwen/Qwen2-7B trained for token classification over legal text. Judging by the reported metrics, it tags four entity types: **law**, **violated by**, **violated on**, and **violation**.

## Intended uses & limitations

The adapter is intended for extracting law references and violation-related entities from legal documents. Note the modest entity-level scores (overall F1 ≈ 0.38, and only ≈ 0.18 for *violated on*): predictions should be treated as candidates for human review, not as authoritative annotations. The high overall accuracy (≈ 0.91) is inflated by the dominant `O` (non-entity) class, as is typical for token classification.
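
For quick experimentation, the snippet below is a minimal sketch of loading the adapter for inference. The adapter repo id, the `num_labels` value, and the example text are assumptions (the card does not record the label set); only the base model id comes from the metadata above.

```python
# Minimal inference sketch (assumptions marked inline).
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer
from peft import PeftModel

base_id = "Qwen/Qwen2-7B"
adapter_id = "your-username/legal_qwen"  # hypothetical repo id; replace with the real one

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForTokenClassification.from_pretrained(
    base_id,
    num_labels=9,  # assumed: BIO tags for the four entity types plus "O"
    torch_dtype=torch.bfloat16,
)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

text = "The company was found to have violated Article 5 of the GDPR."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)
predicted_label_ids = logits.argmax(dim=-1)[0].tolist()
```

Mapping `predicted_label_ids` back to entity spans requires the `id2label` mapping saved with the adapter.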

## Training and evaluation data

The dataset itself is not documented here. From the evaluation metrics, the validation split contains 74 *law*, 72 *violated by*, 49 *violated on*, and 596 *violation* entity mentions; from the step counts (45 steps per epoch at batch size 16), the training split holds roughly 720 examples.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
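
As a point of reference, these values map onto `transformers.TrainingArguments` roughly as follows. This is a hedged reconstruction: the output directory and the per-epoch evaluation strategy are assumptions (the latter inferred from the one-eval-per-epoch log below); everything else is copied from the list above, with Adam's betas and epsilon being the library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="legal_qwen",       # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="epoch",         # assumed from the per-epoch validation rows below
)
```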

### Training results

| Training Loss | Epoch | Step | Validation Loss | Law Precision | Law Recall | Law F1 | Law Number | Violated by Precision | Violated by Recall | Violated by F1 | Violated by Number | Violated on Precision | Violated on Recall | Violated on F1 | Violated on Number | Violation Precision | Violation Recall | Violation F1 | Violation Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:-------------:|:----------:|:------:|:----------:|:---------------------:|:------------------:|:--------------:|:------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:-------------------:|:----------------:|:------------:|:----------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| No log        | 1.0   | 45   | 0.6876          | 0.0238        | 0.0135     | 0.0172 | 74         | 0.0                   | 0.0                | 0.0            | 72                 | 0.0                   | 0.0                | 0.0            | 49                 | 0.0104              | 0.0319           | 0.0156       | 596              | 0.0105            | 0.0253         | 0.0149     | 0.7882           |
| No log        | 2.0   | 90   | 0.4180          | 0.1235        | 0.1351     | 0.1290 | 74         | 0.25                  | 0.1389             | 0.1786         | 72                 | 0.0                   | 0.0                | 0.0            | 49                 | 0.0989              | 0.1846           | 0.1288       | 596              | 0.1052            | 0.1643         | 0.1283     | 0.8636           |
| No log        | 3.0   | 135  | 0.3403          | 0.3647        | 0.4189     | 0.3899 | 74         | 0.3803                | 0.375              | 0.3776         | 72                 | 0.0968                | 0.0612             | 0.075          | 49                 | 0.1694              | 0.2903           | 0.2140       | 596              | 0.1937            | 0.2958         | 0.2341     | 0.8853           |
| No log        | 4.0   | 180  | 0.3404          | 0.4           | 0.4054     | 0.4027 | 74         | 0.3721                | 0.4444             | 0.4051         | 72                 | 0.1471                | 0.2041             | 0.1709         | 49                 | 0.2298              | 0.3339           | 0.2722       | 596              | 0.2475            | 0.3426         | 0.2874     | 0.8974           |
| No log        | 5.0   | 225  | 0.3894          | 0.4545        | 0.5405     | 0.4938 | 74         | 0.4022                | 0.5139             | 0.4512         | 72                 | 0.1587                | 0.2041             | 0.1786         | 49                 | 0.2725              | 0.3859           | 0.3194       | 596              | 0.2916            | 0.4008         | 0.3376     | 0.9016           |
| No log        | 6.0   | 270  | 0.3922          | 0.4565        | 0.5676     | 0.5060 | 74         | 0.4118                | 0.4861             | 0.4459         | 72                 | 0.2051                | 0.1633             | 0.1818         | 49                 | 0.3081              | 0.4849           | 0.3768       | 596              | 0.3241            | 0.4728         | 0.3846     | 0.9070           |
| No log        | 7.0   | 315  | 0.4751          | 0.4819        | 0.5405     | 0.5096 | 74         | 0.3953                | 0.4722             | 0.4304         | 72                 | 0.1525                | 0.1837             | 0.1667         | 49                 | 0.2579              | 0.3708           | 0.3042       | 596              | 0.2802            | 0.3843         | 0.3241     | 0.9011           |
| No log        | 8.0   | 360  | 0.4684          | 0.4257        | 0.5811     | 0.4914 | 74         | 0.4                   | 0.5278             | 0.4551         | 72                 | 0.1429                | 0.2041             | 0.1681         | 49                 | 0.3025              | 0.4446           | 0.3601       | 596              | 0.3117            | 0.4501         | 0.3683     | 0.9063           |
| No log        | 9.0   | 405  | 0.5157          | 0.4554        | 0.6216     | 0.5257 | 74         | 0.4211                | 0.5556             | 0.4790         | 72                 | 0.1515                | 0.2041             | 0.1739         | 49                 | 0.2969              | 0.4329           | 0.3522       | 596              | 0.3130            | 0.4475         | 0.3684     | 0.9071           |
| No log        | 10.0  | 450  | 0.5566          | 0.4434        | 0.6351     | 0.5222 | 74         | 0.3663                | 0.5139             | 0.4277         | 72                 | 0.1507                | 0.2245             | 0.1803         | 49                 | 0.3132              | 0.4430           | 0.3669       | 596              | 0.3197            | 0.4539         | 0.3751     | 0.9062           |
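
The per-entity precision/recall/F1 figures above are the entity-level scores that `seqeval` produces for token classification. A minimal sketch of a `compute_metrics` function that would yield numbers in this shape is shown below; the label list is an assumption, since the card does not record it.

```python
# Sketch of entity-level metric computation with the `evaluate` + `seqeval` packages.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")  # requires the `seqeval` package to be installed
label_list = [  # assumed BIO label set, inferred from the metric names above
    "O",
    "B-law", "I-law",
    "B-violated by", "I-violated by",
    "B-violated on", "I-violated on",
    "B-violation", "I-violation",
]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Strip positions labelled -100 (special tokens / padding).
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "overall_precision": results["overall_precision"],
        "overall_recall": results["overall_recall"],
        "overall_f1": results["overall_f1"],
        "overall_accuracy": results["overall_accuracy"],
    }
```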


### Framework versions

- PEFT 0.12.0
- Transformers 4.44.0
- PyTorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1