---
base_model: d0rj/rut5-base-summ
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: myspace1
  results: []
---

# myspace1

This model is a fine-tuned version of [d0rj/rut5-base-summ](https://huggingface.co/d0rj/rut5-base-summ) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3282
- ROUGE-1: 0.2420
- ROUGE-2: 0.1107
- ROUGE-L: 0.2373
- ROUGE-Lsum: 0.2351
- Gen Len: 55.65
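
For reference, a minimal inference sketch (not part of the original card): the repo id below is a placeholder for wherever this checkpoint is actually stored, and the generation settings are assumptions loosely guided by the average generation length (~56 tokens) reported above.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder repo id: substitute the real Hub id or a local checkpoint path.
checkpoint = "your-username/myspace1"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

text = "..."  # Russian source text to summarize
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
# max_new_tokens=64 is an assumption, chosen because the evaluation
# generations average around 56 tokens.
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```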

## Model description

The base model, [d0rj/rut5-base-summ](https://huggingface.co/d0rj/rut5-base-summ), is a T5-base encoder-decoder trained for abstractive summarization of Russian text; this checkpoint presumably continues fine-tuning it for the same task, given the ROUGE-based evaluation above. Further details about the fine-tuning objective have not been documented.

## Intended uses & limitations

Given the base model and the ROUGE-based evaluation, this checkpoint is presumably intended for abstractive summarization of Russian-language text. Because the training data is undocumented, performance outside its (unknown) domain cannot be assessed; treat the evaluation scores above as the only available quality signal.

## Training and evaluation data

The dataset itself is not documented. The training log below does constrain its size: each epoch is 90 optimizer steps at a batch size of 1, with no gradient accumulation reported, which implies roughly 90 training examples.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
- mixed_precision_training: Native AMP
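
A sketch of how these settings map onto `Seq2SeqTrainingArguments` (a reconstruction, not the original training script; `output_dir` and the evaluation/generation flags are assumptions inferred from the per-epoch log below):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="myspace1",         # assumed; the card does not state it
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    num_train_epochs=25,
    lr_scheduler_type="linear",
    fp16=True,                     # "Native AMP" mixed precision
    evaluation_strategy="epoch",   # assumed from the per-epoch eval rows
    predict_with_generate=True,    # assumed; required for ROUGE / Gen Len
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default
# optimizer, so it needs no explicit arguments here.
```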

### Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:----------:|:-------:|
| No log        | 1.0   | 90   | 2.3719          | 0.2088  | 0.0817  | 0.2064  | 0.2072     | 39.99   |
| No log        | 2.0   | 180  | 2.3539          | 0.2393  | 0.1057  | 0.2367  | 0.2363     | 42.87   |
| No log        | 3.0   | 270  | 2.3378          | 0.2249  | 0.0893  | 0.2194  | 0.2187     | 46.75   |
| No log        | 4.0   | 360  | 2.3271          | 0.2263  | 0.0935  | 0.2199  | 0.2195     | 49.99   |
| No log        | 5.0   | 450  | 2.3220          | 0.2412  | 0.1001  | 0.2318  | 0.2328     | 53.65   |
| 1.7281        | 6.0   | 540  | 2.3206          | 0.2305  | 0.0978  | 0.2238  | 0.2230     | 55.28   |
| 1.7281        | 7.0   | 630  | 2.3194          | 0.2338  | 0.1044  | 0.2276  | 0.2274     | 55.01   |
| 1.7281        | 8.0   | 720  | 2.3197          | 0.2449  | 0.1085  | 0.2383  | 0.2370     | 55.42   |
| 1.7281        | 9.0   | 810  | 2.3201          | 0.2526  | 0.1114  | 0.2481  | 0.2455     | 56.34   |
| 1.7281        | 10.0  | 900  | 2.3204          | 0.2380  | 0.1030  | 0.2331  | 0.2302     | 55.90   |
| 1.7281        | 11.0  | 990  | 2.3214          | 0.2372  | 0.1133  | 0.2334  | 0.2310     | 55.46   |
| 1.4551        | 12.0  | 1080 | 2.3220          | 0.2418  | 0.1158  | 0.2361  | 0.2352     | 56.44   |
| 1.4551        | 13.0  | 1170 | 2.3229          | 0.2500  | 0.1209  | 0.2454  | 0.2433     | 55.80   |
| 1.4551        | 14.0  | 1260 | 2.3240          | 0.2507  | 0.1240  | 0.2465  | 0.2448     | 55.09   |
| 1.4551        | 15.0  | 1350 | 2.3247          | 0.2561  | 0.1247  | 0.2505  | 0.2491     | 54.39   |
| 1.4551        | 16.0  | 1440 | 2.3256          | 0.2452  | 0.1198  | 0.2396  | 0.2379     | 53.75   |
| 1.3726        | 17.0  | 1530 | 2.3258          | 0.2367  | 0.1137  | 0.2305  | 0.2285     | 54.84   |
| 1.3726        | 18.0  | 1620 | 2.3265          | 0.2403  | 0.1159  | 0.2349  | 0.2329     | 54.56   |
| 1.3726        | 19.0  | 1710 | 2.3264          | 0.2381  | 0.1132  | 0.2335  | 0.2303     | 55.01   |
| 1.3726        | 20.0  | 1800 | 2.3270          | 0.2418  | 0.1133  | 0.2371  | 0.2346     | 55.21   |
| 1.3726        | 21.0  | 1890 | 2.3273          | 0.2413  | 0.1133  | 0.2368  | 0.2340     | 55.84   |
| 1.3726        | 22.0  | 1980 | 2.3275          | 0.2431  | 0.1137  | 0.2388  | 0.2367     | 55.82   |
| 1.3286        | 23.0  | 2070 | 2.3277          | 0.2424  | 0.1106  | 0.2376  | 0.2354     | 56.05   |
| 1.3286        | 24.0  | 2160 | 2.3280          | 0.2420  | 0.1107  | 0.2373  | 0.2351     | 55.87   |
| 1.3286        | 25.0  | 2250 | 2.3282          | 0.2420  | 0.1107  | 0.2373  | 0.2351     | 55.65   |
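
The ROUGE columns follow the standard `evaluate`-library summarization setup; below is a sketch of a `compute_metrics` function that would produce scores in this shape (it assumes `predict_with_generate=True` and a `tokenizer` in scope, as in the sketches above):

```python
import numpy as np
import evaluate

rouge = evaluate.load("rouge")

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)
    # Labels are padded with -100 for the loss; restore the pad token id
    # before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = rouge.compute(predictions=decoded_preds, references=decoded_labels)
    # "Gen Len" is the mean count of non-pad tokens in the generations.
    gen_lens = [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in predictions]
    result["gen_len"] = float(np.mean(gen_lens))
    return {k: round(v, 4) for k, v in result.items()}
```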


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0