---
library_name: transformers
tags:
- generated_from_trainer
model-index:
- name: pints_paged_adamw_32bit_warmup0.02
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# pints_paged_adamw_32bit_warmup0.02

This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 7.4135
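
If this loss is a mean per-token cross-entropy in nats (the usual Trainer convention for causal language models, though the card does not say so), it corresponds to a perplexity of about 1,660. A minimal sketch of the conversion:

```python
import math

eval_loss = 7.4135  # final validation loss reported above
# Perplexity conversion assumes the loss is mean per-token cross-entropy in nats.
perplexity = math.exp(eval_loss)
print(round(perplexity))  # ~1658
```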

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 2048
- total_train_batch_size: 2048
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 1
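
For reference, a minimal sketch of how these settings could be expressed as Hugging Face `TrainingArguments`. The output directory is a placeholder, the paged 32-bit AdamW optimizer is inferred from the model name rather than stated in the list above, and every other value is copied from the list:

```python
from transformers import TrainingArguments

# Sketch only: values mirror the hyperparameter list above.
training_args = TrainingArguments(
    output_dir="pints_paged_adamw_32bit_warmup0.02",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=2048,  # effective batch size: 1 x 2048 = 2048
    optim="paged_adamw_32bit",         # assumed from the model name; requires bitsandbytes
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.01,                 # note: the model name says 0.02, the card lists 0.01
    num_train_epochs=1,
)
```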

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 10.8665       | 0.0020 | 208  | 10.5405         |
| 9.771         | 0.0040 | 416  | 9.3305          |
| 8.9825        | 0.0060 | 624  | 8.5321          |
| 8.2362        | 0.0080 | 832  | 7.9837          |
| 7.8471        | 0.0100 | 1040 | 7.6779          |
| 7.5529        | 0.0120 | 1248 | 7.4253          |
| 7.3361        | 0.0140 | 1456 | 7.2233          |
| 7.137         | 0.0160 | 1664 | 7.0466          |
| 7.0123        | 0.0180 | 1872 | 6.9768          |
| 6.9564        | 0.0200 | 2080 | 6.9193          |
| 6.9615        | 0.0220 | 2288 | 6.9234          |
| 6.9531        | 0.0240 | 2496 | 6.9235          |
| 6.9675        | 0.0260 | 2704 | 6.9571          |
| 6.9392        | 0.0280 | 2912 | 6.9076          |
| 7.3212        | 0.9604 | 3120 | 7.4135          |


### Framework versions

- Transformers 4.44.2
- Pytorch 2.3.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
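
A quick way to check that a local environment matches these versions before attempting to reproduce the run (a sketch; exact pins are likely only needed for training, not inference):

```python
import datasets
import tokenizers
import torch
import transformers

# Versions reported by the Trainer for this run (see the list above).
expected = {
    "transformers": "4.44.2",
    "torch": "2.3.0+cu121",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
found = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name in expected:
    status = "OK" if found[name] == expected[name] else "MISMATCH"
    print(f"{name}: expected {expected[name]}, found {found[name]} ({status})")
```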