End of training
README.md
ADDED
@@ -0,0 +1,101 @@
---
base_model: roneneldan/TinyStories-33M
library_name: Distily
tags:
- generated_from_trainer
model-index:
- name: distily_TinyStories-33M_freeze_emb
  results: []
---

# distily_TinyStories-33M

This student model is distilled from the teacher model [roneneldan/TinyStories-33M](https://huggingface.co/roneneldan/TinyStories-33M) using an unspecified dataset.

The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

It achieves the following results on the evaluation set:
- eval_enwikippl: 86.0272
- eval_frwikippl: 9172.2910
- eval_zhwikippl: 31986.0898
- eval_loss: 0.9611
- eval_runtime: 27.2508
- eval_samples_per_second: 91.741
- eval_steps_per_second: 11.486

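The `eval_enwikippl`, `eval_frwikippl`, and `eval_zhwikippl` figures appear to be perplexities on English, French, and Chinese Wikipedia text: perplexity is the exponential of the mean per-token negative log-likelihood, so lower is better. A stdlib-only sketch of that relationship (the per-token losses below are illustrative values, not taken from this run):

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp(mean per-token negative log-likelihood, in nats)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# A model that assigns every token probability 1/N has perplexity exactly N:
vocab_nll = math.log(50257)  # uniform guess over a GPT-2-sized vocabulary
print(round(perplexity([vocab_nll] * 100)))  # -> 50257

# Hypothetical per-token losses averaging near log(86) ~ 4.45 nats yield
# a perplexity close to the reported eval_enwikippl of ~86.
nlls = [4.3, 4.6, 4.4, 4.5]
print(round(perplexity(nlls), 1))  # -> 85.6
```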
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

-->

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=5000.0, loss_fn=mse, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=500.0, loss_fn=jsd, layer_mapper=None, projector=None))
- train_embeddings: True
- learning_rate: 4e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 1.0

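The `distillation_objective` above sums three weighted terms: KL divergence on the logits (weight 1), MSE on hidden states (weight 5000), and Jensen-Shannon divergence on attention maps (weight 500). A pure-Python sketch of how such a weighted sum can be assembled, using toy distributions and vectors — this mirrors the structure of the objective, not Distily's actual implementation:

```python
import math

def kl(p, q):
    """KL(p || q) for discrete distributions, in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    """Jensen-Shannon divergence: a symmetrised, bounded variant of KL."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def mse(a, b):
    """Mean squared error between two equal-length vectors."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) / len(a)

# Toy teacher/student quantities standing in for logit distributions,
# hidden states, and attention rows (hypothetical values).
teacher_probs, student_probs = [0.7, 0.2, 0.1], [0.6, 0.3, 0.1]
teacher_hs, student_hs = [0.5, -1.0, 0.2], [0.4, -0.9, 0.3]

# Weighted sum mirroring the card's objective: 1*kl + 5000*mse + 500*jsd.
loss = (1.0 * kl(teacher_probs, student_probs)
        + 5000.0 * mse(teacher_hs, student_hs)
        + 500.0 * jsd(teacher_probs, student_probs))
assert loss > 0
```

Note the large weight on the hidden-state MSE term: raw squared errors between hidden states are typically tiny compared to divergences over vocabularies, so the weights rebalance the terms.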
### Resource Usage
Peak GPU Memory: 8.2940 GB

### Eval-Phase Metrics
| step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **teacher eval** | | 174.1653 | 48148.2734 | | | | | 4930.5806 |
| 0 | 0 | 42788.8555 | 63779.7148 | 13.4382 | 27.2438 | 91.764 | 11.489 | 57958.3359 |
| 1000 | 0.0323 | 176.2009 | 44333.6016 | 1.6774 | 27.3162 | 91.521 | 11.458 | 457143.6562 |
| 2000 | 0.0646 | 128.0956 | 24798.3691 | 1.5142 | 27.266 | 91.689 | 11.48 | 119591.0781 |
| 3000 | 0.0970 | 109.3945 | 15041.4014 | 1.3719 | 27.4573 | 91.051 | 11.4 | 68749.8828 |
| 4000 | 0.1293 | 103.1060 | 11736.2949 | 1.2548 | 27.2393 | 91.779 | 11.491 | 52875.8438 |
| 5000 | 0.1616 | 112.2423 | 11673.6494 | 1.1644 | 27.3226 | 91.499 | 11.456 | 45928.1172 |
| 6000 | 0.1939 | 98.1303 | 11178.0225 | 1.0962 | 27.294 | 91.595 | 11.468 | 43252.2148 |
| 7000 | 0.2263 | 93.0121 | 9680.7031 | 1.0394 | 27.3697 | 91.342 | 11.436 | 36992.1562 |
| 8000 | 0.2586 | 90.4050 | 9906.2393 | 1.0005 | 27.424 | 91.161 | 11.413 | 34836.8906 |
| 9000 | 0.2909 | 86.0272 | 9172.2910 | 0.9611 | 27.2508 | 91.741 | 11.486 | 31986.0898 |
| 10000 | 0.3232 | 86.3193 | 8911.2168 | 0.9344 | 27.4195 | 91.176 | 11.415 | 33114.9648 |
| 11000 | 0.3555 | 85.1883 | 9004.5898 | 0.9170 | 27.6131 | 90.537 | 11.335 | 28466.0332 |
| 12000 | 0.3879 | 82.4485 | 8789.0557 | 0.8952 | 27.5622 | 90.704 | 11.356 | 26171.4727 |
| 13000 | 0.4202 | 86.4648 | 11200.8799 | 0.8819 | 27.2915 | 91.603 | 11.469 | 28254.1816 |
| 14000 | 0.4525 | 83.4509 | 8846.1875 | 0.8756 | 27.288 | 91.615 | 11.47 | 24126.1836 |
| 15000 | 0.4848 | 83.4380 | 8696.6904 | 0.8562 | 27.2967 | 91.586 | 11.467 | 22347.7852 |
| 16000 | 0.5172 | 84.3804 | 9052.9209 | 0.8506 | 27.5838 | 90.633 | 11.347 | 26039.1504 |
| 17000 | 0.5495 | 92.4088 | 9267.0918 | 0.8451 | 27.2622 | 91.702 | 11.481 | 24745.4961 |
| 18000 | 0.5818 | 92.4374 | 9366.8291 | 0.8401 | 27.5177 | 90.851 | 11.375 | 23503.5566 |
| 19000 | 0.6141 | 87.0512 | 8318.6689 | 0.8306 | 27.185 | 91.963 | 11.514 | 23050.2109 |
| 20000 | 0.6465 | 93.4635 | 10036.1631 | 0.8266 | 27.3179 | 91.515 | 11.458 | 26122.6484 |
| 21000 | 0.6788 | 82.3464 | 9078.4600 | 0.8196 | 27.3629 | 91.365 | 11.439 | 28156.3516 |
| 22000 | 0.7111 | 81.6666 | 9332.5889 | 0.8155 | 27.6142 | 90.533 | 11.335 | 32020.2734 |
| 23000 | 0.7434 | 84.7325 | 9831.8672 | 0.8086 | 27.2205 | 91.843 | 11.499 | 33488.1289 |
| 24000 | 0.7757 | 81.2596 | 8868.6484 | 0.8074 | 27.307 | 91.552 | 11.462 | 30275.5918 |
| 25000 | 0.8081 | 81.1778 | 8258.5459 | 0.8051 | 27.3489 | 91.411 | 11.445 | 26269.4199 |
| 26000 | 0.8404 | 84.4753 | 9221.5127 | 0.8007 | 27.3172 | 91.517 | 11.458 | 31739.5938 |
| 27000 | 0.8727 | 81.3541 | 9123.3232 | 0.7995 | 27.2848 | 91.626 | 11.472 | 36992.1562 |
| 28000 | 0.9050 | 85.5785 | 9260.5635 | 0.7973 | 27.1686 | 92.018 | 11.521 | 34531.5234 |
| 29000 | 0.9374 | 92.4553 | 8333.3262 | 0.7944 | 27.2956 | 91.59 | 11.467 | 41878.25 |
| 30000 | 0.9697 | 92.4625 | 8644.1758 | 0.7925 | 27.2757 | 91.657 | 11.475 | 49319.1836 |
| 30938 | 1.0 | 91.8841 | 8440.8330 | 0.7884 | 27.314 | 91.528 | 11.459 | 49928.1523 |

### Framework versions
- Distily 0.2.0
- Transformers 4.44.0
- Pytorch 2.3.0
- Datasets 2.21.0
generation_config.json
ADDED
@@ -0,0 +1,6 @@
{
  "_from_model_config": true,
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "4.44.0"
}
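This generation config pins both `bos_token_id` and `eos_token_id` to 50256, the `<|endoftext|>` id in the GPT-2-style BPE vocabulary that the TinyStories models use. A small stdlib sketch of reading and sanity-checking such a config (the JSON is inlined here so the snippet is self-contained):

```python
import json

# The generation_config.json contents from above, inlined for the example.
raw = """{
  "_from_model_config": true,
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "4.44.0"
}"""
config = json.loads(raw)

# BOS and EOS share a single id, as in GPT-2-style vocabularies.
assert config["bos_token_id"] == config["eos_token_id"] == 50256
print(config["transformers_version"])  # -> 4.44.0
```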
runs/Aug15_18-26-48_b7d545513dcf/events.out.tfevents.1723752881.b7d545513dcf
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f01d3c22fce1fa51156c0f6aed66bf8269a169df9309dbed94d179092cb3dff7
size 253