alex-levashov committed
Commit 22a276a
1 Parent(s): 18a5b52

End of training
README.md CHANGED
@@ -1,5 +1,6 @@
 ---
 license: other
 tags:
 - generated_from_trainer
 datasets:
@@ -16,12 +17,17 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
 It achieves the following results on the evaluation set:
- - Loss: 3.1876
- - Mean Iou: 0.0552
- - Mean Accuracy: 0.1069
- - Overall Accuracy: 0.3425
- - Per Category Iou: [0.3214630828545835, 0.20601478596854178, 0.0, 0.27354650591173274, 0.8417970372421432, 0.31385557540188824, 0.0, 0.01546660166853167, 0.22122126594808622, 0.0, 0.0, nan, 0.0, nan, 0.0, 2.2945779123930152e-05, 0.0, 0.0, 0.0, 0.0, 0.7301226048014096, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0]
- - Per Category Accuracy: [0.6886871880793561, 0.21227660902890075, nan, 0.6349079054604727, 0.9073991165234002, 0.4336075205640423, 0.0, 0.7316561844863732, 0.526753854715684, nan, 0.0, nan, 0.0, nan, 0.0, 2.4867580135776987e-05, nan, 0.0, 0.0, 0.0, 0.9956150888995675, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0]
 
  ## Model description
 
@@ -48,20 +54,9 @@ The following hyperparameters were used during training:
 - lr_scheduler_type: linear
 - num_epochs: 50
 
- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
- | 3.3524 | 10.0 | 200 | 3.8180 | 0.0487 | 0.0861 | 0.3106 | [0.28264084958802793, 0.13653014326488203, nan, 0.2426731008687142, 0.5869845712104966, 0.15486259971561986, 0.0, 0.0, 0.12131720003418761, nan, 0.0, nan, 0.1521135572674305, nan, 0.0, 0.003742531845338744, 0.0, 0.0, 0.0, 0.0, 0.804619907601848, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0] | [0.7354283062800898, 0.14108408460969768, nan, 0.6531312143439283, 0.9640639923591213, 0.19255421429334474, 0.0, 0.0, 0.32526925721020067, nan, 0.0, nan, 0.15791182917918326, nan, 0.0, 0.00412801830253898, nan, 0.0, 0.0, 0.0, 0.9589740509370495, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0] |
- | 2.2636 | 20.0 | 400 | 3.4040 | 0.0517 | 0.1019 | 0.3658 | [0.33166172632681973, 0.25639742510747676, 0.0, 0.31599466350429883, 0.5561154794429266, 0.35978479196556673, 0.0, 0.0, 0.17355201427598316, 0.0, 0.0, nan, 0.044999799108039695, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8064841803585677, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0] | [0.6326427775083301, 0.2650205202289488, nan, 0.8339168704156479, 0.9963735673352435, 0.5357760922978314, 0.0, 0.0, 0.5921858120273676, nan, 0.0, nan, 0.04546561662742551, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.9916506487265737, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0] |
- | 2.2348 | 30.0 | 600 | 3.2677 | 0.0565 | 0.1118 | 0.3566 | [0.3298261859499856, 0.2894205494486437, 0.0, 0.3071140867987943, 0.8268382637402316, 0.29503716045453693, 0.0, 0.016577746627951882, 0.19118079400826032, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7963919599607332, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0] | [0.63567038924337, 0.3051832191288764, nan, 0.7506699266503668, 0.9568708213944603, 0.42089520350389914, 0.0, 0.7593029350104822, 0.5481716698857498, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.9908497517219286, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0] |
- | 1.2898 | 40.0 | 800 | 3.2263 | 0.0552 | 0.1117 | 0.3476 | [0.3202981498179875, 0.2523853763872847, 0.0, 0.2830716149865086, 0.8220650746811913, 0.33176841169245147, 0.0, 0.01711185745524551, 0.22195317764572742, 0.0, 0.0, nan, 0.0, nan, 0.0, 2.2602956466705847e-05, 0.0, 0.0, 0.0, 0.0, 0.7303736186117692, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0] | [0.6443689991034742, 0.2625831715835974, nan, 0.6258581907090465, 0.9437529847182426, 0.47031300074778337, 0.0, 0.8571802935010482, 0.5633859298785479, nan, 0.0, nan, 0.0, nan, 0.0, 2.4867580135776987e-05, nan, 0.0, 0.0, 0.0, 0.9937930482139997, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0] |
- | 0.9923 | 50.0 | 1000 | 3.1876 | 0.0552 | 0.1069 | 0.3425 | [0.3214630828545835, 0.20601478596854178, 0.0, 0.27354650591173274, 0.8417970372421432, 0.31385557540188824, 0.0, 0.01546660166853167, 0.22122126594808622, 0.0, 0.0, nan, 0.0, nan, 0.0, 2.2945779123930152e-05, 0.0, 0.0, 0.0, 0.0, 0.7301226048014096, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0] | [0.6886871880793561, 0.21227660902890075, nan, 0.6349079054604727, 0.9073991165234002, 0.4336075205640423, 0.0, 0.7316561844863732, 0.526753854715684, nan, 0.0, nan, 0.0, nan, 0.0, 2.4867580135776987e-05, nan, 0.0, 0.0, 0.0, 0.9956150888995675, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0] |
-
-
  ### Framework versions
 
- - Transformers 4.29.2
- - Pytorch 2.0.1+cu118
- - Datasets 2.12.0
- - Tokenizers 0.13.3
 
 ---
 license: other
+ base_model: nvidia/mit-b0
 tags:
 - generated_from_trainer
 datasets:
 
 
 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
 It achieves the following results on the evaluation set:
+ - eval_loss: 3.1572
+ - eval_mean_iou: 0.0576
+ - eval_mean_accuracy: 0.1024
+ - eval_overall_accuracy: 0.4494
+ - eval_per_category_iou: [0.4122169623263367, 0.3410702102532667, 0.5602481859291197, 0.4322339707691856, 0.6822868377772705, 0.3710882712402531, 0.035435037700556504, 0.16723774313303955, 0.09070640694672612, 0.0, 0.06035674019253563, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.014934434191355027, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan]
+ - eval_per_category_accuracy: [0.7521353846678954, 0.6404357015119493, 0.5938931808300726, 0.7152574899548504, 0.9285709756576986, 0.6115768819122369, 0.38490661282180716, 0.3173203627625765, 0.18718022824380376, nan, 0.07594816986664145, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.015227012472532574, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan]
+ - eval_runtime: 26.7781
+ - eval_samples_per_second: 0.373
+ - eval_steps_per_second: 0.187
+ - epoch: 20.0
+ - step: 400
 
 ## Model description
 
 
 - lr_scheduler_type: linear
 - num_epochs: 50
 
 ### Framework versions
 
+ - Transformers 4.35.0
+ - Pytorch 2.1.0+cu118
+ - Datasets 2.14.6
+ - Tokenizers 0.14.1
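A note on the metrics above: in the usual convention of semantic-segmentation evaluation (as in the Hugging Face `evaluate` `mean_iou` metric), `eval_mean_iou` is the mean of the per-category IoU list with `nan` entries excluded — `nan` marks categories that occur in neither the predictions nor the labels of the eval split, while a `0.0` (a category the model never gets right) still drags the mean down. A minimal numpy sketch with illustrative values, not the card's full 150-entry array:

```python
import numpy as np

# Illustrative per-category IoU values (NOT the card's real array).
# nan marks categories absent from both predictions and labels,
# so they carry no signal and are excluded from the mean.
per_category_iou = np.array([0.32, 0.21, 0.0, np.nan, 0.84, np.nan])

# Mean IoU averages only the non-nan entries; zeros still count.
mean_iou = float(np.nanmean(per_category_iou))
print(round(mean_iou, 4))  # (0.32 + 0.21 + 0.0 + 0.84) / 4 = 0.3425
```

This is why a model can score well on a few dominant classes (wall, sky, building) yet report a low mean IoU: the many all-zero rare classes are averaged in, while the `nan` classes are not.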
model.safetensors CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:6acff6c8858fd5aec21528a36bc1bb057b3d2466ebfb49bbf3d052e2820f2ea2
3
  size 15036944
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:1ef54022386cb7e37dd857c7cec85f4f17feeabd0c0e908d5186bf95c3b0c783
3
  size 15036944
runs/Nov03_12-23-02_48a07fe659d4/events.out.tfevents.1699014194.48a07fe659d4.634.0 CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:5a44f69ab1058c0c56c296907b8ae8d1dcda8382e47407fe2b5af8abc53998de
3
- size 73843
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d6b78284a7404f93a1afe5fa2533c27871fb517715a177c3272b34ccc5e01c5f
3
+ size 86089
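For reference, the long `per_category_iou` / `per_category_accuracy` arrays in the card (including their `nan` entries) follow the standard confusion-matrix definitions: per-class IoU is intersection over union of predicted and true pixels, and per-class accuracy is intersection over true pixels. A minimal numpy sketch — the function name and example matrix are mine, not from this repository:

```python
import numpy as np

def per_category_stats(conf):
    """Per-category IoU and accuracy from a pixel confusion matrix,
    where conf[i, j] counts pixels of true class i predicted as class j.
    Classes absent from both predictions and labels come out as nan,
    matching the nan entries in the card's metric arrays."""
    inter = np.diag(conf).astype(float)
    pred = conf.sum(axis=0).astype(float)   # pixels predicted as each class
    label = conf.sum(axis=1).astype(float)  # true pixels of each class
    with np.errstate(divide="ignore", invalid="ignore"):
        iou = inter / (pred + label - inter)  # 0/0 -> nan for absent classes
        acc = inter / label
    return iou, acc

# Toy 3-class example: class 2 never occurs, so its stats are nan.
conf = np.array([[2, 1, 0],
                 [0, 3, 0],
                 [0, 0, 0]])
iou, acc = per_category_stats(conf)
```

On this toy matrix, class 0 gets IoU 2/3 (two correct pixels, one confused with class 1), class 1 gets IoU 3/4, and class 2 is `nan` — which is exactly the pattern of zeros and `nan`s visible in the card's arrays.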