pszemraj committed
Commit 9dbaae1
1 Parent(s): 5a24193

Update README.md

Files changed (1)
  1. README.md +2 -9
README.md CHANGED
@@ -25,7 +25,7 @@ model-index:
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
- # long-t5-tglobal-xl-16384-book-summary-scientific_lay_summarisation-plos-norm-16384-summ-v1
+ # long-t5-tglobal-xl-16384-booksci-summary-plos-10k
 
 This model is a fine-tuned version of [pszemraj/long-t5-tglobal-xl-16384-book-summary](https://huggingface.co/pszemraj/long-t5-tglobal-xl-16384-book-summary) on the pszemraj/scientific_lay_summarisation-plos-norm dataset.
 It achieves the following results on the evaluation set:
@@ -38,7 +38,7 @@ It achieves the following results on the evaluation set:
 
 ## Model description
 
- More information needed
+ Another test of further fine-tuning booksum-based models: this one was fine-tuned on the PLOS subset of the lay-summarisation dataset for about 10k input examples, making it roughly equivalent to [this checkpoint](https://huggingface.co/pszemraj/long-t5-tglobal-xl-16384-booksci-summary-v1), which was fine-tuned on the ELIFE subset for two epochs (also around 10k examples).
 
 ## Intended uses & limitations
 
@@ -72,10 +72,3 @@ The following hyperparameters were used during training:
 | 1.9307 | 0.56 | 700 | 1.5102 | 44.1634 | 10.9336 | 22.3896 | 40.2939 | 253.58 |
 | 1.2981 | 0.84 | 1050 | 1.5046 | 44.2728 | 10.8455 | 22.4122 | 40.3019 | 261.29 |
 
- 
- ### Framework versions
- 
- - Transformers 4.29.2
- - Pytorch 2.0.1+cu118
- - Datasets 2.12.0
- - Tokenizers 0.13.3
 
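For reference, a minimal sketch of running the renamed checkpoint for long-document lay summarization. The Hub repo id is assumed to match the new README heading, and the generation settings are illustrative placeholders, not values recommended by the card:

```python
# Minimal sketch: load the fine-tuned LongT5 checkpoint through the
# transformers summarization pipeline. The repo id is assumed from the
# new README heading; generation settings are illustrative only.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="pszemraj/long-t5-tglobal-xl-16384-booksci-summary-plos-10k",
)

article = "..."  # a long scientific article; the model accepts inputs up to 16384 tokens
result = summarizer(article, max_length=512, no_repeat_ngram_size=3)
print(result[0]["summary_text"])
```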