youngking0727 committed · Commit 3e11e9b · Parent(s): 1db5200
Update README.md

README.md CHANGED
@@ -24,7 +24,9 @@ The model was trained with the following hyperparameters:
 * Cutoff length: 2048
 * Learning rate: 2e-5
 
-Overview
+Overview Lla was finetuned on over 26 billion tokens highly pertinent to the field of biomedicine. The fine-tuning data are extracted from 5.5 million biomedical papers in S2ORC data using PubMed Central
+(PMC)-ID and PubMed ID as criteria.
+
 
 ### Model Developers
 PharMolix
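The hunk above keeps two training hyperparameters: a cutoff length of 2048 tokens and a learning rate of 2e-5. As a minimal sketch of how such settings are typically applied, the snippet below clips tokenized examples to the cutoff length; the config class and function names are illustrative assumptions, not PharMolix's actual training code.

```python
from dataclasses import dataclass

# Hypothetical config holding the two hyperparameters named in the diff.
@dataclass
class FinetuneConfig:
    cutoff_length: int = 2048    # max tokens per training example
    learning_rate: float = 2e-5  # optimizer step size

def truncate_ids(token_ids: list, cfg: FinetuneConfig) -> list:
    """Clip a tokenized example to the configured cutoff length."""
    return token_ids[: cfg.cutoff_length]

cfg = FinetuneConfig()
clipped = truncate_ids(list(range(5000)), cfg)
print(len(clipped))  # → 2048
```

In practice these values would be passed to the training framework's optimizer and data collator; the sketch only shows where each number enters the pipeline.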