yurakuratov committed
Commit 416f055
Parent(s): e158c2b
docs: update usage example
README.md CHANGED
@@ -20,14 +20,47 @@ Source code and data: https://github.com/AIRI-Institute/GENA_LM
Paper: https://www.biorxiv.org/content/10.1101/2023.06.12.544594v1

## Examples
-
```python
-from src.gena_lm.modeling_bert import BertForSequenceClassification
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('AIRI-Institute/gena-lm-bert-base')
model = BertForSequenceClassification.from_pretrained('AIRI-Institute/gena-lm-bert-base')
```

## Model description
GENA-LM (`gena-lm-bert-base`) model is trained in a masked language model (MLM) fashion, following the methods proposed in the BigBird paper by masking 15% of tokens. Model config for `gena-lm-bert-base` is similar to the bert-base:

Paper: https://www.biorxiv.org/content/10.1101/2023.06.12.544594v1

## Examples
+
+### How to load pre-trained model for Masked Language Modeling
+```python
+from transformers import AutoTokenizer, AutoModel
+
+tokenizer = AutoTokenizer.from_pretrained('AIRI-Institute/gena-lm-bert-base')
+model = AutoModel.from_pretrained('AIRI-Institute/gena-lm-bert-base', trust_remote_code=True)
+
+```
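To sanity-check the loaded objects, a minimal forward pass could look like the sketch below (assuming the `tokenizer` and `model` from the snippet above; the DNA string is an arbitrary placeholder, and the exact output fields depend on which class `AutoModel` resolves to for this checkpoint):

```python
import torch

# Arbitrary placeholder DNA fragment; GENA-LM tokenizes DNA with BPE,
# so one token typically spans several nucleotides.
dna = 'ATGCTAGCTAGCTAGCATCGATCGATCGTACGATCGTACG'
inputs = tokenizer(dna, return_tensors='pt')

with torch.no_grad():
    outputs = model(**inputs)

print(type(model).__name__)       # which class AutoModel resolved to
print(inputs['input_ids'].shape)  # (batch, number of BPE tokens)
```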
+
+### How to load pre-trained model to fine-tune it on a classification task
+Get the model class from the GENA-LM repository:
+```bash
+git clone https://github.com/AIRI-Institute/GENA_LM.git
+```
+
```python
+from GENA_LM.src.gena_lm.modeling_bert import BertForSequenceClassification
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('AIRI-Institute/gena-lm-bert-base')
model = BertForSequenceClassification.from_pretrained('AIRI-Institute/gena-lm-bert-base')
```
+or you can simply download [modeling_bert.py](https://github.com/AIRI-Institute/GENA_LM/tree/main/src/gena_lm) and place it next to your code.
+
+Or you can get the model class from HuggingFace AutoModel:
+```python
+from transformers import AutoTokenizer, AutoModel
+model = AutoModel.from_pretrained('AIRI-Institute/gena-lm-bert-base', trust_remote_code=True)
+gena_module_name = model.__class__.__module__
+print(gena_module_name)
+import importlib
+# available class names:
+# - BertModel, BertForPreTraining, BertForMaskedLM, BertForNextSentencePrediction,
+# - BertForSequenceClassification, BertForMultipleChoice, BertForTokenClassification,
+# - BertForQuestionAnswering
+# check https://huggingface.co/docs/transformers/model_doc/bert
+cls = getattr(importlib.import_module(gena_module_name), 'BertForSequenceClassification')
+print(cls)
+model = cls.from_pretrained('AIRI-Institute/gena-lm-bert-base', num_labels=2)
+```
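Whichever route is used to obtain `BertForSequenceClassification`, a single supervised training step could look like the sketch below (assuming the `tokenizer` and classification `model` from the snippets above; the DNA fragment, label, and learning rate are arbitrary placeholders, and a real run would iterate over a labeled dataset):

```python
import torch

# Arbitrary placeholder example: one DNA fragment with a binary label.
dna = 'ATGCTAGCTAGCTAGCATCGATCGATCGTACGATCGTACG'
batch = tokenizer(dna, return_tensors='pt')
labels = torch.tensor([1])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()

print(outputs.loss.item())   # cross-entropy loss for the placeholder label
print(outputs.logits.shape)  # (1, num_labels)
```

Since the checkpoint only provides the pre-trained MLM backbone, the classification head is newly initialized and its predictions become meaningful only after fine-tuning on labeled data.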

## Model description
GENA-LM (`gena-lm-bert-base`) model is trained in a masked language model (MLM) fashion, following the methods proposed in the BigBird paper by masking 15% of tokens. Model config for `gena-lm-bert-base` is similar to the bert-base: