lgq12697 committed
Commit 9820434
Parent(s): 7408c87

Update README.md

Files changed (1): README.md (+10 -3)
README.md CHANGED
@@ -1,11 +1,18 @@
 ---
 license: cc-by-nc-sa-4.0
 widget:
-- text: ATGCTTTGTGCTGGCATGCCATGTCATGTTGCATCAGCATTTTCTTTATATTTTCTTTCTGATCTTTTCTGTGCTTCAAAACCTCATTCGTCTGTTTCCTTCTTTCCTACCAGTTATCCACAGACACACCCTATTAGAGTACTCCATGCTTGTTTATTTCTTTTGTCAAATAGAAGGGTCTTTTCTCCTCGCTTTAGTAGGGAATGTTGTCTTCCTCATTTGGGAAAAAAAAATTGTTCCTGCAGTTATGCCAGTCATGGGCTCTTTTTGATTGGTTGCATTGATATATTGTCTACCCCGTTTTCTGTAGGAATGATACATATTCCTGATCCTGAGCCTATTTGA
+- text: >-
+    ATGCTTTGTGCTGGCATGCCATGTCATGTTGCATCAGCATTTTCTTTATATTTTCTTTCTGATCTTTTCTGTGCTTCAAAACCTCATTCGTCTGTTTCCTTCTTTCCTACCAGTTATCCACAGACACACCCTATTAGAGTACTCCATGCTTGTTTATTTCTTTTGTCAAATAGAAGGGTCTTTTCTCCTCGCTTTAGTAGGGAATGTTGTCTTCCTCATTTGGGAAAAAAAAATTGTTCCTGCAGTTATGCCAGTCATGGGCTCTTTTTGATTGGTTGCATTGATATATTGTCTACCCCGTTTTCTGTAGGAATGATACATATTCCTGATCCTGAGCCTATTTGA
 tags:
 - DNA
 - biology
 - genomics
+datasets:
+- zhangtaolab/plant-multi-species-lncRNAs
+metrics:
+- accuracy
+base_model:
+- zhangtaolab/plant-nucleotide-transformer-BPE
 ---
 # Plant foundation DNA large language models
 
@@ -38,7 +45,7 @@ Here is a simple code for inference:
 ```python
 from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline
 
-model_name = 'plant-nucleotide-transformer-lncRNAs'
+model_name = 'plant-nucleotide-transformer-BPE-lncRNAs'
 # load model and tokenizer
 model = AutoModelForSequenceClassification.from_pretrained(f'zhangtaolab/{model_name}', trust_remote_code=True)
 tokenizer = AutoTokenizer.from_pretrained(f'zhangtaolab/{model_name}', trust_remote_code=True)
@@ -60,4 +67,4 @@ Detailed training procedure can be found in our manuscript.
 
 
 #### Hardware
-Model was trained on a NVIDIA GTX1080Ti GPU (11 GB).
+Model was trained on a NVIDIA GTX1080Ti GPU (11 GB).
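The second hunk shows only the top of the README's inference snippet: `pipeline` is imported, but its use falls outside the hunk. As a rough sketch of how the renamed model might be exercised end to end, the following assumes a text-classification pipeline and reuses the widget sequence from the front matter; the task name and the final print step are assumptions, not lines from the diff.

```python
# Sketch only: the 'text-classification' task and the use of the widget sequence
# are assumptions; only the model/tokenizer loading lines appear in the hunk above.
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

model_name = 'plant-nucleotide-transformer-BPE-lncRNAs'

# load model and tokenizer (as in the hunk)
model = AutoModelForSequenceClassification.from_pretrained(f'zhangtaolab/{model_name}', trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(f'zhangtaolab/{model_name}', trust_remote_code=True)

# wrap them in a classification pipeline and score the widget example sequence
classifier = pipeline('text-classification', model=model, tokenizer=tokenizer)
sequence = 'ATGCTTTGTGCTGGCATGCCATGTCATGTTGCATCAGCATTTTCTTTATATTTTCTTTCTGATCTTTTCTGTGCTTCAAAACCTCATTCGTCTGTTTCCTTCTTTCCTACCAGTTATCCACAGACACACCCTATTAGAGTACTCCATGCTTGTTTATTTCTTTTGTCAAATAGAAGGGTCTTTTCTCCTCGCTTTAGTAGGGAATGTTGTCTTCCTCATTTGGGAAAAAAAAATTGTTCCTGCAGTTATGCCAGTCATGGGCTCTTTTTGATTGGTTGCATTGATATATTGTCTACCCCGTTTTCTGTAGGAATGATACATATTCCTGATCCTGAGCCTATTTGA'
print(classifier(sequence))
```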