ZJU-Fangyin committed
Commit 6ffb3a2
Parent: 3584ca4

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -16,7 +16,7 @@ With a training corpus of over 100 million molecules in SELFIES representation,
  Specifically, MolGen employs a bidirectional Transformer as its encoder and an autoregressive Transformer as its decoder.
  Through its carefully designed multi-task molecular prefix tuning (MPT), MolGen can generate molecules with desired properties, making it a valuable tool for molecular optimization.
 
- ![image.png](./pretrain.pdf)
+ ![image.png](./model.png)
 
  ## Intended uses
  You can use the raw model for molecule generation or fine-tune it to a downstream task. Please take note that the following examples only demonstrate the utilization of our pre-trained model for molecule generation. See the [repository](https://github.com/zjunlp/MolGen) to look for fine-tune details on a task that interests you.
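For context, the "Intended uses" paragraph in the diff above refers to generating molecules directly with the pre-trained checkpoint. Below is a minimal sketch of that workflow, assuming the model is published on the Hugging Face Hub as `zjunlp/MolGen-large` (a BART-style encoder-decoder checkpoint) and consumes SELFIES strings; the input molecule and generation hyperparameters are illustrative, not prescribed by this commit:

```python
# Minimal sketch: molecule generation with a pre-trained MolGen checkpoint.
# The checkpoint id "zjunlp/MolGen-large" and the SELFIES input format are
# assumptions; verify both against https://github.com/zjunlp/MolGen.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("zjunlp/MolGen-large")
model = AutoModelForSeq2SeqLM.from_pretrained("zjunlp/MolGen-large")

# An example SELFIES string (benzene); any SELFIES molecule would do.
sf_input = tokenizer("[C][=C][C][=C][C][=C][Ring1][=Branch1]", return_tensors="pt")

# Beam search over the autoregressive decoder; settings are illustrative.
outputs = model.generate(
    input_ids=sf_input["input_ids"],
    attention_mask=sf_input["attention_mask"],
    max_length=15,
    min_length=5,
    num_beams=5,
    num_return_sequences=5,
)

# Decode the candidate molecules back into SELFIES strings.
generated = [
    tokenizer.decode(ids, skip_special_tokens=True).replace(" ", "")
    for ids in outputs
]
print(generated)
```

The `.replace(" ", "")` step strips the spaces the tokenizer inserts between SELFIES tokens, so each output is a single SELFIES string that can be converted to SMILES with the `selfies` package if desired.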